00:00:00.002 Started by upstream project "autotest-nightly" build number 3914
00:00:00.002 originally caused by:
00:00:00.002 Started by user Latecki, Karol
00:00:00.003 Started by upstream project "autotest-nightly" build number 3912
00:00:00.003 originally caused by:
00:00:00.004 Started by user Latecki, Karol
00:00:00.004 Started by upstream project "autotest-nightly" build number 3911
00:00:00.004 originally caused by:
00:00:00.005 Started by user Latecki, Karol
00:00:00.005 Started by upstream project "autotest-nightly" build number 3909
00:00:00.005 originally caused by:
00:00:00.005 Started by user Latecki, Karol
00:00:00.006 Started by upstream project "autotest-nightly" build number 3908
00:00:00.006 originally caused by:
00:00:00.006 Started by user Latecki, Karol
00:00:00.074 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy
00:00:00.077 The recommended git tool is: git
00:00:00.077 using credential 00000000-0000-0000-0000-000000000002
00:00:00.079 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.111 Fetching changes from the remote Git repository
00:00:00.113 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.177 Using shallow fetch with depth 1
00:00:00.177 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.177 > git --version # timeout=10
00:00:00.231 > git --version # 'git version 2.39.2'
00:00:00.231 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.284 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.284 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/changes/29/24129/6 # timeout=5
00:00:06.393 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:06.404 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:06.416 Checking out Revision e33ef006ccd688d2b66122cd0240b989d53c9017 (FETCH_HEAD)
00:00:06.416 > git config core.sparsecheckout # timeout=10
00:00:06.428 > git read-tree -mu HEAD # timeout=10
00:00:06.447 > git checkout -f e33ef006ccd688d2b66122cd0240b989d53c9017 # timeout=5
00:00:06.470 Commit message: "jenkins/jjb: remove nvme tests from distro specific jobs."
00:00:06.470 > git rev-list --no-walk 6b67f5fa1cb27c9c410cb5dac6df31d28ba79422 # timeout=10
00:00:06.566 [Pipeline] Start of Pipeline
00:00:06.582 [Pipeline] library
00:00:06.584 Loading library shm_lib@master
00:00:06.584 Library shm_lib@master is cached. Copying from home.
00:00:06.603 [Pipeline] node
00:00:06.618 Running on WFP23 in /var/jenkins/workspace/crypto-phy-autotest
00:00:06.621 [Pipeline] {
00:00:06.636 [Pipeline] catchError
00:00:06.637 [Pipeline] {
00:00:06.653 [Pipeline] wrap
00:00:06.665 [Pipeline] {
00:00:06.675 [Pipeline] stage
00:00:06.678 [Pipeline] { (Prologue)
00:00:06.893 [Pipeline] sh
00:00:07.174 + logger -p user.info -t JENKINS-CI
00:00:07.187 [Pipeline] echo
00:00:07.188 Node: WFP23
00:00:07.195 [Pipeline] sh
00:00:07.524 [Pipeline] setCustomBuildProperty
00:00:07.534 [Pipeline] echo
00:00:07.535 Cleanup processes
00:00:07.540 [Pipeline] sh
00:00:07.819 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:07.819 1195506 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:07.831 [Pipeline] sh
00:00:08.114 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:08.114 ++ grep -v 'sudo pgrep'
00:00:08.114 ++ awk '{print $1}'
00:00:08.114 + sudo kill -9
00:00:08.114 + true
00:00:08.126 [Pipeline] cleanWs
00:00:08.135 [WS-CLEANUP] Deleting project workspace...
00:00:08.135 [WS-CLEANUP] Deferred wipeout is used...
00:00:08.141 [WS-CLEANUP] done
00:00:08.145 [Pipeline] setCustomBuildProperty
00:00:08.159 [Pipeline] sh
00:00:08.441 + sudo git config --global --replace-all safe.directory '*'
00:00:08.536 [Pipeline] httpRequest
00:00:08.560 [Pipeline] echo
00:00:08.561 Sorcerer 10.211.164.101 is alive
00:00:08.569 [Pipeline] httpRequest
00:00:08.573 HttpMethod: GET
00:00:08.574 URL: http://10.211.164.101/packages/jbp_e33ef006ccd688d2b66122cd0240b989d53c9017.tar.gz
00:00:08.574 Sending request to url: http://10.211.164.101/packages/jbp_e33ef006ccd688d2b66122cd0240b989d53c9017.tar.gz
00:00:08.587 Response Code: HTTP/1.1 200 OK
00:00:08.587 Success: Status code 200 is in the accepted range: 200,404
00:00:08.588 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/jbp_e33ef006ccd688d2b66122cd0240b989d53c9017.tar.gz
00:00:11.207 [Pipeline] sh
00:00:11.488 + tar --no-same-owner -xf jbp_e33ef006ccd688d2b66122cd0240b989d53c9017.tar.gz
00:00:11.505 [Pipeline] httpRequest
00:00:11.538 [Pipeline] echo
00:00:11.541 Sorcerer 10.211.164.101 is alive
00:00:11.551 [Pipeline] httpRequest
00:00:11.555 HttpMethod: GET
00:00:11.556 URL: http://10.211.164.101/packages/spdk_f7b31b2b9679b48e9e13514a6b668058bb45fd56.tar.gz
00:00:11.557 Sending request to url: http://10.211.164.101/packages/spdk_f7b31b2b9679b48e9e13514a6b668058bb45fd56.tar.gz
00:00:11.572 Response Code: HTTP/1.1 200 OK
00:00:11.573 Success: Status code 200 is in the accepted range: 200,404
00:00:11.574 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/spdk_f7b31b2b9679b48e9e13514a6b668058bb45fd56.tar.gz
00:00:39.259 [Pipeline] sh
00:00:39.541 + tar --no-same-owner -xf spdk_f7b31b2b9679b48e9e13514a6b668058bb45fd56.tar.gz
00:00:42.086 [Pipeline] sh
00:00:42.403 + git -C spdk log --oneline -n5
00:00:42.403 f7b31b2b9 log: declare g_deprecation_epoch static
00:00:42.403 21d0c3ad6 trace: declare g_user_thread_index_start, g_ut_array and g_ut_array_mutex static
00:00:42.403 3731556bd lvol: declare g_lvol_if static
00:00:42.403 f8404a2d4 nvme: declare g_current_transport_index and g_spdk_transports static
00:00:42.403 34efb6523 dma: declare g_dma_mutex and g_dma_memory_domains static
00:00:42.414 [Pipeline] }
00:00:42.430 [Pipeline] // stage
00:00:42.439 [Pipeline] stage
00:00:42.441 [Pipeline] { (Prepare)
00:00:42.457 [Pipeline] writeFile
00:00:42.475 [Pipeline] sh
00:00:42.756 + logger -p user.info -t JENKINS-CI
00:00:42.767 [Pipeline] sh
00:00:43.048 + logger -p user.info -t JENKINS-CI
00:00:43.060 [Pipeline] sh
00:00:43.342 + cat autorun-spdk.conf
00:00:43.342 SPDK_RUN_FUNCTIONAL_TEST=1
00:00:43.342 SPDK_TEST_BLOCKDEV=1
00:00:43.342 SPDK_TEST_ISAL=1
00:00:43.342 SPDK_TEST_CRYPTO=1
00:00:43.342 SPDK_TEST_REDUCE=1
00:00:43.342 SPDK_TEST_VBDEV_COMPRESS=1
00:00:43.342 SPDK_RUN_ASAN=1
00:00:43.342 SPDK_RUN_UBSAN=1
00:00:43.348 RUN_NIGHTLY=1
00:00:43.353 [Pipeline] readFile
00:00:43.376 [Pipeline] withEnv
00:00:43.378 [Pipeline] {
00:00:43.389 [Pipeline] sh
00:00:43.667 + set -ex
00:00:43.667 + [[ -f /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf ]]
00:00:43.667 + source /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf
00:00:43.667 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:00:43.667 ++ SPDK_TEST_BLOCKDEV=1
00:00:43.667 ++ SPDK_TEST_ISAL=1
00:00:43.667 ++ SPDK_TEST_CRYPTO=1
00:00:43.667 ++ SPDK_TEST_REDUCE=1
00:00:43.667 ++ SPDK_TEST_VBDEV_COMPRESS=1
00:00:43.667 ++ SPDK_RUN_ASAN=1
00:00:43.667 ++ SPDK_RUN_UBSAN=1
00:00:43.667 ++ RUN_NIGHTLY=1
00:00:43.667 + case $SPDK_TEST_NVMF_NICS in
00:00:43.667 + DRIVERS=
00:00:43.667 + [[ -n '' ]]
00:00:43.667 + exit 0
00:00:43.676 [Pipeline] }
00:00:43.694 [Pipeline] // withEnv
00:00:43.700 [Pipeline] }
00:00:43.718 [Pipeline] // stage
00:00:43.727 [Pipeline] catchError
00:00:43.729 [Pipeline] {
00:00:43.745 [Pipeline] timeout
00:00:43.745 Timeout set to expire in 1 hr 0 min
00:00:43.747 [Pipeline] {
00:00:43.760 [Pipeline] stage
00:00:43.762 [Pipeline] { (Tests)
00:00:43.775 [Pipeline] sh
00:00:44.057 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/crypto-phy-autotest
00:00:44.057 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest
00:00:44.057 + DIR_ROOT=/var/jenkins/workspace/crypto-phy-autotest
00:00:44.057 + [[ -n /var/jenkins/workspace/crypto-phy-autotest ]]
00:00:44.057 + DIR_SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:44.057 + DIR_OUTPUT=/var/jenkins/workspace/crypto-phy-autotest/output
00:00:44.057 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/spdk ]]
00:00:44.057 + [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/output ]]
00:00:44.057 + mkdir -p /var/jenkins/workspace/crypto-phy-autotest/output
00:00:44.057 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/output ]]
00:00:44.057 + [[ crypto-phy-autotest == pkgdep-* ]]
00:00:44.057 + cd /var/jenkins/workspace/crypto-phy-autotest
00:00:44.057 + source /etc/os-release
00:00:44.057 ++ NAME='Fedora Linux'
00:00:44.057 ++ VERSION='38 (Cloud Edition)'
00:00:44.057 ++ ID=fedora
00:00:44.057 ++ VERSION_ID=38
00:00:44.057 ++ VERSION_CODENAME=
00:00:44.057 ++ PLATFORM_ID=platform:f38
00:00:44.057 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)'
00:00:44.057 ++ ANSI_COLOR='0;38;2;60;110;180'
00:00:44.057 ++ LOGO=fedora-logo-icon
00:00:44.057 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38
00:00:44.057 ++ HOME_URL=https://fedoraproject.org/
00:00:44.057 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/
00:00:44.057 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:00:44.057 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:00:44.057 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:00:44.057 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38
00:00:44.058 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:00:44.058 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38
00:00:44.058 ++ SUPPORT_END=2024-05-14
00:00:44.058 ++ VARIANT='Cloud Edition'
00:00:44.058 ++ VARIANT_ID=cloud
00:00:44.058 + uname -a
00:00:44.058 Linux spdk-wfp-23 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 02:47:10 UTC 2024 x86_64 GNU/Linux
00:00:44.058 + sudo /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status
00:00:46.588 Hugepages
00:00:46.588 node hugesize free / total
00:00:46.588 node0 1048576kB 0 / 0
00:00:46.588 node0 2048kB 0 / 0
00:00:46.588 node1 1048576kB 0 / 0
00:00:46.588 node1 2048kB 0 / 0
00:00:46.588
00:00:46.588 Type BDF Vendor Device NUMA Driver Device Block devices
00:00:46.588 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:00:46.588 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:00:46.588 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:00:46.588 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:00:46.588 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:00:46.588 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:00:46.588 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:00:46.588 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:00:46.588 NVMe 0000:60:00.0 8086 0a54 0 nvme nvme0 nvme0n1
00:00:46.588 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:00:46.588 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:00:46.588 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:00:46.588 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:00:46.588 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:00:46.588 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:00:46.588 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:00:46.588 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:00:46.588 + rm -f /tmp/spdk-ld-path
00:00:46.588 + source autorun-spdk.conf
00:00:46.588 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:00:46.588 ++ SPDK_TEST_BLOCKDEV=1
00:00:46.588 ++ SPDK_TEST_ISAL=1
00:00:46.588 ++ SPDK_TEST_CRYPTO=1
00:00:46.588 ++ SPDK_TEST_REDUCE=1
00:00:46.588 ++ SPDK_TEST_VBDEV_COMPRESS=1
00:00:46.588 ++ SPDK_RUN_ASAN=1
00:00:46.588 ++ SPDK_RUN_UBSAN=1
00:00:46.588 ++ RUN_NIGHTLY=1
00:00:46.588 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 ))
00:00:46.588 + [[ -n '' ]]
00:00:46.588 + sudo git config --global --add safe.directory /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:46.588 + for M in /var/spdk/build-*-manifest.txt
00:00:46.588 + [[ -f /var/spdk/build-pkg-manifest.txt ]]
00:00:46.588 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/
00:00:46.588 + for M in /var/spdk/build-*-manifest.txt
00:00:46.588 + [[ -f /var/spdk/build-repo-manifest.txt ]]
00:00:46.588 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/
00:00:46.588 ++ uname
00:00:46.588 + [[ Linux == \L\i\n\u\x ]]
00:00:46.588 + sudo dmesg -T
00:00:46.588 + sudo dmesg --clear
00:00:46.588 + dmesg_pid=1197315
00:00:46.588 + [[ Fedora Linux == FreeBSD ]]
00:00:46.588 + export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:00:46.588 + UNBIND_ENTIRE_IOMMU_GROUP=yes
00:00:46.588 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]]
00:00:46.588 + [[ -x /usr/src/fio-static/fio ]]
00:00:46.588 + export FIO_BIN=/usr/src/fio-static/fio
00:00:46.588 + sudo dmesg -Tw
00:00:46.588 + FIO_BIN=/usr/src/fio-static/fio
00:00:46.588 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\c\r\y\p\t\o\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]]
00:00:46.588 + [[ ! -v VFIO_QEMU_BIN ]]
00:00:46.588 + [[ -e /usr/local/qemu/vfio-user-latest ]]
00:00:46.588 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:00:46.588 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:00:46.588 + [[ -e /usr/local/qemu/vanilla-latest ]]
00:00:46.588 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:00:46.588 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:00:46.588 + spdk/autorun.sh /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf
00:00:46.588 Test configuration:
00:00:46.588 SPDK_RUN_FUNCTIONAL_TEST=1
00:00:46.588 SPDK_TEST_BLOCKDEV=1
00:00:46.588 SPDK_TEST_ISAL=1
00:00:46.588 SPDK_TEST_CRYPTO=1
00:00:46.588 SPDK_TEST_REDUCE=1
00:00:46.588 SPDK_TEST_VBDEV_COMPRESS=1
00:00:46.588 SPDK_RUN_ASAN=1
00:00:46.588 SPDK_RUN_UBSAN=1
00:00:46.588 RUN_NIGHTLY=1
08:13:58 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh
00:00:46.588 08:13:58 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]]
00:00:46.588 08:13:58 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:00:46.588 08:13:58 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:00:46.588 08:13:58 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:00:46.588 08:13:58 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:00:46.588 08:13:58 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:00:46.588 08:13:58 -- paths/export.sh@5 -- $ export PATH
00:00:46.588 08:13:58 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:00:46.588 08:13:58 -- common/autobuild_common.sh@446 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:00:46.588 08:13:58 -- common/autobuild_common.sh@447 -- $ date +%s
00:00:46.588 08:13:58 -- common/autobuild_common.sh@447 -- $ mktemp -dt spdk_1721715238.XXXXXX
00:00:46.588 08:13:58 -- common/autobuild_common.sh@447 -- $ SPDK_WORKSPACE=/tmp/spdk_1721715238.gn7z0S
00:00:46.588 08:13:58 -- common/autobuild_common.sh@449 -- $ [[ -n '' ]]
00:00:46.588 08:13:58 -- common/autobuild_common.sh@453 -- $ '[' -n '' ']'
00:00:46.588 08:13:58 -- common/autobuild_common.sh@456 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/'
00:00:46.588 08:13:58 -- common/autobuild_common.sh@460 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp'
00:00:46.588 08:13:58 -- common/autobuild_common.sh@462 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:00:46.588 08:13:58 -- common/autobuild_common.sh@463 -- $ get_config_params
00:00:46.588 08:13:58 -- common/autotest_common.sh@396 -- $ xtrace_disable
00:00:46.588 08:13:58 -- common/autotest_common.sh@10 -- $ set +x
00:00:46.588 08:13:58 -- common/autobuild_common.sh@463 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-asan --enable-coverage --with-ublk'
00:00:46.588 08:13:58 -- common/autobuild_common.sh@465 -- $ start_monitor_resources
00:00:46.588 08:13:58 -- pm/common@17 -- $ local monitor
00:00:46.588 08:13:58 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:00:46.588 08:13:58 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:00:46.588 08:13:58 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:00:46.588 08:13:58 -- pm/common@21 -- $ date +%s
00:00:46.588 08:13:58 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:00:46.588 08:13:58 -- pm/common@21 -- $ date +%s
00:00:46.588 08:13:58 -- pm/common@21 -- $ date +%s
00:00:46.588 08:13:58 -- pm/common@25 -- $ sleep 1
00:00:46.588 08:13:58 -- pm/common@21 -- $ date +%s
00:00:46.588 08:13:58 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721715238
00:00:46.588 08:13:58 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721715238
00:00:46.589 08:13:58 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721715238
00:00:46.589 08:13:58 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721715238
00:00:46.589 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721715238_collect-vmstat.pm.log
00:00:46.589 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721715238_collect-cpu-load.pm.log
00:00:46.589 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721715238_collect-cpu-temp.pm.log
00:00:46.589 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721715238_collect-bmc-pm.bmc.pm.log
00:00:47.527 08:13:59 -- common/autobuild_common.sh@466 -- $ trap stop_monitor_resources EXIT
00:00:47.527 08:13:59 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD=
00:00:47.527 08:13:59 -- spdk/autobuild.sh@12 -- $ umask 022
00:00:47.527 08:13:59 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:47.527 08:13:59 -- spdk/autobuild.sh@16 -- $ date -u
00:00:47.527 Tue Jul 23 06:13:59 AM UTC 2024
00:00:47.527 08:13:59 -- spdk/autobuild.sh@17 -- $ git describe --tags
00:00:47.527 v24.09-pre-297-gf7b31b2b9
00:00:47.527 08:13:59 -- spdk/autobuild.sh@19 -- $ '[' 1 -eq 1 ']'
00:00:47.527 08:13:59 -- spdk/autobuild.sh@20 -- $ run_test asan echo 'using asan'
00:00:47.527 08:13:59 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']'
00:00:47.527 08:13:59 -- common/autotest_common.sh@1105 -- $ xtrace_disable
00:00:47.527 08:13:59 -- common/autotest_common.sh@10 -- $ set +x
00:00:47.527 ************************************
00:00:47.527 START TEST asan
00:00:47.527 ************************************
00:00:47.527 08:13:59 asan -- common/autotest_common.sh@1123 -- $ echo 'using asan'
00:00:47.527 using asan
00:00:47.527
00:00:47.527 real 0m0.000s
00:00:47.527 user 0m0.000s
00:00:47.527 sys 0m0.000s
00:00:47.527 08:13:59 asan -- common/autotest_common.sh@1124 -- $ xtrace_disable
00:00:47.527 08:13:59 asan -- common/autotest_common.sh@10 -- $ set +x
00:00:47.527 ************************************
00:00:47.527 END TEST asan
00:00:47.527 ************************************
00:00:47.527 08:13:59 -- common/autotest_common.sh@1142 -- $ return 0
00:00:47.527 08:13:59 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']'
00:00:47.527 08:13:59 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan'
00:00:47.527 08:13:59 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']'
00:00:47.527 08:13:59 -- common/autotest_common.sh@1105 -- $ xtrace_disable
00:00:47.527 08:13:59 -- common/autotest_common.sh@10 -- $ set +x
00:00:47.527 ************************************
00:00:47.527 START TEST ubsan
00:00:47.527 ************************************
00:00:47.527 08:13:59 ubsan -- common/autotest_common.sh@1123 -- $ echo 'using ubsan'
00:00:47.527 using ubsan
00:00:47.527
00:00:47.527 real 0m0.000s
00:00:47.527 user 0m0.000s
00:00:47.527 sys 0m0.000s
00:00:47.527 08:13:59 ubsan -- common/autotest_common.sh@1124 -- $ xtrace_disable
00:00:47.527 08:13:59 ubsan -- common/autotest_common.sh@10 -- $ set +x
00:00:47.527 ************************************
00:00:47.527 END TEST ubsan
00:00:47.527 ************************************
00:00:47.527 08:14:00 -- common/autotest_common.sh@1142 -- $ return 0
00:00:47.527 08:14:00 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']'
00:00:47.527 08:14:00 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in
00:00:47.527 08:14:00 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]]
00:00:47.527 08:14:00 -- spdk/autobuild.sh@51 -- $ [[ 0 -eq 1 ]]
00:00:47.527 08:14:00 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]]
00:00:47.527 08:14:00 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]]
00:00:47.527 08:14:00 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]]
00:00:47.527 08:14:00 -- spdk/autobuild.sh@62 -- $ [[ 0 -eq 1 ]]
00:00:47.527 08:14:00 -- spdk/autobuild.sh@67 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-asan --enable-coverage --with-ublk --with-shared
00:00:47.786 Using default SPDK env in /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk
00:00:47.786 Using default DPDK in /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build
00:00:48.044 Using 'verbs' RDMA provider
00:01:01.200 Configuring ISA-L (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal.log)...done.
00:01:13.411 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal-crypto.log)...done.
00:01:13.411 Creating mk/config.mk...done.
00:01:13.411 Creating mk/cc.flags.mk...done.
00:01:13.411 Type 'make' to build.
00:01:13.411 08:14:24 -- spdk/autobuild.sh@69 -- $ run_test make make -j96
00:01:13.411 08:14:24 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']'
00:01:13.411 08:14:24 -- common/autotest_common.sh@1105 -- $ xtrace_disable
00:01:13.411 08:14:24 -- common/autotest_common.sh@10 -- $ set +x
00:01:13.411 ************************************
00:01:13.411 START TEST make
00:01:13.411 ************************************
00:01:13.411 08:14:24 make -- common/autotest_common.sh@1123 -- $ make -j96
00:01:13.411 make[1]: Nothing to be done for 'all'.
00:01:45.497 The Meson build system
00:01:45.497 Version: 1.3.1
00:01:45.497 Source dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk
00:01:45.497 Build dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp
00:01:45.497 Build type: native build
00:01:45.497 Program cat found: YES (/usr/bin/cat)
00:01:45.497 Project name: DPDK
00:01:45.497 Project version: 24.03.0
00:01:45.497 C compiler for the host machine: cc (gcc 13.2.1 "cc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)")
00:01:45.497 C linker for the host machine: cc ld.bfd 2.39-16
00:01:45.497 Host machine cpu family: x86_64
00:01:45.497 Host machine cpu: x86_64
00:01:45.497 Message: ## Building in Developer Mode ##
00:01:45.497 Program pkg-config found: YES (/usr/bin/pkg-config)
00:01:45.497 Program check-symbols.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh)
00:01:45.497 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh)
00:01:45.497 Program python3 found: YES (/usr/bin/python3)
00:01:45.497 Program cat found: YES (/usr/bin/cat)
00:01:45.497 Compiler for C supports arguments -march=native: YES
00:01:45.497 Checking for size of "void *" : 8
00:01:45.497 Checking for size of "void *" : 8 (cached)
00:01:45.497 Compiler for C supports link arguments -Wl,--undefined-version: NO
00:01:45.497 Library m found: YES
00:01:45.497 Library numa found: YES
00:01:45.497 Has header "numaif.h" : YES
00:01:45.497 Library fdt found: NO
00:01:45.497 Library execinfo found: NO
00:01:45.497 Has header "execinfo.h" : YES
00:01:45.497 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0
00:01:45.497 Run-time dependency libarchive found: NO (tried pkgconfig)
00:01:45.497 Run-time dependency libbsd found: NO (tried pkgconfig)
00:01:45.497 Run-time dependency jansson found: NO (tried pkgconfig)
00:01:45.497 Run-time dependency openssl found: YES 3.0.9
00:01:45.497 Run-time dependency libpcap found: YES 1.10.4
00:01:45.497 Has header "pcap.h" with dependency libpcap: YES
00:01:45.497 Compiler for C supports arguments -Wcast-qual: YES
00:01:45.497 Compiler for C supports arguments -Wdeprecated: YES
00:01:45.497 Compiler for C supports arguments -Wformat: YES
00:01:45.497 Compiler for C supports arguments -Wformat-nonliteral: NO
00:01:45.497 Compiler for C supports arguments -Wformat-security: NO
00:01:45.497 Compiler for C supports arguments -Wmissing-declarations: YES
00:01:45.497 Compiler for C supports arguments -Wmissing-prototypes: YES
00:01:45.497 Compiler for C supports arguments -Wnested-externs: YES
00:01:45.497 Compiler for C supports arguments -Wold-style-definition: YES
00:01:45.497 Compiler for C supports arguments -Wpointer-arith: YES
00:01:45.497 Compiler for C supports arguments -Wsign-compare: YES
00:01:45.497 Compiler for C supports arguments -Wstrict-prototypes: YES
00:01:45.497 Compiler for C supports arguments -Wundef: YES
00:01:45.497 Compiler for C supports arguments -Wwrite-strings: YES
00:01:45.497 Compiler for C supports arguments -Wno-address-of-packed-member: YES
00:01:45.497 Compiler for C supports arguments -Wno-packed-not-aligned: YES
00:01:45.497 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:01:45.497 Compiler for C supports arguments -Wno-zero-length-bounds: YES
00:01:45.497 Program objdump found: YES (/usr/bin/objdump)
00:01:45.497 Compiler for C supports arguments -mavx512f: YES
00:01:45.497 Checking if "AVX512 checking" compiles: YES
00:01:45.497 Fetching value of define "__SSE4_2__" : 1
00:01:45.497 Fetching value of define "__AES__" : 1
00:01:45.497 Fetching value of define "__AVX__" : 1
00:01:45.497 Fetching value of define "__AVX2__" : 1
00:01:45.497 Fetching value of define "__AVX512BW__" : 1
00:01:45.497 Fetching value of define "__AVX512CD__" : 1
00:01:45.497 Fetching value of define "__AVX512DQ__" : 1
00:01:45.497 Fetching value of define "__AVX512F__" : 1
00:01:45.497 Fetching value of define "__AVX512VL__" : 1
00:01:45.497 Fetching value of define "__PCLMUL__" : 1
00:01:45.497 Fetching value of define "__RDRND__" : 1
00:01:45.497 Fetching value of define "__RDSEED__" : 1
00:01:45.497 Fetching value of define "__VPCLMULQDQ__" : (undefined)
00:01:45.497 Fetching value of define "__znver1__" : (undefined)
00:01:45.497 Fetching value of define "__znver2__" : (undefined)
00:01:45.497 Fetching value of define "__znver3__" : (undefined)
00:01:45.497 Fetching value of define "__znver4__" : (undefined)
00:01:45.497 Library asan found: YES
00:01:45.497 Compiler for C supports arguments -Wno-format-truncation: YES
00:01:45.497 Message: lib/log: Defining dependency "log"
00:01:45.497 Message: lib/kvargs: Defining dependency "kvargs"
00:01:45.497 Message: lib/telemetry: Defining dependency "telemetry"
00:01:45.497 Library rt found: YES
00:01:45.497 Checking for function "getentropy" : NO
00:01:45.497 Message: lib/eal: Defining dependency "eal"
00:01:45.497 Message: lib/ring: Defining dependency "ring"
00:01:45.497 Message: lib/rcu: Defining dependency "rcu"
00:01:45.497 Message: lib/mempool: Defining dependency "mempool"
00:01:45.497 Message: lib/mbuf: Defining dependency "mbuf"
00:01:45.497 Fetching value of define "__PCLMUL__" : 1 (cached)
00:01:45.497 Fetching value of define "__AVX512F__" : 1 (cached)
00:01:45.497 Fetching value of define "__AVX512BW__" : 1 (cached)
00:01:45.497 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:01:45.497 Fetching value of define "__AVX512VL__" : 1 (cached)
00:01:45.497 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached)
00:01:45.497 Compiler for C supports arguments -mpclmul: YES
00:01:45.497 Compiler for C supports arguments -maes: YES
00:01:45.497 Compiler for C supports arguments -mavx512f: YES (cached)
00:01:45.497 Compiler for C supports arguments -mavx512bw: YES
00:01:45.497 Compiler for C supports arguments -mavx512dq: YES
00:01:45.497 Compiler for C supports arguments -mavx512vl: YES
00:01:45.497 Compiler for C supports arguments -mvpclmulqdq: YES
00:01:45.497 Compiler for C supports arguments -mavx2: YES
00:01:45.497 Compiler for C supports arguments -mavx: YES
00:01:45.497 Message: lib/net: Defining dependency "net"
00:01:45.497 Message: lib/meter: Defining dependency "meter"
00:01:45.497 Message: lib/ethdev: Defining dependency "ethdev"
00:01:45.497 Message: lib/pci: Defining dependency "pci"
00:01:45.497 Message: lib/cmdline: Defining dependency "cmdline"
00:01:45.497 Message: lib/hash: Defining dependency "hash"
00:01:45.497 Message: lib/timer: Defining dependency "timer"
00:01:45.497 Message: lib/compressdev: Defining dependency "compressdev"
00:01:45.497 Message: lib/cryptodev: Defining dependency "cryptodev"
00:01:45.497 Message: lib/dmadev: Defining dependency "dmadev"
00:01:45.497 Compiler for C supports arguments -Wno-cast-qual: YES
00:01:45.497 Message: lib/power: Defining dependency "power"
00:01:45.497 Message: lib/reorder: Defining dependency "reorder"
00:01:45.497 Message: lib/security: Defining dependency "security"
00:01:45.497 Has header "linux/userfaultfd.h" : YES
00:01:45.497 Has header "linux/vduse.h" : YES
00:01:45.497 Message: lib/vhost: Defining dependency "vhost"
00:01:45.497 Compiler for C supports arguments -Wno-format-truncation: YES (cached)
00:01:45.497 Message: drivers/bus/auxiliary: Defining dependency "bus_auxiliary"
00:01:45.497 Message: drivers/bus/pci: Defining dependency "bus_pci"
00:01:45.497 Message: drivers/bus/vdev: Defining dependency "bus_vdev"
00:01:45.497 Compiler for C supports arguments -std=c11: YES
00:01:45.497 Compiler for C supports arguments -Wno-strict-prototypes: YES
00:01:45.497 Compiler for C supports arguments -D_BSD_SOURCE: YES
00:01:45.497 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES
00:01:45.497 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES
00:01:45.497 Run-time dependency libmlx5 found: YES 1.24.46.0
00:01:45.497 Run-time dependency libibverbs found: YES 1.14.46.0
00:01:45.497 Library mtcr_ul found: NO
00:01:45.497 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_ESP" with dependencies libmlx5, libibverbs: YES
00:01:45.497 Header "infiniband/verbs.h" has symbol "IBV_RX_HASH_IPSEC_SPI" with dependencies libmlx5, libibverbs: YES
00:01:45.498 Header "infiniband/verbs.h" has symbol "IBV_ACCESS_RELAXED_ORDERING " with dependencies libmlx5, libibverbs: YES
00:01:45.498 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQE_RES_FORMAT_CSUM_STRIDX" with dependencies libmlx5, libibverbs: YES
00:01:45.498 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_MASK_TUNNEL_OFFLOADS" with dependencies libmlx5, libibverbs: YES
00:01:45.498 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_MPW_ALLOWED" with dependencies libmlx5, libibverbs: YES
00:01:45.498 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_CQE_128B_COMP" with dependencies libmlx5, libibverbs: YES
00:01:45.498 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQ_INIT_ATTR_FLAGS_CQE_PAD" with dependencies libmlx5, libibverbs: YES
00:01:45.498 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_flow_action_packet_reformat" with dependencies libmlx5, libibverbs: YES
00:01:45.498 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_MPLS" with dependencies libmlx5, libibverbs: YES
00:01:45.498 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAGS_PCI_WRITE_END_PADDING" with dependencies libmlx5, libibverbs: YES
00:01:45.498 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAG_RX_END_PADDING" with dependencies libmlx5, libibverbs: NO
00:01:45.498 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_devx_port" with dependencies libmlx5, libibverbs: NO
00:01:45.498 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_port" with dependencies libmlx5, libibverbs: YES
00:01:45.498 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_ib_port" with dependencies libmlx5, libibverbs: YES
00:01:45.498 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_create" with dependencies libmlx5, libibverbs: YES
00:01:45.498 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_COUNTERS_DEVX" with dependencies libmlx5, libibverbs: YES
00:01:45.498 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_DEFAULT_MISS" with dependencies libmlx5, libibverbs: YES
00:01:45.498 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_query_async" with dependencies libmlx5, libibverbs: YES
00:01:45.498 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_qp_query" with dependencies libmlx5, libibverbs: YES
00:01:45.498 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_pp_alloc" with dependencies libmlx5, libibverbs: YES
00:01:45.498 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_devx_tir" with dependencies libmlx5, libibverbs: YES
00:01:45.498 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_get_event" with dependencies libmlx5, libibverbs: YES
00:01:45.498 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_meter" with dependencies libmlx5, libibverbs: YES
00:01:45.498 Header "infiniband/mlx5dv.h" has symbol "MLX5_MMAP_GET_NC_PAGES_CMD" with dependencies libmlx5, libibverbs: YES
00:01:45.498 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_NIC_RX" with dependencies libmlx5, libibverbs: YES
00:01:45.498 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_FDB" with dependencies libmlx5, libibverbs: YES
00:01:45.498 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_push_vlan" with dependencies libmlx5, libibverbs: YES
00:01:45.498 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_alloc_var" with dependencies libmlx5, libibverbs: YES
00:01:45.498 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ENHANCED_MPSW" with dependencies libmlx5, libibverbs: NO
00:01:45.498 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_SEND_EN" with dependencies libmlx5, libibverbs: NO
00:01:45.498 Header
"infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_WAIT" with dependencies libmlx5, libibverbs: NO 00:01:45.498 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ACCESS_ASO" with dependencies libmlx5, libibverbs: NO 00:01:45.498 Header "linux/if_link.h" has symbol "IFLA_NUM_VF" with dependencies libmlx5, libibverbs: YES 00:01:45.498 Header "linux/if_link.h" has symbol "IFLA_EXT_MASK" with dependencies libmlx5, libibverbs: YES 00:01:45.498 Header "linux/if_link.h" has symbol "IFLA_PHYS_SWITCH_ID" with dependencies libmlx5, libibverbs: YES 00:01:45.498 Header "linux/if_link.h" has symbol "IFLA_PHYS_PORT_NAME" with dependencies libmlx5, libibverbs: YES 00:01:45.498 Header "rdma/rdma_netlink.h" has symbol "RDMA_NL_NLDEV" with dependencies libmlx5, libibverbs: YES 00:01:45.498 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_GET" with dependencies libmlx5, libibverbs: YES 00:01:45.498 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_PORT_GET" with dependencies libmlx5, libibverbs: YES 00:01:45.498 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:01:45.498 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_NAME" with dependencies libmlx5, libibverbs: YES 00:01:45.498 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_INDEX" with dependencies libmlx5, libibverbs: YES 00:01:45.498 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_STATE" with dependencies libmlx5, libibverbs: YES 00:01:45.498 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_NDEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:01:45.498 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_domain" with dependencies libmlx5, libibverbs: YES 00:01:45.498 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_sampler" with dependencies libmlx5, libibverbs: YES 00:01:45.498 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_set_reclaim_device_memory" with dependencies 
libmlx5, libibverbs: YES 00:01:45.498 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_array" with dependencies libmlx5, libibverbs: YES 00:01:45.498 Header "linux/devlink.h" has symbol "DEVLINK_GENL_NAME" with dependencies libmlx5, libibverbs: YES 00:01:45.498 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_aso" with dependencies libmlx5, libibverbs: YES 00:01:45.498 Header "infiniband/verbs.h" has symbol "INFINIBAND_VERBS_H" with dependencies libmlx5, libibverbs: YES 00:01:45.498 Header "infiniband/mlx5dv.h" has symbol "MLX5_WQE_UMR_CTRL_FLAG_INLINE" with dependencies libmlx5, libibverbs: YES 00:01:45.498 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_rule" with dependencies libmlx5, libibverbs: YES 00:01:45.498 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_ACTION_FLAGS_ASO_CT_DIRECTION_INITIATOR" with dependencies libmlx5, libibverbs: YES 00:01:45.498 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_allow_duplicate_rules" with dependencies libmlx5, libibverbs: YES 00:01:45.498 Header "infiniband/verbs.h" has symbol "ibv_reg_mr_iova" with dependencies libmlx5, libibverbs: YES 00:01:45.498 Header "infiniband/verbs.h" has symbol "ibv_import_device" with dependencies libmlx5, libibverbs: YES 00:01:45.498 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_root_table" with dependencies libmlx5, libibverbs: YES 00:01:45.498 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_steering_anchor" with dependencies libmlx5, libibverbs: YES 00:01:45.498 Header "infiniband/verbs.h" has symbol "ibv_is_fork_initialized" with dependencies libmlx5, libibverbs: YES 00:01:45.498 Checking whether type "struct mlx5dv_sw_parsing_caps" has member "sw_parsing_offloads" with dependencies libmlx5, libibverbs: YES 00:01:45.498 Checking whether type "struct ibv_counter_set_init_attr" has member "counter_set_id" with dependencies libmlx5, libibverbs: NO 00:01:45.498 Checking whether type "struct 
ibv_counters_init_attr" has member "comp_mask" with dependencies libmlx5, libibverbs: YES 00:01:45.498 Checking whether type "struct mlx5dv_devx_uar" has member "mmap_off" with dependencies libmlx5, libibverbs: YES 00:01:45.498 Checking whether type "struct mlx5dv_flow_matcher_attr" has member "ft_type" with dependencies libmlx5, libibverbs: YES 00:01:45.498 Configuring mlx5_autoconf.h using configuration 00:01:45.498 Message: drivers/common/mlx5: Defining dependency "common_mlx5" 00:01:45.498 Run-time dependency libcrypto found: YES 3.0.9 00:01:45.498 Library IPSec_MB found: YES 00:01:45.498 Fetching value of define "IMB_VERSION_STR" : "1.5.0" 00:01:45.498 Message: drivers/common/qat: Defining dependency "common_qat" 00:01:45.498 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:01:45.498 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:01:45.498 Library IPSec_MB found: YES 00:01:45.498 Fetching value of define "IMB_VERSION_STR" : "1.5.0" (cached) 00:01:45.498 Message: drivers/crypto/ipsec_mb: Defining dependency "crypto_ipsec_mb" 00:01:45.498 Compiler for C supports arguments -std=c11: YES (cached) 00:01:45.498 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:01:45.498 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:01:45.498 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:01:45.498 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:01:45.498 Message: drivers/crypto/mlx5: Defining dependency "crypto_mlx5" 00:01:45.498 Run-time dependency libisal found: NO (tried pkgconfig) 00:01:45.498 Library libisal found: NO 00:01:45.498 Message: drivers/compress/isal: Defining dependency "compress_isal" 00:01:45.498 Compiler for C supports arguments -std=c11: YES (cached) 00:01:45.498 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:01:45.498 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:01:45.498 Compiler 
for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:01:45.498 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:01:45.498 Message: drivers/compress/mlx5: Defining dependency "compress_mlx5" 00:01:45.498 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:01:45.498 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:01:45.498 Message: Disabling event/* drivers: missing internal dependency "eventdev" 00:01:45.498 Message: Disabling baseband/* drivers: missing internal dependency "bbdev" 00:01:45.498 Message: Disabling gpu/* drivers: missing internal dependency "gpudev" 00:01:45.498 Program doxygen found: YES (/usr/bin/doxygen) 00:01:45.498 Configuring doxy-api-html.conf using configuration 00:01:45.498 Configuring doxy-api-man.conf using configuration 00:01:45.498 Program mandb found: YES (/usr/bin/mandb) 00:01:45.498 Program sphinx-build found: NO 00:01:45.498 Configuring rte_build_config.h using configuration 00:01:45.498 Message: 00:01:45.498 ================= 00:01:45.498 Applications Enabled 00:01:45.498 ================= 00:01:45.498 00:01:45.498 apps: 00:01:45.498 00:01:45.498 00:01:45.498 Message: 00:01:45.498 ================= 00:01:45.498 Libraries Enabled 00:01:45.498 ================= 00:01:45.498 00:01:45.498 libs: 00:01:45.498 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:01:45.498 net, meter, ethdev, pci, cmdline, hash, timer, compressdev, 00:01:45.498 cryptodev, dmadev, power, reorder, security, vhost, 00:01:45.498 00:01:45.498 Message: 00:01:45.498 =============== 00:01:45.498 Drivers Enabled 00:01:45.498 =============== 00:01:45.498 00:01:45.498 common: 00:01:45.498 mlx5, qat, 00:01:45.498 bus: 00:01:45.498 auxiliary, pci, vdev, 00:01:45.498 mempool: 00:01:45.498 ring, 00:01:45.498 dma: 00:01:45.498 00:01:45.498 net: 00:01:45.498 00:01:45.498 crypto: 00:01:45.498 ipsec_mb, mlx5, 00:01:45.498 compress: 00:01:45.498 isal, mlx5, 00:01:45.498 vdpa: 00:01:45.498 
00:01:45.498 00:01:45.498 Message: 00:01:45.498 ================= 00:01:45.498 Content Skipped 00:01:45.499 ================= 00:01:45.499 00:01:45.499 apps: 00:01:45.499 dumpcap: explicitly disabled via build config 00:01:45.499 graph: explicitly disabled via build config 00:01:45.499 pdump: explicitly disabled via build config 00:01:45.499 proc-info: explicitly disabled via build config 00:01:45.499 test-acl: explicitly disabled via build config 00:01:45.499 test-bbdev: explicitly disabled via build config 00:01:45.499 test-cmdline: explicitly disabled via build config 00:01:45.499 test-compress-perf: explicitly disabled via build config 00:01:45.499 test-crypto-perf: explicitly disabled via build config 00:01:45.499 test-dma-perf: explicitly disabled via build config 00:01:45.499 test-eventdev: explicitly disabled via build config 00:01:45.499 test-fib: explicitly disabled via build config 00:01:45.499 test-flow-perf: explicitly disabled via build config 00:01:45.499 test-gpudev: explicitly disabled via build config 00:01:45.499 test-mldev: explicitly disabled via build config 00:01:45.499 test-pipeline: explicitly disabled via build config 00:01:45.499 test-pmd: explicitly disabled via build config 00:01:45.499 test-regex: explicitly disabled via build config 00:01:45.499 test-sad: explicitly disabled via build config 00:01:45.499 test-security-perf: explicitly disabled via build config 00:01:45.499 00:01:45.499 libs: 00:01:45.499 argparse: explicitly disabled via build config 00:01:45.499 metrics: explicitly disabled via build config 00:01:45.499 acl: explicitly disabled via build config 00:01:45.499 bbdev: explicitly disabled via build config 00:01:45.499 bitratestats: explicitly disabled via build config 00:01:45.499 bpf: explicitly disabled via build config 00:01:45.499 cfgfile: explicitly disabled via build config 00:01:45.499 distributor: explicitly disabled via build config 00:01:45.499 efd: explicitly disabled via build config 00:01:45.499 eventdev: 
explicitly disabled via build config 00:01:45.499 dispatcher: explicitly disabled via build config 00:01:45.499 gpudev: explicitly disabled via build config 00:01:45.499 gro: explicitly disabled via build config 00:01:45.499 gso: explicitly disabled via build config 00:01:45.499 ip_frag: explicitly disabled via build config 00:01:45.499 jobstats: explicitly disabled via build config 00:01:45.499 latencystats: explicitly disabled via build config 00:01:45.499 lpm: explicitly disabled via build config 00:01:45.499 member: explicitly disabled via build config 00:01:45.499 pcapng: explicitly disabled via build config 00:01:45.499 rawdev: explicitly disabled via build config 00:01:45.499 regexdev: explicitly disabled via build config 00:01:45.499 mldev: explicitly disabled via build config 00:01:45.499 rib: explicitly disabled via build config 00:01:45.499 sched: explicitly disabled via build config 00:01:45.499 stack: explicitly disabled via build config 00:01:45.499 ipsec: explicitly disabled via build config 00:01:45.499 pdcp: explicitly disabled via build config 00:01:45.499 fib: explicitly disabled via build config 00:01:45.499 port: explicitly disabled via build config 00:01:45.499 pdump: explicitly disabled via build config 00:01:45.499 table: explicitly disabled via build config 00:01:45.499 pipeline: explicitly disabled via build config 00:01:45.499 graph: explicitly disabled via build config 00:01:45.499 node: explicitly disabled via build config 00:01:45.499 00:01:45.499 drivers: 00:01:45.499 common/cpt: not in enabled drivers build config 00:01:45.499 common/dpaax: not in enabled drivers build config 00:01:45.499 common/iavf: not in enabled drivers build config 00:01:45.499 common/idpf: not in enabled drivers build config 00:01:45.499 common/ionic: not in enabled drivers build config 00:01:45.499 common/mvep: not in enabled drivers build config 00:01:45.499 common/octeontx: not in enabled drivers build config 00:01:45.499 bus/cdx: not in enabled drivers 
build config 00:01:45.499 bus/dpaa: not in enabled drivers build config 00:01:45.499 bus/fslmc: not in enabled drivers build config 00:01:45.499 bus/ifpga: not in enabled drivers build config 00:01:45.499 bus/platform: not in enabled drivers build config 00:01:45.499 bus/uacce: not in enabled drivers build config 00:01:45.499 bus/vmbus: not in enabled drivers build config 00:01:45.499 common/cnxk: not in enabled drivers build config 00:01:45.499 common/nfp: not in enabled drivers build config 00:01:45.499 common/nitrox: not in enabled drivers build config 00:01:45.499 common/sfc_efx: not in enabled drivers build config 00:01:45.499 mempool/bucket: not in enabled drivers build config 00:01:45.499 mempool/cnxk: not in enabled drivers build config 00:01:45.499 mempool/dpaa: not in enabled drivers build config 00:01:45.499 mempool/dpaa2: not in enabled drivers build config 00:01:45.499 mempool/octeontx: not in enabled drivers build config 00:01:45.499 mempool/stack: not in enabled drivers build config 00:01:45.499 dma/cnxk: not in enabled drivers build config 00:01:45.499 dma/dpaa: not in enabled drivers build config 00:01:45.499 dma/dpaa2: not in enabled drivers build config 00:01:45.499 dma/hisilicon: not in enabled drivers build config 00:01:45.499 dma/idxd: not in enabled drivers build config 00:01:45.499 dma/ioat: not in enabled drivers build config 00:01:45.499 dma/skeleton: not in enabled drivers build config 00:01:45.499 net/af_packet: not in enabled drivers build config 00:01:45.499 net/af_xdp: not in enabled drivers build config 00:01:45.499 net/ark: not in enabled drivers build config 00:01:45.499 net/atlantic: not in enabled drivers build config 00:01:45.499 net/avp: not in enabled drivers build config 00:01:45.499 net/axgbe: not in enabled drivers build config 00:01:45.499 net/bnx2x: not in enabled drivers build config 00:01:45.499 net/bnxt: not in enabled drivers build config 00:01:45.499 net/bonding: not in enabled drivers build config 00:01:45.499 
net/cnxk: not in enabled drivers build config 00:01:45.499 net/cpfl: not in enabled drivers build config 00:01:45.499 net/cxgbe: not in enabled drivers build config 00:01:45.499 net/dpaa: not in enabled drivers build config 00:01:45.499 net/dpaa2: not in enabled drivers build config 00:01:45.499 net/e1000: not in enabled drivers build config 00:01:45.499 net/ena: not in enabled drivers build config 00:01:45.499 net/enetc: not in enabled drivers build config 00:01:45.499 net/enetfec: not in enabled drivers build config 00:01:45.499 net/enic: not in enabled drivers build config 00:01:45.499 net/failsafe: not in enabled drivers build config 00:01:45.499 net/fm10k: not in enabled drivers build config 00:01:45.499 net/gve: not in enabled drivers build config 00:01:45.499 net/hinic: not in enabled drivers build config 00:01:45.499 net/hns3: not in enabled drivers build config 00:01:45.499 net/i40e: not in enabled drivers build config 00:01:45.499 net/iavf: not in enabled drivers build config 00:01:45.499 net/ice: not in enabled drivers build config 00:01:45.499 net/idpf: not in enabled drivers build config 00:01:45.499 net/igc: not in enabled drivers build config 00:01:45.499 net/ionic: not in enabled drivers build config 00:01:45.499 net/ipn3ke: not in enabled drivers build config 00:01:45.499 net/ixgbe: not in enabled drivers build config 00:01:45.499 net/mana: not in enabled drivers build config 00:01:45.499 net/memif: not in enabled drivers build config 00:01:45.499 net/mlx4: not in enabled drivers build config 00:01:45.499 net/mlx5: not in enabled drivers build config 00:01:45.499 net/mvneta: not in enabled drivers build config 00:01:45.499 net/mvpp2: not in enabled drivers build config 00:01:45.499 net/netvsc: not in enabled drivers build config 00:01:45.499 net/nfb: not in enabled drivers build config 00:01:45.499 net/nfp: not in enabled drivers build config 00:01:45.499 net/ngbe: not in enabled drivers build config 00:01:45.499 net/null: not in enabled drivers 
build config 00:01:45.499 net/octeontx: not in enabled drivers build config 00:01:45.499 net/octeon_ep: not in enabled drivers build config 00:01:45.499 net/pcap: not in enabled drivers build config 00:01:45.499 net/pfe: not in enabled drivers build config 00:01:45.499 net/qede: not in enabled drivers build config 00:01:45.499 net/ring: not in enabled drivers build config 00:01:45.499 net/sfc: not in enabled drivers build config 00:01:45.499 net/softnic: not in enabled drivers build config 00:01:45.499 net/tap: not in enabled drivers build config 00:01:45.499 net/thunderx: not in enabled drivers build config 00:01:45.499 net/txgbe: not in enabled drivers build config 00:01:45.499 net/vdev_netvsc: not in enabled drivers build config 00:01:45.499 net/vhost: not in enabled drivers build config 00:01:45.499 net/virtio: not in enabled drivers build config 00:01:45.499 net/vmxnet3: not in enabled drivers build config 00:01:45.499 raw/*: missing internal dependency, "rawdev" 00:01:45.499 crypto/armv8: not in enabled drivers build config 00:01:45.499 crypto/bcmfs: not in enabled drivers build config 00:01:45.500 crypto/caam_jr: not in enabled drivers build config 00:01:45.500 crypto/ccp: not in enabled drivers build config 00:01:45.500 crypto/cnxk: not in enabled drivers build config 00:01:45.500 crypto/dpaa_sec: not in enabled drivers build config 00:01:45.500 crypto/dpaa2_sec: not in enabled drivers build config 00:01:45.500 crypto/mvsam: not in enabled drivers build config 00:01:45.500 crypto/nitrox: not in enabled drivers build config 00:01:45.500 crypto/null: not in enabled drivers build config 00:01:45.500 crypto/octeontx: not in enabled drivers build config 00:01:45.500 crypto/openssl: not in enabled drivers build config 00:01:45.500 crypto/scheduler: not in enabled drivers build config 00:01:45.500 crypto/uadk: not in enabled drivers build config 00:01:45.500 crypto/virtio: not in enabled drivers build config 00:01:45.500 compress/nitrox: not in enabled drivers 
build config 00:01:45.500 compress/octeontx: not in enabled drivers build config 00:01:45.500 compress/zlib: not in enabled drivers build config 00:01:45.500 regex/*: missing internal dependency, "regexdev" 00:01:45.500 ml/*: missing internal dependency, "mldev" 00:01:45.500 vdpa/ifc: not in enabled drivers build config 00:01:45.500 vdpa/mlx5: not in enabled drivers build config 00:01:45.500 vdpa/nfp: not in enabled drivers build config 00:01:45.500 vdpa/sfc: not in enabled drivers build config 00:01:45.500 event/*: missing internal dependency, "eventdev" 00:01:45.500 baseband/*: missing internal dependency, "bbdev" 00:01:45.500 gpu/*: missing internal dependency, "gpudev" 00:01:45.500 00:01:45.500 00:01:45.500 Build targets in project: 115 00:01:45.500 00:01:45.500 DPDK 24.03.0 00:01:45.500 00:01:45.500 User defined options 00:01:45.500 buildtype : debug 00:01:45.500 default_library : shared 00:01:45.500 libdir : lib 00:01:45.500 prefix : /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:01:45.500 b_sanitize : address 00:01:45.500 c_args : -Wno-stringop-overflow -fcommon -Wno-stringop-overread -Wno-array-bounds -I/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -DNO_COMPAT_IMB_API_053 -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isalbuild -fPIC -Werror 00:01:45.500 c_link_args : -L/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -L/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l/.libs -lisal 00:01:45.500 cpu_instruction_set: native 00:01:45.500 disable_apps : test-acl,graph,test-dma-perf,test-gpudev,test-crypto-perf,test,test-security-perf,test-mldev,proc-info,test-pmd,test-pipeline,test-eventdev,test-cmdline,test-fib,pdump,test-flow-perf,test-bbdev,test-regex,test-sad,dumpcap,test-compress-perf 00:01:45.500 disable_libs : 
acl,bitratestats,graph,bbdev,jobstats,ipsec,gso,table,rib,node,mldev,sched,ip_frag,cfgfile,port,pcapng,pdcp,argparse,stack,eventdev,regexdev,distributor,gro,efd,pipeline,bpf,dispatcher,lpm,metrics,latencystats,pdump,gpudev,member,fib,rawdev 00:01:45.500 enable_docs : false 00:01:45.500 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring,crypto/qat,compress/qat,common/qat,common/mlx5,bus/auxiliary,crypto,crypto/aesni_mb,crypto/mlx5,crypto/ipsec_mb,compress,compress/isal,compress/mlx5 00:01:45.500 enable_kmods : false 00:01:45.500 max_lcores : 128 00:01:45.500 tests : false 00:01:45.500 00:01:45.500 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:01:45.500 ninja: Entering directory `/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp' 00:01:45.500 [1/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:01:45.500 [2/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:01:45.500 [3/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:01:45.500 [4/378] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:01:45.500 [5/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:01:45.500 [6/378] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:01:45.500 [7/378] Linking static target lib/librte_kvargs.a 00:01:45.500 [8/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:01:45.500 [9/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:01:45.500 [10/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:01:45.500 [11/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:01:45.500 [12/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:01:45.500 [13/378] Compiling C object lib/librte_log.a.p/log_log.c.o 00:01:45.500 [14/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:01:45.500 [15/378] 
Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:01:45.500 [16/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:01:45.500 [17/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:01:45.500 [18/378] Linking static target lib/librte_log.a 00:01:45.500 [19/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:01:45.500 [20/378] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:01:45.500 [21/378] Linking static target lib/librte_pci.a 00:01:45.500 [22/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:01:45.500 [23/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:01:45.500 [24/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:01:45.500 [25/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:01:45.500 [26/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:01:45.500 [27/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:01:45.500 [28/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:01:45.500 [29/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:01:45.500 [30/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:01:45.762 [31/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:01:45.762 [32/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:01:45.762 [33/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:01:45.762 [34/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:01:45.762 [35/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:01:45.762 [36/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:01:45.762 [37/378] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to 
capture output) 00:01:45.762 [38/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:01:45.762 [39/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:01:45.762 [40/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:01:45.762 [41/378] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:01:45.762 [42/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:01:45.762 [43/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:01:45.762 [44/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:01:45.762 [45/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:01:45.762 [46/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:01:45.762 [47/378] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.762 [48/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:01:45.762 [49/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:01:45.762 [50/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:01:45.762 [51/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:01:45.762 [52/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:01:45.762 [53/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:01:45.762 [54/378] Compiling C object lib/librte_hash.a.p/hash_rte_hash_crc.c.o 00:01:45.762 [55/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:01:45.762 [56/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:01:45.762 [57/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:01:45.762 [58/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:01:45.762 [59/378] Compiling C object 
lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:01:45.762 [60/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:01:45.762 [61/378] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:01:45.762 [62/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:01:45.762 [63/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:01:45.762 [64/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:01:45.762 [65/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:01:45.762 [66/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:01:45.762 [67/378] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:01:45.762 [68/378] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:01:45.762 [69/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:01:45.762 [70/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:01:45.762 [71/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:01:45.762 [72/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:01:45.762 [73/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:01:45.762 [74/378] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:01:45.762 [75/378] Linking static target lib/librte_ring.a 00:01:45.762 [76/378] Linking static target lib/librte_meter.a 00:01:45.762 [77/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:01:45.762 [78/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:01:45.762 [79/378] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:01:45.762 [80/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:01:45.762 [81/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:01:45.762 [82/378] Compiling C object 
lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:01:45.762 [83/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:01:45.762 [84/378] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:01:45.762 [85/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:01:45.762 [86/378] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:01:45.762 [87/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:01:45.762 [88/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:01:45.762 [89/378] Linking static target lib/net/libnet_crc_avx512_lib.a 00:01:45.762 [90/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:01:45.762 [91/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:01:45.762 [92/378] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:01:45.762 [93/378] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:01:45.762 [94/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:01:45.762 [95/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:01:45.762 [96/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:01:45.762 [97/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:01:45.762 [98/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:01:45.762 [99/378] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:01:45.762 [100/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:01:45.762 [101/378] Linking static target lib/librte_telemetry.a 00:01:45.762 [102/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_linux_ethtool.c.o 00:01:46.026 [103/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:01:46.026 [104/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:01:46.026 [105/378] 
Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:01:46.026 [106/378] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:01:46.026 [107/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_params.c.o 00:01:46.026 [108/378] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:01:46.026 [109/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:01:46.026 [110/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:01:46.026 [111/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:01:46.026 [112/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:01:46.026 [113/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:01:46.026 [114/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_logs.c.o 00:01:46.026 [115/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:01:46.026 [116/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:01:46.026 [117/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:01:46.026 [118/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:01:46.026 [119/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:01:46.026 [120/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:01:46.026 [121/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:01:46.027 [122/378] Linking static target lib/librte_mempool.a 00:01:46.027 [123/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:01:46.027 [124/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:01:46.027 [125/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:01:46.027 [126/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:01:46.027 [127/378] Compiling C object 
lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:01:46.027 [128/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:01:46.027 [129/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:01:46.287 [130/378] Linking static target lib/librte_cmdline.a 00:01:46.287 [131/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:01:46.287 [132/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash_gfni.c.o 00:01:46.287 [133/378] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.287 [134/378] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:01:46.287 [135/378] Linking static target lib/librte_eal.a 00:01:46.287 [136/378] Linking static target lib/librte_net.a 00:01:46.287 [137/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_glue.c.o 00:01:46.287 [138/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:01:46.287 [139/378] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.287 [140/378] Linking target lib/librte_log.so.24.1 00:01:46.287 [141/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:01:46.287 [142/378] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:01:46.287 [143/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:01:46.287 [144/378] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:01:46.287 [145/378] Linking static target lib/librte_rcu.a 00:01:46.287 [146/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:01:46.287 [147/378] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:01:46.287 [148/378] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.287 [149/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:01:46.287 [150/378] Compiling C 
object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:01:46.546 [151/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:01:46.546 [152/378] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:01:46.546 [153/378] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:01:46.546 [154/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:01:46.546 [155/378] Generating symbol file lib/librte_log.so.24.1.p/librte_log.so.24.1.symbols 00:01:46.546 [156/378] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:01:46.546 [157/378] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:01:46.546 [158/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_common.c.o 00:01:46.546 [159/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_linux_auxiliary.c.o 00:01:46.546 [160/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:01:46.546 [161/378] Linking static target lib/librte_timer.a 00:01:46.546 [162/378] Linking static target lib/librte_dmadev.a 00:01:46.546 [163/378] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:01:46.546 [164/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:01:46.546 [165/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_common.c.o 00:01:46.546 [166/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_pf2vf.c.o 00:01:46.546 [167/378] Linking target lib/librte_kvargs.so.24.1 00:01:46.546 [168/378] Linking static target drivers/libtmp_rte_bus_auxiliary.a 00:01:46.546 [169/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen3.c.o 00:01:46.546 [170/378] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:01:46.546 [171/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen2.c.o 00:01:46.546 
[172/378] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:01:46.546 [173/378] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.546 [174/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:01:46.546 [175/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_pci.c.o 00:01:46.546 [176/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:01:46.546 [177/378] Linking static target drivers/libtmp_rte_bus_vdev.a 00:01:46.546 [178/378] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:01:46.546 [179/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_verbs.c.o 00:01:46.546 [180/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen4.c.o 00:01:46.546 [181/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen1.c.o 00:01:46.546 [182/378] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.546 [183/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:01:46.546 [184/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen3.c.o 00:01:46.546 [185/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:01:46.546 [186/378] Linking target lib/librte_telemetry.so.24.1 00:01:46.546 [187/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen2.c.o 00:01:46.546 [188/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_devx.c.o 00:01:46.546 [189/378] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:01:46.546 [190/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:01:46.804 [191/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_device.c.o 
00:01:46.804 [192/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:01:46.804 [193/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen4.c.o 00:01:46.804 [194/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_utils.c.o 00:01:46.804 [195/378] Linking static target lib/librte_compressdev.a 00:01:46.804 [196/378] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:01:46.804 [197/378] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.804 [198/378] Linking static target lib/librte_power.a 00:01:46.804 [199/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen5.c.o 00:01:46.804 [200/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mp.c.o 00:01:46.804 [201/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen5.c.o 00:01:46.804 [202/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp_pmd.c.o 00:01:46.804 [203/378] Generating symbol file lib/librte_kvargs.so.24.1.p/librte_kvargs.so.24.1.symbols 00:01:46.804 [204/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:01:46.804 [205/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen1.c.o 00:01:46.804 [206/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen5.c.o 00:01:46.804 [207/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_malloc.c.o 00:01:46.804 [208/378] Linking static target drivers/libtmp_rte_bus_pci.a 00:01:46.804 [209/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen2.c.o 00:01:46.804 [210/378] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:01:46.804 [211/378] Compiling C object 
drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_os.c.o 00:01:46.804 [212/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen_lce.c.o 00:01:46.804 [213/378] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:01:46.804 [214/378] Generating drivers/rte_bus_auxiliary.pmd.c with a custom command 00:01:46.804 [215/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:01:46.804 [216/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_crypto.c.o 00:01:46.804 [217/378] Generating symbol file lib/librte_telemetry.so.24.1.p/librte_telemetry.so.24.1.symbols 00:01:46.804 [218/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_auxiliary.c.o 00:01:46.804 [219/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_asym_pmd_gen1.c.o 00:01:46.804 [220/378] Compiling C object drivers/librte_bus_auxiliary.a.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:01:46.804 [221/378] Compiling C object drivers/librte_bus_auxiliary.so.24.1.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:01:46.804 [222/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym.c.o 00:01:46.804 [223/378] Linking static target drivers/librte_bus_auxiliary.a 00:01:46.804 [224/378] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:01:46.804 [225/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen_lce.c.o 00:01:46.804 [226/378] Linking static target lib/librte_security.a 00:01:46.804 [227/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common.c.o 00:01:46.804 [228/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_ops.c.o 00:01:46.804 [229/378] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:01:46.804 [230/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:01:46.804 [231/378] Linking 
static target lib/librte_mbuf.a 00:01:46.804 [232/378] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:01:46.804 [233/378] Compiling C object drivers/librte_bus_vdev.so.24.1.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:46.804 [234/378] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:46.804 [235/378] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.804 [236/378] Linking static target lib/librte_reorder.a 00:01:46.804 [237/378] Linking static target drivers/librte_bus_vdev.a 00:01:46.804 [238/378] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.804 [239/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_nl.c.o 00:01:47.093 [240/378] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:01:47.093 [241/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_private.c.o 00:01:47.093 [242/378] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:47.093 [243/378] Compiling C object drivers/librte_bus_pci.so.24.1.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:47.093 [244/378] Linking static target drivers/librte_bus_pci.a 00:01:47.093 [245/378] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:47.093 [246/378] Generating drivers/rte_bus_auxiliary.sym_chk with a custom command (wrapped by meson to capture output) 00:01:47.093 [247/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_qp.c.o 00:01:47.093 [248/378] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:01:47.093 [249/378] Linking static target drivers/libtmp_rte_mempool_ring.a 00:01:47.093 [250/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:01:47.093 [251/378] Generating lib/cmdline.sym_chk with a custom command (wrapped 
by meson to capture output) 00:01:47.093 [252/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_xts.c.o 00:01:47.093 [253/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mr.c.o 00:01:47.093 [254/378] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:47.093 [255/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto.c.o 00:01:47.093 [256/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_dek.c.o 00:01:47.352 [257/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_gcm.c.o 00:01:47.352 [258/378] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:47.352 [259/378] Linking static target drivers/libtmp_rte_crypto_mlx5.a 00:01:47.352 [260/378] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:01:47.352 [261/378] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:01:47.352 [262/378] Compiling C object drivers/libtmp_rte_compress_mlx5.a.p/compress_mlx5_mlx5_compress.c.o 00:01:47.352 [263/378] Linking static target drivers/libtmp_rte_compress_mlx5.a 00:01:47.352 [264/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd.c.o 00:01:47.352 [265/378] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:01:47.352 [266/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd_ops.c.o 00:01:47.352 [267/378] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:47.352 [268/378] Linking static target drivers/libtmp_rte_compress_isal.a 00:01:47.352 [269/378] Linking static target drivers/librte_mempool_ring.a 00:01:47.352 [270/378] Compiling C object 
drivers/librte_mempool_ring.so.24.1.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:47.352 [271/378] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:01:47.352 [272/378] Linking static target lib/librte_hash.a 00:01:47.352 [273/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_chacha_poly.c.o 00:01:47.352 [274/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:01:47.352 [275/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_kasumi.c.o 00:01:47.352 [276/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_zuc.c.o 00:01:47.352 [277/378] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:01:47.352 [278/378] Generating drivers/rte_crypto_mlx5.pmd.c with a custom command 00:01:47.611 [279/378] Compiling C object drivers/librte_crypto_mlx5.so.24.1.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:01:47.611 [280/378] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:47.611 [281/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_gcm.c.o 00:01:47.611 [282/378] Compiling C object drivers/librte_crypto_mlx5.a.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:01:47.611 [283/378] Generating drivers/rte_compress_mlx5.pmd.c with a custom command 00:01:47.611 [284/378] Linking static target drivers/librte_crypto_mlx5.a 00:01:47.611 [285/378] Generating drivers/rte_compress_isal.pmd.c with a custom command 00:01:47.611 [286/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:01:47.611 [287/378] Compiling C object drivers/librte_compress_isal.so.24.1.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:01:47.611 [288/378] Compiling C object drivers/librte_compress_mlx5.a.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:01:47.611 [289/378] Compiling C object 
drivers/librte_compress_mlx5.so.24.1.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:01:47.611 [290/378] Linking static target drivers/librte_compress_mlx5.a 00:01:47.611 [291/378] Compiling C object drivers/librte_compress_isal.a.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:01:47.612 [292/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen4.c.o 00:01:47.612 [293/378] Linking static target drivers/librte_compress_isal.a 00:01:47.612 [294/378] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:47.612 [295/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp.c.o 00:01:47.612 [296/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:01:47.612 [297/378] Linking static target lib/librte_cryptodev.a 00:01:47.612 [298/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_snow3g.c.o 00:01:47.870 [299/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_mb.c.o 00:01:47.870 [300/378] Linking static target drivers/libtmp_rte_crypto_ipsec_mb.a 00:01:47.870 [301/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym_session.c.o 00:01:47.870 [302/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen3.c.o 00:01:48.128 [303/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_devx_cmds.c.o 00:01:48.128 [304/378] Generating drivers/rte_crypto_ipsec_mb.pmd.c with a custom command 00:01:48.128 [305/378] Linking static target drivers/libtmp_rte_common_mlx5.a 00:01:48.128 [306/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_sym_pmd_gen1.c.o 00:01:48.128 [307/378] Compiling C object drivers/librte_crypto_ipsec_mb.so.24.1.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:01:48.128 [308/378] Compiling C object 
drivers/librte_crypto_ipsec_mb.a.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:01:48.128 [309/378] Linking static target drivers/librte_crypto_ipsec_mb.a 00:01:48.128 [310/378] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:01:48.128 [311/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:01:48.128 [312/378] Linking static target lib/librte_ethdev.a 00:01:48.386 [313/378] Generating drivers/rte_common_mlx5.pmd.c with a custom command 00:01:48.386 [314/378] Compiling C object drivers/librte_common_mlx5.a.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:01:48.386 [315/378] Compiling C object drivers/librte_common_mlx5.so.24.1.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:01:48.386 [316/378] Linking static target drivers/librte_common_mlx5.a 00:01:49.317 [317/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:01:49.575 [318/378] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.960 [319/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_asym.c.o 00:01:50.960 [320/378] Linking static target drivers/libtmp_rte_common_qat.a 00:01:51.219 [321/378] Generating drivers/rte_common_qat.pmd.c with a custom command 00:01:51.219 [322/378] Compiling C object drivers/librte_common_qat.so.24.1.p/meson-generated_.._rte_common_qat.pmd.c.o 00:01:51.219 [323/378] Compiling C object drivers/librte_common_qat.a.p/meson-generated_.._rte_common_qat.pmd.c.o 00:01:51.219 [324/378] Linking static target drivers/librte_common_qat.a 00:01:52.592 [325/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:01:52.592 [326/378] Linking static target lib/librte_vhost.a 00:01:52.850 [327/378] Generating drivers/rte_common_mlx5.sym_chk with a custom command (wrapped by meson to capture output) 00:01:54.749 [328/378] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.683 [329/378] 
Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.683 [330/378] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.683 [331/378] Linking target lib/librte_eal.so.24.1 00:01:55.941 [332/378] Generating symbol file lib/librte_eal.so.24.1.p/librte_eal.so.24.1.symbols 00:01:55.941 [333/378] Linking target lib/librte_dmadev.so.24.1 00:01:55.941 [334/378] Linking target lib/librte_pci.so.24.1 00:01:55.941 [335/378] Linking target lib/librte_ring.so.24.1 00:01:55.941 [336/378] Linking target lib/librte_meter.so.24.1 00:01:55.941 [337/378] Linking target lib/librte_timer.so.24.1 00:01:55.941 [338/378] Linking target drivers/librte_bus_auxiliary.so.24.1 00:01:55.941 [339/378] Linking target drivers/librte_bus_vdev.so.24.1 00:01:56.199 [340/378] Generating symbol file lib/librte_pci.so.24.1.p/librte_pci.so.24.1.symbols 00:01:56.199 [341/378] Generating symbol file lib/librte_timer.so.24.1.p/librte_timer.so.24.1.symbols 00:01:56.199 [342/378] Generating symbol file drivers/librte_bus_vdev.so.24.1.p/librte_bus_vdev.so.24.1.symbols 00:01:56.199 [343/378] Generating symbol file lib/librte_dmadev.so.24.1.p/librte_dmadev.so.24.1.symbols 00:01:56.199 [344/378] Generating symbol file lib/librte_ring.so.24.1.p/librte_ring.so.24.1.symbols 00:01:56.199 [345/378] Generating symbol file lib/librte_meter.so.24.1.p/librte_meter.so.24.1.symbols 00:01:56.199 [346/378] Generating symbol file drivers/librte_bus_auxiliary.so.24.1.p/librte_bus_auxiliary.so.24.1.symbols 00:01:56.199 [347/378] Linking target lib/librte_rcu.so.24.1 00:01:56.199 [348/378] Linking target drivers/librte_bus_pci.so.24.1 00:01:56.199 [349/378] Linking target lib/librte_mempool.so.24.1 00:01:56.199 [350/378] Generating symbol file lib/librte_rcu.so.24.1.p/librte_rcu.so.24.1.symbols 00:01:56.199 [351/378] Generating symbol file drivers/librte_bus_pci.so.24.1.p/librte_bus_pci.so.24.1.symbols 00:01:56.199 [352/378] Generating symbol 
file lib/librte_mempool.so.24.1.p/librte_mempool.so.24.1.symbols 00:01:56.458 [353/378] Linking target drivers/librte_mempool_ring.so.24.1 00:01:56.458 [354/378] Linking target lib/librte_mbuf.so.24.1 00:01:56.458 [355/378] Generating symbol file lib/librte_mbuf.so.24.1.p/librte_mbuf.so.24.1.symbols 00:01:56.458 [356/378] Linking target lib/librte_compressdev.so.24.1 00:01:56.458 [357/378] Linking target lib/librte_reorder.so.24.1 00:01:56.458 [358/378] Linking target lib/librte_net.so.24.1 00:01:56.458 [359/378] Linking target lib/librte_cryptodev.so.24.1 00:01:56.716 [360/378] Generating symbol file lib/librte_cryptodev.so.24.1.p/librte_cryptodev.so.24.1.symbols 00:01:56.716 [361/378] Generating symbol file lib/librte_net.so.24.1.p/librte_net.so.24.1.symbols 00:01:56.716 [362/378] Generating symbol file lib/librte_compressdev.so.24.1.p/librte_compressdev.so.24.1.symbols 00:01:56.716 [363/378] Linking target lib/librte_cmdline.so.24.1 00:01:56.716 [364/378] Linking target lib/librte_security.so.24.1 00:01:56.716 [365/378] Linking target lib/librte_hash.so.24.1 00:01:56.716 [366/378] Linking target drivers/librte_compress_isal.so.24.1 00:01:56.716 [367/378] Linking target lib/librte_ethdev.so.24.1 00:01:56.974 [368/378] Generating symbol file lib/librte_security.so.24.1.p/librte_security.so.24.1.symbols 00:01:56.974 [369/378] Generating symbol file lib/librte_hash.so.24.1.p/librte_hash.so.24.1.symbols 00:01:56.974 [370/378] Generating symbol file lib/librte_ethdev.so.24.1.p/librte_ethdev.so.24.1.symbols 00:01:56.974 [371/378] Linking target drivers/librte_common_mlx5.so.24.1 00:01:56.974 [372/378] Linking target lib/librte_power.so.24.1 00:01:56.974 [373/378] Linking target lib/librte_vhost.so.24.1 00:01:56.974 [374/378] Generating symbol file drivers/librte_common_mlx5.so.24.1.p/librte_common_mlx5.so.24.1.symbols 00:01:56.974 [375/378] Linking target drivers/librte_crypto_ipsec_mb.so.24.1 00:01:57.232 [376/378] Linking target drivers/librte_common_qat.so.24.1 
00:01:57.232 [377/378] Linking target drivers/librte_crypto_mlx5.so.24.1 00:01:57.232 [378/378] Linking target drivers/librte_compress_mlx5.so.24.1 00:01:57.232 INFO: autodetecting backend as ninja 00:01:57.232 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp -j 96 00:01:58.167 CC lib/ut/ut.o 00:01:58.167 CC lib/log/log.o 00:01:58.167 CC lib/log/log_flags.o 00:01:58.167 CC lib/log/log_deprecated.o 00:01:58.167 CC lib/ut_mock/mock.o 00:01:58.167 LIB libspdk_log.a 00:01:58.167 LIB libspdk_ut.a 00:01:58.426 LIB libspdk_ut_mock.a 00:01:58.426 SO libspdk_ut.so.2.0 00:01:58.426 SO libspdk_log.so.7.0 00:01:58.426 SO libspdk_ut_mock.so.6.0 00:01:58.426 SYMLINK libspdk_ut.so 00:01:58.426 SYMLINK libspdk_log.so 00:01:58.426 SYMLINK libspdk_ut_mock.so 00:01:58.684 CXX lib/trace_parser/trace.o 00:01:58.684 CC lib/dma/dma.o 00:01:58.684 CC lib/ioat/ioat.o 00:01:58.684 CC lib/util/bit_array.o 00:01:58.684 CC lib/util/base64.o 00:01:58.684 CC lib/util/crc16.o 00:01:58.684 CC lib/util/cpuset.o 00:01:58.684 CC lib/util/crc32.o 00:01:58.684 CC lib/util/crc32_ieee.o 00:01:58.684 CC lib/util/crc32c.o 00:01:58.684 CC lib/util/crc64.o 00:01:58.684 CC lib/util/dif.o 00:01:58.684 CC lib/util/fd.o 00:01:58.684 CC lib/util/fd_group.o 00:01:58.684 CC lib/util/file.o 00:01:58.684 CC lib/util/hexlify.o 00:01:58.684 CC lib/util/iov.o 00:01:58.684 CC lib/util/math.o 00:01:58.684 CC lib/util/net.o 00:01:58.684 CC lib/util/pipe.o 00:01:58.684 CC lib/util/strerror_tls.o 00:01:58.684 CC lib/util/string.o 00:01:58.684 CC lib/util/uuid.o 00:01:58.684 CC lib/util/xor.o 00:01:58.684 CC lib/util/zipf.o 00:01:58.941 CC lib/vfio_user/host/vfio_user_pci.o 00:01:58.941 CC lib/vfio_user/host/vfio_user.o 00:01:58.941 LIB libspdk_dma.a 00:01:58.941 SO libspdk_dma.so.4.0 00:01:58.941 SYMLINK libspdk_dma.so 00:01:58.941 LIB libspdk_ioat.a 00:01:58.941 SO libspdk_ioat.so.7.0 00:01:59.199 SYMLINK libspdk_ioat.so 00:01:59.199 LIB 
libspdk_vfio_user.a 00:01:59.199 SO libspdk_vfio_user.so.5.0 00:01:59.199 SYMLINK libspdk_vfio_user.so 00:01:59.199 LIB libspdk_util.a 00:01:59.457 SO libspdk_util.so.10.0 00:01:59.457 SYMLINK libspdk_util.so 00:01:59.457 LIB libspdk_trace_parser.a 00:01:59.457 SO libspdk_trace_parser.so.5.0 00:01:59.718 SYMLINK libspdk_trace_parser.so 00:01:59.718 CC lib/rdma_provider/rdma_provider_verbs.o 00:01:59.718 CC lib/rdma_provider/common.o 00:01:59.718 CC lib/reduce/reduce.o 00:01:59.718 CC lib/idxd/idxd.o 00:01:59.718 CC lib/rdma_utils/rdma_utils.o 00:01:59.718 CC lib/conf/conf.o 00:01:59.718 CC lib/idxd/idxd_user.o 00:01:59.718 CC lib/json/json_util.o 00:01:59.718 CC lib/idxd/idxd_kernel.o 00:01:59.718 CC lib/json/json_parse.o 00:01:59.718 CC lib/json/json_write.o 00:01:59.718 CC lib/env_dpdk/env.o 00:01:59.718 CC lib/env_dpdk/memory.o 00:01:59.718 CC lib/env_dpdk/pci.o 00:01:59.718 CC lib/env_dpdk/init.o 00:01:59.718 CC lib/env_dpdk/threads.o 00:01:59.718 CC lib/env_dpdk/pci_ioat.o 00:01:59.718 CC lib/vmd/vmd.o 00:01:59.718 CC lib/env_dpdk/pci_virtio.o 00:01:59.718 CC lib/vmd/led.o 00:01:59.718 CC lib/env_dpdk/pci_vmd.o 00:01:59.718 CC lib/env_dpdk/pci_idxd.o 00:01:59.718 CC lib/env_dpdk/pci_event.o 00:01:59.718 CC lib/env_dpdk/sigbus_handler.o 00:01:59.718 CC lib/env_dpdk/pci_dpdk.o 00:01:59.718 CC lib/env_dpdk/pci_dpdk_2211.o 00:01:59.718 CC lib/env_dpdk/pci_dpdk_2207.o 00:01:59.981 LIB libspdk_rdma_provider.a 00:01:59.981 SO libspdk_rdma_provider.so.6.0 00:01:59.981 LIB libspdk_conf.a 00:01:59.981 SYMLINK libspdk_rdma_provider.so 00:01:59.981 LIB libspdk_rdma_utils.a 00:01:59.981 SO libspdk_conf.so.6.0 00:01:59.981 LIB libspdk_json.a 00:01:59.981 SO libspdk_rdma_utils.so.1.0 00:02:00.239 SO libspdk_json.so.6.0 00:02:00.239 SYMLINK libspdk_conf.so 00:02:00.240 SYMLINK libspdk_rdma_utils.so 00:02:00.240 SYMLINK libspdk_json.so 00:02:00.498 LIB libspdk_idxd.a 00:02:00.498 SO libspdk_idxd.so.12.0 00:02:00.498 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:02:00.498 CC 
lib/jsonrpc/jsonrpc_server.o 00:02:00.498 CC lib/jsonrpc/jsonrpc_client.o 00:02:00.498 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:02:00.498 LIB libspdk_reduce.a 00:02:00.498 LIB libspdk_vmd.a 00:02:00.498 SYMLINK libspdk_idxd.so 00:02:00.498 SO libspdk_vmd.so.6.0 00:02:00.498 SO libspdk_reduce.so.6.1 00:02:00.498 SYMLINK libspdk_reduce.so 00:02:00.498 SYMLINK libspdk_vmd.so 00:02:00.757 LIB libspdk_jsonrpc.a 00:02:00.757 SO libspdk_jsonrpc.so.6.0 00:02:00.757 SYMLINK libspdk_jsonrpc.so 00:02:01.014 CC lib/rpc/rpc.o 00:02:01.273 LIB libspdk_env_dpdk.a 00:02:01.273 SO libspdk_env_dpdk.so.15.0 00:02:01.273 LIB libspdk_rpc.a 00:02:01.273 SO libspdk_rpc.so.6.0 00:02:01.273 SYMLINK libspdk_env_dpdk.so 00:02:01.273 SYMLINK libspdk_rpc.so 00:02:01.840 CC lib/trace/trace.o 00:02:01.840 CC lib/trace/trace_flags.o 00:02:01.840 CC lib/notify/notify.o 00:02:01.840 CC lib/trace/trace_rpc.o 00:02:01.840 CC lib/notify/notify_rpc.o 00:02:01.840 CC lib/keyring/keyring.o 00:02:01.840 CC lib/keyring/keyring_rpc.o 00:02:01.840 LIB libspdk_notify.a 00:02:01.840 SO libspdk_notify.so.6.0 00:02:01.840 LIB libspdk_trace.a 00:02:01.840 LIB libspdk_keyring.a 00:02:01.840 SYMLINK libspdk_notify.so 00:02:01.840 SO libspdk_trace.so.10.0 00:02:01.840 SO libspdk_keyring.so.1.0 00:02:02.098 SYMLINK libspdk_trace.so 00:02:02.098 SYMLINK libspdk_keyring.so 00:02:02.357 CC lib/sock/sock.o 00:02:02.357 CC lib/sock/sock_rpc.o 00:02:02.357 CC lib/thread/thread.o 00:02:02.357 CC lib/thread/iobuf.o 00:02:02.615 LIB libspdk_sock.a 00:02:02.615 SO libspdk_sock.so.10.0 00:02:02.874 SYMLINK libspdk_sock.so 00:02:03.132 CC lib/nvme/nvme_ctrlr_cmd.o 00:02:03.132 CC lib/nvme/nvme_fabric.o 00:02:03.132 CC lib/nvme/nvme_ctrlr.o 00:02:03.132 CC lib/nvme/nvme_ns.o 00:02:03.132 CC lib/nvme/nvme_ns_cmd.o 00:02:03.132 CC lib/nvme/nvme_pcie_common.o 00:02:03.132 CC lib/nvme/nvme_pcie.o 00:02:03.132 CC lib/nvme/nvme_qpair.o 00:02:03.132 CC lib/nvme/nvme.o 00:02:03.132 CC lib/nvme/nvme_quirks.o 00:02:03.132 CC 
lib/nvme/nvme_transport.o 00:02:03.132 CC lib/nvme/nvme_discovery.o 00:02:03.132 CC lib/nvme/nvme_opal.o 00:02:03.132 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:02:03.132 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:02:03.132 CC lib/nvme/nvme_tcp.o 00:02:03.132 CC lib/nvme/nvme_io_msg.o 00:02:03.132 CC lib/nvme/nvme_poll_group.o 00:02:03.132 CC lib/nvme/nvme_zns.o 00:02:03.132 CC lib/nvme/nvme_stubs.o 00:02:03.132 CC lib/nvme/nvme_auth.o 00:02:03.132 CC lib/nvme/nvme_cuse.o 00:02:03.132 CC lib/nvme/nvme_rdma.o 00:02:03.698 LIB libspdk_thread.a 00:02:03.698 SO libspdk_thread.so.10.1 00:02:03.698 SYMLINK libspdk_thread.so 00:02:03.956 CC lib/accel/accel.o 00:02:03.956 CC lib/accel/accel_rpc.o 00:02:03.956 CC lib/accel/accel_sw.o 00:02:04.214 CC lib/virtio/virtio_vhost_user.o 00:02:04.214 CC lib/virtio/virtio.o 00:02:04.214 CC lib/virtio/virtio_vfio_user.o 00:02:04.214 CC lib/virtio/virtio_pci.o 00:02:04.214 CC lib/blob/blobstore.o 00:02:04.214 CC lib/blob/request.o 00:02:04.214 CC lib/blob/zeroes.o 00:02:04.214 CC lib/blob/blob_bs_dev.o 00:02:04.214 CC lib/init/subsystem.o 00:02:04.214 CC lib/init/json_config.o 00:02:04.214 CC lib/init/subsystem_rpc.o 00:02:04.214 CC lib/init/rpc.o 00:02:04.472 LIB libspdk_init.a 00:02:04.472 SO libspdk_init.so.5.0 00:02:04.472 LIB libspdk_virtio.a 00:02:04.472 SYMLINK libspdk_init.so 00:02:04.472 SO libspdk_virtio.so.7.0 00:02:04.472 SYMLINK libspdk_virtio.so 00:02:04.731 CC lib/event/reactor.o 00:02:04.731 CC lib/event/app.o 00:02:04.731 CC lib/event/log_rpc.o 00:02:04.731 CC lib/event/app_rpc.o 00:02:04.731 CC lib/event/scheduler_static.o 00:02:04.990 LIB libspdk_accel.a 00:02:04.990 SO libspdk_accel.so.16.0 00:02:05.248 SYMLINK libspdk_accel.so 00:02:05.248 LIB libspdk_event.a 00:02:05.248 LIB libspdk_nvme.a 00:02:05.248 SO libspdk_event.so.14.0 00:02:05.248 SO libspdk_nvme.so.13.1 00:02:05.248 SYMLINK libspdk_event.so 00:02:05.506 CC lib/bdev/bdev.o 00:02:05.506 CC lib/bdev/bdev_rpc.o 00:02:05.506 CC lib/bdev/bdev_zone.o 00:02:05.506 CC 
lib/bdev/scsi_nvme.o 00:02:05.506 CC lib/bdev/part.o 00:02:05.506 SYMLINK libspdk_nvme.so 00:02:06.883 LIB libspdk_blob.a 00:02:07.142 SO libspdk_blob.so.11.0 00:02:07.142 SYMLINK libspdk_blob.so 00:02:07.400 CC lib/lvol/lvol.o 00:02:07.400 CC lib/blobfs/blobfs.o 00:02:07.400 CC lib/blobfs/tree.o 00:02:07.968 LIB libspdk_bdev.a 00:02:07.968 SO libspdk_bdev.so.16.0 00:02:07.968 SYMLINK libspdk_bdev.so 00:02:08.228 LIB libspdk_blobfs.a 00:02:08.228 SO libspdk_blobfs.so.10.0 00:02:08.228 CC lib/ublk/ublk.o 00:02:08.228 CC lib/ublk/ublk_rpc.o 00:02:08.228 CC lib/ftl/ftl_core.o 00:02:08.228 CC lib/ftl/ftl_init.o 00:02:08.228 CC lib/ftl/ftl_layout.o 00:02:08.228 LIB libspdk_lvol.a 00:02:08.228 CC lib/nvmf/ctrlr_discovery.o 00:02:08.228 CC lib/ftl/ftl_debug.o 00:02:08.228 CC lib/nvmf/ctrlr.o 00:02:08.228 CC lib/ftl/ftl_sb.o 00:02:08.228 CC lib/ftl/ftl_io.o 00:02:08.228 CC lib/nvmf/ctrlr_bdev.o 00:02:08.228 CC lib/nvmf/subsystem.o 00:02:08.228 CC lib/ftl/ftl_l2p.o 00:02:08.228 CC lib/ftl/ftl_nv_cache.o 00:02:08.228 CC lib/ftl/ftl_l2p_flat.o 00:02:08.228 CC lib/nvmf/nvmf.o 00:02:08.228 CC lib/nvmf/nvmf_rpc.o 00:02:08.228 CC lib/ftl/ftl_band.o 00:02:08.228 CC lib/scsi/dev.o 00:02:08.228 CC lib/nvmf/transport.o 00:02:08.228 CC lib/ftl/ftl_band_ops.o 00:02:08.228 CC lib/scsi/lun.o 00:02:08.228 CC lib/ftl/ftl_writer.o 00:02:08.228 CC lib/nvmf/tcp.o 00:02:08.228 CC lib/ftl/ftl_rq.o 00:02:08.228 CC lib/ftl/ftl_reloc.o 00:02:08.228 CC lib/scsi/port.o 00:02:08.228 CC lib/nvmf/stubs.o 00:02:08.228 CC lib/ftl/ftl_p2l.o 00:02:08.228 CC lib/nvmf/mdns_server.o 00:02:08.228 CC lib/scsi/scsi_bdev.o 00:02:08.228 CC lib/nvmf/rdma.o 00:02:08.228 CC lib/ftl/mngt/ftl_mngt.o 00:02:08.228 CC lib/ftl/ftl_l2p_cache.o 00:02:08.228 CC lib/scsi/scsi.o 00:02:08.228 CC lib/scsi/scsi_pr.o 00:02:08.228 CC lib/nvmf/auth.o 00:02:08.228 CC lib/scsi/scsi_rpc.o 00:02:08.228 CC lib/nbd/nbd_rpc.o 00:02:08.228 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:02:08.228 CC lib/nbd/nbd.o 00:02:08.228 CC 
lib/ftl/mngt/ftl_mngt_shutdown.o 00:02:08.228 CC lib/ftl/mngt/ftl_mngt_startup.o 00:02:08.228 CC lib/scsi/task.o 00:02:08.228 CC lib/ftl/mngt/ftl_mngt_md.o 00:02:08.228 CC lib/ftl/mngt/ftl_mngt_misc.o 00:02:08.228 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:02:08.228 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:02:08.228 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:02:08.228 CC lib/ftl/mngt/ftl_mngt_band.o 00:02:08.228 SYMLINK libspdk_blobfs.so 00:02:08.228 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:02:08.228 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:02:08.228 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:02:08.228 CC lib/ftl/utils/ftl_conf.o 00:02:08.228 CC lib/ftl/utils/ftl_mempool.o 00:02:08.228 CC lib/ftl/utils/ftl_md.o 00:02:08.228 CC lib/ftl/utils/ftl_bitmap.o 00:02:08.228 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:02:08.228 CC lib/ftl/utils/ftl_property.o 00:02:08.228 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:02:08.228 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:02:08.228 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:02:08.228 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:02:08.228 CC lib/ftl/upgrade/ftl_sb_v3.o 00:02:08.228 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:02:08.228 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:02:08.228 CC lib/ftl/upgrade/ftl_sb_v5.o 00:02:08.228 CC lib/ftl/nvc/ftl_nvc_dev.o 00:02:08.228 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:02:08.228 CC lib/ftl/base/ftl_base_bdev.o 00:02:08.228 CC lib/ftl/base/ftl_base_dev.o 00:02:08.228 CC lib/ftl/ftl_trace.o 00:02:08.486 SO libspdk_lvol.so.10.0 00:02:08.486 SYMLINK libspdk_lvol.so 00:02:09.054 LIB libspdk_nbd.a 00:02:09.054 SO libspdk_nbd.so.7.0 00:02:09.054 SYMLINK libspdk_nbd.so 00:02:09.054 LIB libspdk_scsi.a 00:02:09.054 LIB libspdk_ublk.a 00:02:09.054 SO libspdk_scsi.so.9.0 00:02:09.054 SO libspdk_ublk.so.3.0 00:02:09.313 SYMLINK libspdk_scsi.so 00:02:09.313 SYMLINK libspdk_ublk.so 00:02:09.572 CC lib/iscsi/conn.o 00:02:09.572 CC lib/iscsi/init_grp.o 00:02:09.572 CC lib/iscsi/md5.o 00:02:09.572 CC lib/iscsi/iscsi.o 00:02:09.572 CC lib/iscsi/param.o 
00:02:09.572 CC lib/iscsi/portal_grp.o 00:02:09.572 CC lib/iscsi/iscsi_subsystem.o 00:02:09.572 CC lib/iscsi/tgt_node.o 00:02:09.572 CC lib/iscsi/iscsi_rpc.o 00:02:09.572 CC lib/iscsi/task.o 00:02:09.572 CC lib/vhost/vhost.o 00:02:09.572 CC lib/vhost/vhost_scsi.o 00:02:09.572 CC lib/vhost/vhost_rpc.o 00:02:09.572 CC lib/vhost/rte_vhost_user.o 00:02:09.572 CC lib/vhost/vhost_blk.o 00:02:09.572 LIB libspdk_ftl.a 00:02:09.572 SO libspdk_ftl.so.9.0 00:02:10.139 SYMLINK libspdk_ftl.so 00:02:10.397 LIB libspdk_vhost.a 00:02:10.397 SO libspdk_vhost.so.8.0 00:02:10.656 SYMLINK libspdk_vhost.so 00:02:10.656 LIB libspdk_nvmf.a 00:02:10.656 SO libspdk_nvmf.so.19.0 00:02:10.914 LIB libspdk_iscsi.a 00:02:10.914 SO libspdk_iscsi.so.8.0 00:02:10.914 SYMLINK libspdk_nvmf.so 00:02:10.914 SYMLINK libspdk_iscsi.so 00:02:11.481 CC module/env_dpdk/env_dpdk_rpc.o 00:02:11.481 CC module/sock/posix/posix.o 00:02:11.481 CC module/keyring/linux/keyring.o 00:02:11.481 CC module/keyring/linux/keyring_rpc.o 00:02:11.481 CC module/accel/ioat/accel_ioat.o 00:02:11.481 CC module/accel/ioat/accel_ioat_rpc.o 00:02:11.481 CC module/accel/error/accel_error_rpc.o 00:02:11.481 CC module/accel/error/accel_error.o 00:02:11.481 LIB libspdk_env_dpdk_rpc.a 00:02:11.481 CC module/blob/bdev/blob_bdev.o 00:02:11.481 CC module/accel/iaa/accel_iaa.o 00:02:11.481 CC module/accel/iaa/accel_iaa_rpc.o 00:02:11.481 CC module/keyring/file/keyring.o 00:02:11.481 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:02:11.481 CC module/keyring/file/keyring_rpc.o 00:02:11.481 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev.o 00:02:11.739 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev_rpc.o 00:02:11.739 CC module/accel/dsa/accel_dsa.o 00:02:11.739 CC module/accel/dsa/accel_dsa_rpc.o 00:02:11.739 CC module/scheduler/dynamic/scheduler_dynamic.o 00:02:11.739 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev.o 00:02:11.739 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev_rpc.o 00:02:11.739 CC 
module/scheduler/gscheduler/gscheduler.o 00:02:11.739 SO libspdk_env_dpdk_rpc.so.6.0 00:02:11.739 SYMLINK libspdk_env_dpdk_rpc.so 00:02:11.739 LIB libspdk_keyring_linux.a 00:02:11.739 LIB libspdk_keyring_file.a 00:02:11.739 SO libspdk_keyring_linux.so.1.0 00:02:11.739 LIB libspdk_accel_ioat.a 00:02:11.739 LIB libspdk_accel_error.a 00:02:11.739 LIB libspdk_scheduler_dpdk_governor.a 00:02:11.739 LIB libspdk_scheduler_gscheduler.a 00:02:11.739 SO libspdk_accel_error.so.2.0 00:02:11.739 SO libspdk_accel_ioat.so.6.0 00:02:11.739 SO libspdk_keyring_file.so.1.0 00:02:11.739 SO libspdk_scheduler_dpdk_governor.so.4.0 00:02:11.739 LIB libspdk_scheduler_dynamic.a 00:02:11.739 LIB libspdk_accel_iaa.a 00:02:11.739 SYMLINK libspdk_keyring_linux.so 00:02:11.739 SO libspdk_scheduler_gscheduler.so.4.0 00:02:12.005 SO libspdk_scheduler_dynamic.so.4.0 00:02:12.005 SO libspdk_accel_iaa.so.3.0 00:02:12.005 SYMLINK libspdk_scheduler_dpdk_governor.so 00:02:12.005 SYMLINK libspdk_accel_error.so 00:02:12.005 SYMLINK libspdk_keyring_file.so 00:02:12.005 SYMLINK libspdk_accel_ioat.so 00:02:12.005 LIB libspdk_accel_dsa.a 00:02:12.005 SYMLINK libspdk_scheduler_gscheduler.so 00:02:12.006 LIB libspdk_blob_bdev.a 00:02:12.006 SO libspdk_accel_dsa.so.5.0 00:02:12.006 SO libspdk_blob_bdev.so.11.0 00:02:12.006 SYMLINK libspdk_scheduler_dynamic.so 00:02:12.006 SYMLINK libspdk_accel_iaa.so 00:02:12.006 SYMLINK libspdk_blob_bdev.so 00:02:12.006 SYMLINK libspdk_accel_dsa.so 00:02:12.288 LIB libspdk_sock_posix.a 00:02:12.288 SO libspdk_sock_posix.so.6.0 00:02:12.288 CC module/bdev/delay/vbdev_delay.o 00:02:12.288 CC module/bdev/delay/vbdev_delay_rpc.o 00:02:12.288 CC module/bdev/lvol/vbdev_lvol.o 00:02:12.288 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:02:12.288 CC module/bdev/aio/bdev_aio.o 00:02:12.288 CC module/bdev/compress/vbdev_compress.o 00:02:12.288 CC module/bdev/zone_block/vbdev_zone_block.o 00:02:12.288 CC module/bdev/null/bdev_null.o 00:02:12.288 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 
00:02:12.288 CC module/bdev/compress/vbdev_compress_rpc.o 00:02:12.288 CC module/bdev/aio/bdev_aio_rpc.o 00:02:12.288 CC module/bdev/error/vbdev_error.o 00:02:12.288 CC module/bdev/null/bdev_null_rpc.o 00:02:12.288 CC module/bdev/error/vbdev_error_rpc.o 00:02:12.288 CC module/bdev/malloc/bdev_malloc_rpc.o 00:02:12.546 CC module/bdev/split/vbdev_split.o 00:02:12.546 CC module/bdev/malloc/bdev_malloc.o 00:02:12.546 CC module/bdev/split/vbdev_split_rpc.o 00:02:12.546 CC module/blobfs/bdev/blobfs_bdev.o 00:02:12.546 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:02:12.546 CC module/bdev/nvme/bdev_nvme.o 00:02:12.546 CC module/bdev/nvme/bdev_nvme_rpc.o 00:02:12.546 CC module/bdev/ftl/bdev_ftl.o 00:02:12.546 CC module/bdev/nvme/nvme_rpc.o 00:02:12.546 CC module/bdev/nvme/bdev_mdns_client.o 00:02:12.546 CC module/bdev/iscsi/bdev_iscsi.o 00:02:12.546 CC module/bdev/nvme/vbdev_opal.o 00:02:12.546 CC module/bdev/ftl/bdev_ftl_rpc.o 00:02:12.546 CC module/bdev/nvme/vbdev_opal_rpc.o 00:02:12.546 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:02:12.546 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:02:12.546 CC module/bdev/virtio/bdev_virtio_blk.o 00:02:12.546 CC module/bdev/virtio/bdev_virtio_scsi.o 00:02:12.546 CC module/bdev/gpt/gpt.o 00:02:12.546 CC module/bdev/virtio/bdev_virtio_rpc.o 00:02:12.546 CC module/bdev/crypto/vbdev_crypto_rpc.o 00:02:12.546 CC module/bdev/gpt/vbdev_gpt.o 00:02:12.546 CC module/bdev/crypto/vbdev_crypto.o 00:02:12.546 CC module/bdev/raid/bdev_raid_rpc.o 00:02:12.546 CC module/bdev/raid/bdev_raid.o 00:02:12.546 CC module/bdev/raid/bdev_raid_sb.o 00:02:12.546 CC module/bdev/raid/raid1.o 00:02:12.546 CC module/bdev/raid/raid0.o 00:02:12.546 CC module/bdev/raid/concat.o 00:02:12.546 CC module/bdev/passthru/vbdev_passthru.o 00:02:12.546 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:02:12.547 SYMLINK libspdk_sock_posix.so 00:02:12.804 LIB libspdk_accel_dpdk_compressdev.a 00:02:12.804 LIB libspdk_bdev_split.a 00:02:12.804 SO libspdk_accel_dpdk_compressdev.so.3.0 
00:02:12.804 LIB libspdk_bdev_null.a 00:02:12.804 LIB libspdk_blobfs_bdev.a 00:02:12.804 SO libspdk_bdev_split.so.6.0 00:02:12.804 LIB libspdk_bdev_ftl.a 00:02:12.804 SO libspdk_bdev_null.so.6.0 00:02:12.804 SO libspdk_blobfs_bdev.so.6.0 00:02:12.804 LIB libspdk_bdev_gpt.a 00:02:12.804 LIB libspdk_bdev_error.a 00:02:12.804 SO libspdk_bdev_ftl.so.6.0 00:02:12.804 SO libspdk_bdev_gpt.so.6.0 00:02:12.804 SYMLINK libspdk_bdev_split.so 00:02:12.804 LIB libspdk_bdev_passthru.a 00:02:12.804 SO libspdk_bdev_error.so.6.0 00:02:12.804 SYMLINK libspdk_accel_dpdk_compressdev.so 00:02:12.804 LIB libspdk_bdev_aio.a 00:02:12.804 LIB libspdk_bdev_compress.a 00:02:12.804 SYMLINK libspdk_bdev_null.so 00:02:12.804 LIB libspdk_bdev_delay.a 00:02:12.804 SYMLINK libspdk_blobfs_bdev.so 00:02:12.804 LIB libspdk_bdev_zone_block.a 00:02:12.804 SO libspdk_bdev_compress.so.6.0 00:02:12.804 LIB libspdk_bdev_malloc.a 00:02:12.804 SO libspdk_bdev_passthru.so.6.0 00:02:12.804 SO libspdk_bdev_aio.so.6.0 00:02:12.804 LIB libspdk_bdev_crypto.a 00:02:12.804 SYMLINK libspdk_bdev_gpt.so 00:02:12.804 SO libspdk_bdev_delay.so.6.0 00:02:12.804 SYMLINK libspdk_bdev_ftl.so 00:02:12.804 LIB libspdk_bdev_iscsi.a 00:02:12.804 SYMLINK libspdk_bdev_error.so 00:02:12.804 SO libspdk_bdev_zone_block.so.6.0 00:02:12.804 SO libspdk_bdev_malloc.so.6.0 00:02:12.804 SO libspdk_bdev_crypto.so.6.0 00:02:12.804 SO libspdk_bdev_iscsi.so.6.0 00:02:13.062 SYMLINK libspdk_bdev_compress.so 00:02:13.062 SYMLINK libspdk_bdev_aio.so 00:02:13.062 SYMLINK libspdk_bdev_passthru.so 00:02:13.062 SYMLINK libspdk_bdev_delay.so 00:02:13.062 SYMLINK libspdk_bdev_zone_block.so 00:02:13.062 SYMLINK libspdk_bdev_malloc.so 00:02:13.062 SYMLINK libspdk_bdev_iscsi.so 00:02:13.062 SYMLINK libspdk_bdev_crypto.so 00:02:13.062 LIB libspdk_bdev_lvol.a 00:02:13.062 LIB libspdk_bdev_virtio.a 00:02:13.062 SO libspdk_bdev_lvol.so.6.0 00:02:13.062 SO libspdk_bdev_virtio.so.6.0 00:02:13.062 LIB libspdk_accel_dpdk_cryptodev.a 00:02:13.062 SO 
libspdk_accel_dpdk_cryptodev.so.3.0 00:02:13.062 SYMLINK libspdk_bdev_lvol.so 00:02:13.062 SYMLINK libspdk_bdev_virtio.so 00:02:13.321 SYMLINK libspdk_accel_dpdk_cryptodev.so 00:02:13.579 LIB libspdk_bdev_raid.a 00:02:13.579 SO libspdk_bdev_raid.so.6.0 00:02:13.579 SYMLINK libspdk_bdev_raid.so 00:02:14.516 LIB libspdk_bdev_nvme.a 00:02:14.516 SO libspdk_bdev_nvme.so.7.0 00:02:14.774 SYMLINK libspdk_bdev_nvme.so 00:02:15.341 CC module/event/subsystems/vmd/vmd.o 00:02:15.341 CC module/event/subsystems/vmd/vmd_rpc.o 00:02:15.341 CC module/event/subsystems/iobuf/iobuf.o 00:02:15.341 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:02:15.341 CC module/event/subsystems/scheduler/scheduler.o 00:02:15.341 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:02:15.341 CC module/event/subsystems/keyring/keyring.o 00:02:15.341 CC module/event/subsystems/sock/sock.o 00:02:15.341 LIB libspdk_event_vmd.a 00:02:15.341 LIB libspdk_event_scheduler.a 00:02:15.341 LIB libspdk_event_iobuf.a 00:02:15.341 LIB libspdk_event_vhost_blk.a 00:02:15.341 LIB libspdk_event_keyring.a 00:02:15.341 LIB libspdk_event_sock.a 00:02:15.341 SO libspdk_event_vmd.so.6.0 00:02:15.341 SO libspdk_event_scheduler.so.4.0 00:02:15.341 SO libspdk_event_vhost_blk.so.3.0 00:02:15.341 SO libspdk_event_iobuf.so.3.0 00:02:15.599 SO libspdk_event_keyring.so.1.0 00:02:15.599 SO libspdk_event_sock.so.5.0 00:02:15.599 SYMLINK libspdk_event_scheduler.so 00:02:15.599 SYMLINK libspdk_event_vmd.so 00:02:15.599 SYMLINK libspdk_event_vhost_blk.so 00:02:15.599 SYMLINK libspdk_event_keyring.so 00:02:15.599 SYMLINK libspdk_event_iobuf.so 00:02:15.599 SYMLINK libspdk_event_sock.so 00:02:15.857 CC module/event/subsystems/accel/accel.o 00:02:15.857 LIB libspdk_event_accel.a 00:02:16.115 SO libspdk_event_accel.so.6.0 00:02:16.115 SYMLINK libspdk_event_accel.so 00:02:16.374 CC module/event/subsystems/bdev/bdev.o 00:02:16.632 LIB libspdk_event_bdev.a 00:02:16.632 SO libspdk_event_bdev.so.6.0 00:02:16.632 SYMLINK libspdk_event_bdev.so 
00:02:16.890 CC module/event/subsystems/nbd/nbd.o 00:02:16.890 CC module/event/subsystems/scsi/scsi.o 00:02:16.890 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:02:16.890 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:02:16.890 CC module/event/subsystems/ublk/ublk.o 00:02:16.890 LIB libspdk_event_nbd.a 00:02:17.149 SO libspdk_event_nbd.so.6.0 00:02:17.149 LIB libspdk_event_scsi.a 00:02:17.149 LIB libspdk_event_ublk.a 00:02:17.149 SO libspdk_event_ublk.so.3.0 00:02:17.149 SO libspdk_event_scsi.so.6.0 00:02:17.149 LIB libspdk_event_nvmf.a 00:02:17.149 SYMLINK libspdk_event_nbd.so 00:02:17.149 SO libspdk_event_nvmf.so.6.0 00:02:17.149 SYMLINK libspdk_event_ublk.so 00:02:17.149 SYMLINK libspdk_event_scsi.so 00:02:17.149 SYMLINK libspdk_event_nvmf.so 00:02:17.406 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:02:17.406 CC module/event/subsystems/iscsi/iscsi.o 00:02:17.665 LIB libspdk_event_vhost_scsi.a 00:02:17.665 LIB libspdk_event_iscsi.a 00:02:17.665 SO libspdk_event_iscsi.so.6.0 00:02:17.665 SO libspdk_event_vhost_scsi.so.3.0 00:02:17.665 SYMLINK libspdk_event_iscsi.so 00:02:17.665 SYMLINK libspdk_event_vhost_scsi.so 00:02:17.923 SO libspdk.so.6.0 00:02:17.923 SYMLINK libspdk.so 00:02:18.190 CC test/rpc_client/rpc_client_test.o 00:02:18.190 TEST_HEADER include/spdk/accel.h 00:02:18.190 CC app/spdk_nvme_discover/discovery_aer.o 00:02:18.190 TEST_HEADER include/spdk/accel_module.h 00:02:18.190 TEST_HEADER include/spdk/assert.h 00:02:18.190 CXX app/trace/trace.o 00:02:18.190 TEST_HEADER include/spdk/base64.h 00:02:18.190 TEST_HEADER include/spdk/barrier.h 00:02:18.190 CC app/spdk_nvme_perf/perf.o 00:02:18.190 TEST_HEADER include/spdk/bdev.h 00:02:18.190 CC app/spdk_nvme_identify/identify.o 00:02:18.190 TEST_HEADER include/spdk/bdev_zone.h 00:02:18.190 TEST_HEADER include/spdk/bdev_module.h 00:02:18.190 CC app/trace_record/trace_record.o 00:02:18.190 TEST_HEADER include/spdk/bit_pool.h 00:02:18.190 TEST_HEADER include/spdk/bit_array.h 00:02:18.190 TEST_HEADER 
include/spdk/blob_bdev.h 00:02:18.190 TEST_HEADER include/spdk/blobfs_bdev.h 00:02:18.190 TEST_HEADER include/spdk/blob.h 00:02:18.190 TEST_HEADER include/spdk/blobfs.h 00:02:18.190 TEST_HEADER include/spdk/conf.h 00:02:18.190 TEST_HEADER include/spdk/cpuset.h 00:02:18.190 CC app/spdk_lspci/spdk_lspci.o 00:02:18.190 TEST_HEADER include/spdk/config.h 00:02:18.190 TEST_HEADER include/spdk/crc16.h 00:02:18.190 TEST_HEADER include/spdk/crc64.h 00:02:18.190 TEST_HEADER include/spdk/crc32.h 00:02:18.190 TEST_HEADER include/spdk/dif.h 00:02:18.190 TEST_HEADER include/spdk/endian.h 00:02:18.190 TEST_HEADER include/spdk/dma.h 00:02:18.190 TEST_HEADER include/spdk/env_dpdk.h 00:02:18.190 TEST_HEADER include/spdk/env.h 00:02:18.190 CC app/spdk_top/spdk_top.o 00:02:18.190 TEST_HEADER include/spdk/fd.h 00:02:18.190 TEST_HEADER include/spdk/event.h 00:02:18.190 TEST_HEADER include/spdk/fd_group.h 00:02:18.190 TEST_HEADER include/spdk/file.h 00:02:18.190 TEST_HEADER include/spdk/ftl.h 00:02:18.190 TEST_HEADER include/spdk/gpt_spec.h 00:02:18.190 TEST_HEADER include/spdk/histogram_data.h 00:02:18.190 TEST_HEADER include/spdk/hexlify.h 00:02:18.190 TEST_HEADER include/spdk/idxd.h 00:02:18.190 TEST_HEADER include/spdk/idxd_spec.h 00:02:18.190 TEST_HEADER include/spdk/init.h 00:02:18.190 TEST_HEADER include/spdk/ioat.h 00:02:18.190 TEST_HEADER include/spdk/ioat_spec.h 00:02:18.190 TEST_HEADER include/spdk/iscsi_spec.h 00:02:18.190 TEST_HEADER include/spdk/json.h 00:02:18.190 TEST_HEADER include/spdk/jsonrpc.h 00:02:18.190 TEST_HEADER include/spdk/keyring.h 00:02:18.190 TEST_HEADER include/spdk/keyring_module.h 00:02:18.190 TEST_HEADER include/spdk/log.h 00:02:18.190 TEST_HEADER include/spdk/likely.h 00:02:18.190 CC examples/interrupt_tgt/interrupt_tgt.o 00:02:18.190 TEST_HEADER include/spdk/lvol.h 00:02:18.190 TEST_HEADER include/spdk/memory.h 00:02:18.190 TEST_HEADER include/spdk/mmio.h 00:02:18.190 TEST_HEADER include/spdk/nbd.h 00:02:18.190 TEST_HEADER include/spdk/net.h 
00:02:18.190 TEST_HEADER include/spdk/nvme.h 00:02:18.190 TEST_HEADER include/spdk/notify.h 00:02:18.190 TEST_HEADER include/spdk/nvme_intel.h 00:02:18.190 TEST_HEADER include/spdk/nvme_ocssd.h 00:02:18.190 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:02:18.190 TEST_HEADER include/spdk/nvme_zns.h 00:02:18.190 TEST_HEADER include/spdk/nvmf_cmd.h 00:02:18.190 TEST_HEADER include/spdk/nvme_spec.h 00:02:18.190 TEST_HEADER include/spdk/nvmf.h 00:02:18.190 TEST_HEADER include/spdk/nvmf_transport.h 00:02:18.190 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:02:18.190 TEST_HEADER include/spdk/nvmf_spec.h 00:02:18.190 TEST_HEADER include/spdk/opal.h 00:02:18.190 TEST_HEADER include/spdk/pci_ids.h 00:02:18.190 TEST_HEADER include/spdk/opal_spec.h 00:02:18.190 TEST_HEADER include/spdk/queue.h 00:02:18.190 TEST_HEADER include/spdk/pipe.h 00:02:18.190 TEST_HEADER include/spdk/reduce.h 00:02:18.190 TEST_HEADER include/spdk/rpc.h 00:02:18.190 TEST_HEADER include/spdk/scheduler.h 00:02:18.190 TEST_HEADER include/spdk/scsi.h 00:02:18.190 CC app/spdk_dd/spdk_dd.o 00:02:18.190 TEST_HEADER include/spdk/sock.h 00:02:18.190 TEST_HEADER include/spdk/scsi_spec.h 00:02:18.190 TEST_HEADER include/spdk/string.h 00:02:18.190 TEST_HEADER include/spdk/stdinc.h 00:02:18.190 CC app/nvmf_tgt/nvmf_main.o 00:02:18.190 TEST_HEADER include/spdk/thread.h 00:02:18.190 TEST_HEADER include/spdk/trace.h 00:02:18.190 TEST_HEADER include/spdk/tree.h 00:02:18.190 TEST_HEADER include/spdk/trace_parser.h 00:02:18.190 TEST_HEADER include/spdk/ublk.h 00:02:18.190 TEST_HEADER include/spdk/util.h 00:02:18.190 TEST_HEADER include/spdk/uuid.h 00:02:18.190 TEST_HEADER include/spdk/version.h 00:02:18.190 TEST_HEADER include/spdk/vfio_user_pci.h 00:02:18.190 TEST_HEADER include/spdk/vfio_user_spec.h 00:02:18.190 TEST_HEADER include/spdk/vhost.h 00:02:18.190 TEST_HEADER include/spdk/vmd.h 00:02:18.190 TEST_HEADER include/spdk/xor.h 00:02:18.190 CC app/iscsi_tgt/iscsi_tgt.o 00:02:18.190 TEST_HEADER include/spdk/zipf.h 
00:02:18.190 CXX test/cpp_headers/accel.o 00:02:18.190 CXX test/cpp_headers/accel_module.o 00:02:18.190 CXX test/cpp_headers/assert.o 00:02:18.190 CXX test/cpp_headers/barrier.o 00:02:18.190 CXX test/cpp_headers/base64.o 00:02:18.190 CXX test/cpp_headers/bdev.o 00:02:18.190 CXX test/cpp_headers/bdev_zone.o 00:02:18.190 CXX test/cpp_headers/bit_array.o 00:02:18.190 CXX test/cpp_headers/bdev_module.o 00:02:18.190 CXX test/cpp_headers/blob_bdev.o 00:02:18.190 CXX test/cpp_headers/bit_pool.o 00:02:18.190 CXX test/cpp_headers/blobfs.o 00:02:18.190 CXX test/cpp_headers/blobfs_bdev.o 00:02:18.190 CXX test/cpp_headers/blob.o 00:02:18.190 CXX test/cpp_headers/conf.o 00:02:18.190 CXX test/cpp_headers/config.o 00:02:18.190 CXX test/cpp_headers/cpuset.o 00:02:18.190 CXX test/cpp_headers/crc16.o 00:02:18.190 CXX test/cpp_headers/crc32.o 00:02:18.190 CXX test/cpp_headers/dif.o 00:02:18.190 CXX test/cpp_headers/crc64.o 00:02:18.190 CXX test/cpp_headers/dma.o 00:02:18.190 CXX test/cpp_headers/env_dpdk.o 00:02:18.190 CXX test/cpp_headers/endian.o 00:02:18.190 CXX test/cpp_headers/fd_group.o 00:02:18.190 CXX test/cpp_headers/env.o 00:02:18.190 CXX test/cpp_headers/event.o 00:02:18.190 CXX test/cpp_headers/fd.o 00:02:18.190 CXX test/cpp_headers/file.o 00:02:18.190 CXX test/cpp_headers/ftl.o 00:02:18.190 CXX test/cpp_headers/hexlify.o 00:02:18.190 CXX test/cpp_headers/gpt_spec.o 00:02:18.190 CXX test/cpp_headers/histogram_data.o 00:02:18.190 CXX test/cpp_headers/idxd.o 00:02:18.190 CXX test/cpp_headers/idxd_spec.o 00:02:18.190 CXX test/cpp_headers/init.o 00:02:18.190 CXX test/cpp_headers/ioat.o 00:02:18.190 CXX test/cpp_headers/ioat_spec.o 00:02:18.190 CXX test/cpp_headers/iscsi_spec.o 00:02:18.190 CXX test/cpp_headers/json.o 00:02:18.190 CXX test/cpp_headers/jsonrpc.o 00:02:18.190 CXX test/cpp_headers/keyring_module.o 00:02:18.190 CXX test/cpp_headers/keyring.o 00:02:18.190 CXX test/cpp_headers/log.o 00:02:18.190 CXX test/cpp_headers/likely.o 00:02:18.190 CXX 
test/cpp_headers/memory.o 00:02:18.190 CXX test/cpp_headers/mmio.o 00:02:18.190 CXX test/cpp_headers/lvol.o 00:02:18.190 CXX test/cpp_headers/nbd.o 00:02:18.190 CXX test/cpp_headers/net.o 00:02:18.190 CXX test/cpp_headers/notify.o 00:02:18.190 CXX test/cpp_headers/nvme.o 00:02:18.190 CXX test/cpp_headers/nvme_ocssd.o 00:02:18.190 CXX test/cpp_headers/nvme_intel.o 00:02:18.190 CXX test/cpp_headers/nvme_ocssd_spec.o 00:02:18.190 CC app/spdk_tgt/spdk_tgt.o 00:02:18.190 CXX test/cpp_headers/nvme_zns.o 00:02:18.190 CXX test/cpp_headers/nvme_spec.o 00:02:18.190 CXX test/cpp_headers/nvmf_cmd.o 00:02:18.190 CXX test/cpp_headers/nvmf_fc_spec.o 00:02:18.190 CXX test/cpp_headers/nvmf_spec.o 00:02:18.190 CXX test/cpp_headers/nvmf.o 00:02:18.190 CXX test/cpp_headers/nvmf_transport.o 00:02:18.190 CXX test/cpp_headers/opal.o 00:02:18.190 CXX test/cpp_headers/pci_ids.o 00:02:18.190 CXX test/cpp_headers/pipe.o 00:02:18.190 CXX test/cpp_headers/opal_spec.o 00:02:18.190 CXX test/cpp_headers/queue.o 00:02:18.190 CC examples/util/zipf/zipf.o 00:02:18.190 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:02:18.190 CC test/env/memory/memory_ut.o 00:02:18.462 CC test/env/vtophys/vtophys.o 00:02:18.462 CC examples/ioat/perf/perf.o 00:02:18.462 CC test/app/jsoncat/jsoncat.o 00:02:18.462 CC test/env/pci/pci_ut.o 00:02:18.462 CC examples/ioat/verify/verify.o 00:02:18.462 CC test/app/histogram_perf/histogram_perf.o 00:02:18.462 CC test/thread/poller_perf/poller_perf.o 00:02:18.462 CC test/app/stub/stub.o 00:02:18.462 CC app/fio/nvme/fio_plugin.o 00:02:18.463 CC test/dma/test_dma/test_dma.o 00:02:18.463 CXX test/cpp_headers/reduce.o 00:02:18.463 LINK spdk_lspci 00:02:18.463 CC test/app/bdev_svc/bdev_svc.o 00:02:18.463 CC app/fio/bdev/fio_plugin.o 00:02:18.725 CC test/env/mem_callbacks/mem_callbacks.o 00:02:18.725 LINK rpc_client_test 00:02:18.725 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:02:18.725 LINK interrupt_tgt 00:02:18.725 LINK spdk_nvme_discover 00:02:18.725 LINK histogram_perf 
00:02:18.725 LINK env_dpdk_post_init 00:02:18.725 LINK iscsi_tgt 00:02:18.989 CXX test/cpp_headers/rpc.o 00:02:18.989 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:02:18.989 CXX test/cpp_headers/scheduler.o 00:02:18.989 CXX test/cpp_headers/scsi.o 00:02:18.989 CXX test/cpp_headers/scsi_spec.o 00:02:18.989 CXX test/cpp_headers/sock.o 00:02:18.989 CXX test/cpp_headers/stdinc.o 00:02:18.990 CXX test/cpp_headers/string.o 00:02:18.990 CXX test/cpp_headers/thread.o 00:02:18.990 CXX test/cpp_headers/trace.o 00:02:18.990 CXX test/cpp_headers/trace_parser.o 00:02:18.990 CXX test/cpp_headers/tree.o 00:02:18.990 CXX test/cpp_headers/ublk.o 00:02:18.990 LINK nvmf_tgt 00:02:18.990 CXX test/cpp_headers/util.o 00:02:18.990 CXX test/cpp_headers/uuid.o 00:02:18.990 CXX test/cpp_headers/version.o 00:02:18.990 LINK jsoncat 00:02:18.990 CXX test/cpp_headers/vfio_user_pci.o 00:02:18.990 LINK zipf 00:02:18.990 CXX test/cpp_headers/vfio_user_spec.o 00:02:18.990 CXX test/cpp_headers/vhost.o 00:02:18.990 CXX test/cpp_headers/vmd.o 00:02:18.990 LINK vtophys 00:02:18.990 CXX test/cpp_headers/xor.o 00:02:18.990 CXX test/cpp_headers/zipf.o 00:02:18.990 LINK poller_perf 00:02:18.990 LINK bdev_svc 00:02:18.990 LINK spdk_trace_record 00:02:18.990 LINK spdk_tgt 00:02:18.990 LINK stub 00:02:18.990 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:02:18.990 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:02:18.990 LINK ioat_perf 00:02:18.990 LINK verify 00:02:19.249 LINK spdk_trace 00:02:19.249 LINK spdk_dd 00:02:19.249 LINK test_dma 00:02:19.249 LINK pci_ut 00:02:19.508 LINK mem_callbacks 00:02:19.508 CC test/event/reactor/reactor.o 00:02:19.508 CC examples/idxd/perf/perf.o 00:02:19.508 CC examples/sock/hello_world/hello_sock.o 00:02:19.508 CC test/event/reactor_perf/reactor_perf.o 00:02:19.508 LINK nvme_fuzz 00:02:19.508 CC test/event/event_perf/event_perf.o 00:02:19.508 CC examples/vmd/lsvmd/lsvmd.o 00:02:19.508 CC examples/vmd/led/led.o 00:02:19.508 CC test/event/scheduler/scheduler.o 00:02:19.508 LINK 
spdk_bdev 00:02:19.508 CC test/event/app_repeat/app_repeat.o 00:02:19.508 CC examples/thread/thread/thread_ex.o 00:02:19.508 LINK vhost_fuzz 00:02:19.508 LINK spdk_nvme 00:02:19.508 CC app/vhost/vhost.o 00:02:19.508 LINK reactor 00:02:19.508 LINK reactor_perf 00:02:19.508 LINK lsvmd 00:02:19.508 LINK event_perf 00:02:19.508 LINK led 00:02:19.766 LINK spdk_top 00:02:19.766 LINK app_repeat 00:02:19.766 CC test/nvme/sgl/sgl.o 00:02:19.766 CC test/nvme/aer/aer.o 00:02:19.766 CC test/nvme/err_injection/err_injection.o 00:02:19.766 CC test/nvme/fused_ordering/fused_ordering.o 00:02:19.766 CC test/nvme/overhead/overhead.o 00:02:19.766 CC test/nvme/fdp/fdp.o 00:02:19.766 CC test/nvme/cuse/cuse.o 00:02:19.766 CC test/nvme/reserve/reserve.o 00:02:19.766 CC test/nvme/reset/reset.o 00:02:19.766 CC test/nvme/compliance/nvme_compliance.o 00:02:19.766 CC test/nvme/e2edp/nvme_dp.o 00:02:19.766 LINK spdk_nvme_perf 00:02:19.766 CC test/nvme/boot_partition/boot_partition.o 00:02:19.766 CC test/nvme/doorbell_aers/doorbell_aers.o 00:02:19.766 CC test/nvme/startup/startup.o 00:02:19.766 CC test/nvme/connect_stress/connect_stress.o 00:02:19.766 LINK spdk_nvme_identify 00:02:19.766 CC test/nvme/simple_copy/simple_copy.o 00:02:19.766 CC test/blobfs/mkfs/mkfs.o 00:02:19.766 CC test/accel/dif/dif.o 00:02:19.766 LINK scheduler 00:02:19.766 LINK hello_sock 00:02:19.766 LINK vhost 00:02:19.766 CC test/lvol/esnap/esnap.o 00:02:19.766 LINK idxd_perf 00:02:19.766 LINK thread 00:02:19.766 LINK boot_partition 00:02:19.766 LINK err_injection 00:02:19.766 LINK connect_stress 00:02:19.766 LINK startup 00:02:19.766 LINK doorbell_aers 00:02:19.766 LINK fused_ordering 00:02:20.025 LINK reserve 00:02:20.025 LINK mkfs 00:02:20.025 LINK memory_ut 00:02:20.025 LINK simple_copy 00:02:20.025 LINK sgl 00:02:20.025 LINK nvme_dp 00:02:20.025 LINK reset 00:02:20.025 LINK aer 00:02:20.025 LINK overhead 00:02:20.025 LINK fdp 00:02:20.025 LINK nvme_compliance 00:02:20.025 CC examples/nvme/hello_world/hello_world.o 
00:02:20.025 CC examples/nvme/cmb_copy/cmb_copy.o 00:02:20.283 CC examples/nvme/nvme_manage/nvme_manage.o 00:02:20.283 CC examples/nvme/abort/abort.o 00:02:20.283 CC examples/nvme/hotplug/hotplug.o 00:02:20.283 CC examples/nvme/reconnect/reconnect.o 00:02:20.283 CC examples/nvme/arbitration/arbitration.o 00:02:20.283 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:02:20.283 LINK dif 00:02:20.283 CC examples/accel/perf/accel_perf.o 00:02:20.283 LINK cmb_copy 00:02:20.283 LINK pmr_persistence 00:02:20.283 CC examples/blob/cli/blobcli.o 00:02:20.283 LINK hello_world 00:02:20.283 CC examples/blob/hello_world/hello_blob.o 00:02:20.543 LINK hotplug 00:02:20.543 LINK reconnect 00:02:20.543 LINK arbitration 00:02:20.543 LINK abort 00:02:20.543 LINK hello_blob 00:02:20.802 LINK nvme_manage 00:02:20.802 CC test/bdev/bdevio/bdevio.o 00:02:20.802 LINK accel_perf 00:02:20.802 LINK iscsi_fuzz 00:02:20.802 LINK blobcli 00:02:21.060 LINK cuse 00:02:21.060 LINK bdevio 00:02:21.318 CC examples/bdev/hello_world/hello_bdev.o 00:02:21.318 CC examples/bdev/bdevperf/bdevperf.o 00:02:21.577 LINK hello_bdev 00:02:22.145 LINK bdevperf 00:02:22.713 CC examples/nvmf/nvmf/nvmf.o 00:02:22.713 LINK nvmf 00:02:24.620 LINK esnap 00:02:24.879 00:02:24.879 real 1m13.123s 00:02:24.879 user 14m51.824s 00:02:24.879 sys 4m7.881s 00:02:24.879 08:15:37 make -- common/autotest_common.sh@1124 -- $ xtrace_disable 00:02:24.879 08:15:37 make -- common/autotest_common.sh@10 -- $ set +x 00:02:24.879 ************************************ 00:02:24.879 END TEST make 00:02:24.879 ************************************ 00:02:24.879 08:15:37 -- common/autotest_common.sh@1142 -- $ return 0 00:02:24.879 08:15:37 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:02:24.879 08:15:37 -- pm/common@29 -- $ signal_monitor_resources TERM 00:02:24.879 08:15:37 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:02:24.879 08:15:37 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:24.879 08:15:37 -- 
pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:02:24.879 08:15:37 -- pm/common@44 -- $ pid=1197364 00:02:24.879 08:15:37 -- pm/common@50 -- $ kill -TERM 1197364 00:02:24.879 08:15:37 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:24.879 08:15:37 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:02:24.879 08:15:37 -- pm/common@44 -- $ pid=1197366 00:02:24.879 08:15:37 -- pm/common@50 -- $ kill -TERM 1197366 00:02:24.879 08:15:37 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:24.879 08:15:37 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:02:24.879 08:15:37 -- pm/common@44 -- $ pid=1197368 00:02:24.879 08:15:37 -- pm/common@50 -- $ kill -TERM 1197368 00:02:24.879 08:15:37 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:24.879 08:15:37 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:02:24.879 08:15:37 -- pm/common@44 -- $ pid=1197398 00:02:24.879 08:15:37 -- pm/common@50 -- $ sudo -E kill -TERM 1197398 00:02:25.138 08:15:37 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:02:25.138 08:15:37 -- nvmf/common.sh@7 -- # uname -s 00:02:25.138 08:15:37 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:02:25.138 08:15:37 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:02:25.138 08:15:37 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:02:25.138 08:15:37 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:02:25.138 08:15:37 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:02:25.138 08:15:37 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:02:25.138 08:15:37 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:02:25.138 08:15:37 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 
00:02:25.138 08:15:37 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:02:25.138 08:15:37 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:02:25.138 08:15:37 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:800e967b-538f-e911-906e-001635649f5c 00:02:25.138 08:15:37 -- nvmf/common.sh@18 -- # NVME_HOSTID=800e967b-538f-e911-906e-001635649f5c 00:02:25.138 08:15:37 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:02:25.138 08:15:37 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:02:25.138 08:15:37 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:02:25.138 08:15:37 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:02:25.138 08:15:37 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:02:25.138 08:15:37 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:02:25.138 08:15:37 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:02:25.138 08:15:37 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:02:25.138 08:15:37 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:25.138 08:15:37 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:25.138 08:15:37 -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:25.138 08:15:37 -- paths/export.sh@5 -- # export PATH 00:02:25.138 08:15:37 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:25.138 08:15:37 -- nvmf/common.sh@47 -- # : 0 00:02:25.138 08:15:37 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:02:25.138 08:15:37 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:02:25.138 08:15:37 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:02:25.138 08:15:37 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:02:25.138 08:15:37 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:02:25.138 08:15:37 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:02:25.138 08:15:37 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:02:25.138 08:15:37 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:02:25.138 08:15:37 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:02:25.138 08:15:37 -- spdk/autotest.sh@32 -- # uname -s 00:02:25.138 08:15:37 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:02:25.138 08:15:37 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:02:25.138 08:15:37 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:02:25.138 08:15:37 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:02:25.138 08:15:37 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 
00:02:25.138 08:15:37 -- spdk/autotest.sh@44 -- # modprobe nbd 00:02:25.138 08:15:37 -- spdk/autotest.sh@46 -- # type -P udevadm 00:02:25.138 08:15:37 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:02:25.138 08:15:37 -- spdk/autotest.sh@48 -- # udevadm_pid=1266028 00:02:25.138 08:15:37 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:02:25.138 08:15:37 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:02:25.138 08:15:37 -- pm/common@17 -- # local monitor 00:02:25.138 08:15:37 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:25.138 08:15:37 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:25.138 08:15:37 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:25.138 08:15:37 -- pm/common@21 -- # date +%s 00:02:25.138 08:15:37 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:25.138 08:15:37 -- pm/common@21 -- # date +%s 00:02:25.138 08:15:37 -- pm/common@25 -- # sleep 1 00:02:25.138 08:15:37 -- pm/common@21 -- # date +%s 00:02:25.139 08:15:37 -- pm/common@21 -- # date +%s 00:02:25.139 08:15:37 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721715337 00:02:25.139 08:15:37 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721715337 00:02:25.139 08:15:37 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721715337 00:02:25.139 08:15:37 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p 
monitor.autotest.sh.1721715337 00:02:25.139 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721715337_collect-cpu-load.pm.log 00:02:25.139 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721715337_collect-vmstat.pm.log 00:02:25.139 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721715337_collect-cpu-temp.pm.log 00:02:25.139 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721715337_collect-bmc-pm.bmc.pm.log 00:02:26.075 08:15:38 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:02:26.075 08:15:38 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:02:26.075 08:15:38 -- common/autotest_common.sh@722 -- # xtrace_disable 00:02:26.075 08:15:38 -- common/autotest_common.sh@10 -- # set +x 00:02:26.075 08:15:38 -- spdk/autotest.sh@59 -- # create_test_list 00:02:26.075 08:15:38 -- common/autotest_common.sh@746 -- # xtrace_disable 00:02:26.075 08:15:38 -- common/autotest_common.sh@10 -- # set +x 00:02:26.075 08:15:38 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/autotest.sh 00:02:26.075 08:15:38 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:26.075 08:15:38 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:26.075 08:15:38 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:02:26.075 08:15:38 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:26.075 08:15:38 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:02:26.075 08:15:38 -- common/autotest_common.sh@1455 -- # uname 00:02:26.075 08:15:38 -- common/autotest_common.sh@1455 -- # '[' Linux = FreeBSD ']' 00:02:26.075 08:15:38 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 
00:02:26.075 08:15:38 -- common/autotest_common.sh@1475 -- # uname 00:02:26.075 08:15:38 -- common/autotest_common.sh@1475 -- # [[ Linux = FreeBSD ]] 00:02:26.075 08:15:38 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk 00:02:26.075 08:15:38 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=gcc 00:02:26.075 08:15:38 -- spdk/autotest.sh@72 -- # hash lcov 00:02:26.075 08:15:38 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:02:26.075 08:15:38 -- spdk/autotest.sh@80 -- # export 'LCOV_OPTS= 00:02:26.075 --rc lcov_branch_coverage=1 00:02:26.075 --rc lcov_function_coverage=1 00:02:26.075 --rc genhtml_branch_coverage=1 00:02:26.075 --rc genhtml_function_coverage=1 00:02:26.075 --rc genhtml_legend=1 00:02:26.075 --rc geninfo_all_blocks=1 00:02:26.075 ' 00:02:26.075 08:15:38 -- spdk/autotest.sh@80 -- # LCOV_OPTS=' 00:02:26.075 --rc lcov_branch_coverage=1 00:02:26.075 --rc lcov_function_coverage=1 00:02:26.075 --rc genhtml_branch_coverage=1 00:02:26.075 --rc genhtml_function_coverage=1 00:02:26.075 --rc genhtml_legend=1 00:02:26.075 --rc geninfo_all_blocks=1 00:02:26.075 ' 00:02:26.075 08:15:38 -- spdk/autotest.sh@81 -- # export 'LCOV=lcov 00:02:26.075 --rc lcov_branch_coverage=1 00:02:26.076 --rc lcov_function_coverage=1 00:02:26.076 --rc genhtml_branch_coverage=1 00:02:26.076 --rc genhtml_function_coverage=1 00:02:26.076 --rc genhtml_legend=1 00:02:26.076 --rc geninfo_all_blocks=1 00:02:26.076 --no-external' 00:02:26.076 08:15:38 -- spdk/autotest.sh@81 -- # LCOV='lcov 00:02:26.076 --rc lcov_branch_coverage=1 00:02:26.076 --rc lcov_function_coverage=1 00:02:26.076 --rc genhtml_branch_coverage=1 00:02:26.076 --rc genhtml_function_coverage=1 00:02:26.076 --rc genhtml_legend=1 00:02:26.076 --rc geninfo_all_blocks=1 00:02:26.076 --no-external' 00:02:26.076 08:15:38 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 
--no-external -v 00:02:26.076 lcov: LCOV version 1.14 00:02:26.334 08:15:38 -- spdk/autotest.sh@85 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -i -t Baseline -d /var/jenkins/workspace/crypto-phy-autotest/spdk -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info 00:02:36.331 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno:no functions found 00:02:36.331 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno 00:02:36.331 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno:no functions found 00:02:36.331 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno 00:02:36.331 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno:no functions found 00:02:36.331 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno 00:02:36.331 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno:no functions found 00:02:36.331 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno 00:02:36.331 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno:no functions found 00:02:36.331 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno 00:02:36.331 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno:no functions found 00:02:36.331 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno 00:02:36.331 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno:no functions found 00:02:36.331 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno 00:02:36.331 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno:no functions found 00:02:36.331 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno 00:02:36.331 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno:no functions found 00:02:36.331 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno 00:02:36.331 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno:no functions found 00:02:36.331 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno 00:02:36.331 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno:no functions found 00:02:36.331 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno 00:02:36.331 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno:no functions found 00:02:36.331 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno 00:02:36.331 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno:no functions found 00:02:36.331 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno 00:02:36.331 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno:no functions found 00:02:36.331 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno 00:02:36.331 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno:no functions found 00:02:36.331 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno 00:02:36.331 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno:no functions found 00:02:36.331 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno 00:02:36.331 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno:no functions found 00:02:36.331 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno 00:02:36.331 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno:no functions found 00:02:36.331 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno 00:02:36.331 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno:no functions found 00:02:36.331 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno 00:02:36.331 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno:no functions found 00:02:36.331 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno 00:02:36.331 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno:no functions found 00:02:36.331 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno 00:02:36.331 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno:no functions found 00:02:36.331 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno 00:02:36.331 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno:no functions found 00:02:36.331 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno 00:02:36.331 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno:no functions found 00:02:36.331 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno 00:02:36.331 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not 
produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did 
not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV 
did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/net.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/net.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno:no functions found 00:02:36.332 
geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno 00:02:36.332 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno:no functions found 00:02:36.332 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno 00:02:36.332 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno:no functions found 00:02:36.333 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno 00:02:36.333 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno:no functions found 00:02:36.333 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno 00:02:36.333 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno:no functions found 00:02:36.333 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno 00:02:36.333 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno:no functions found 00:02:36.333 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno 00:02:36.333 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno:no functions found 00:02:36.333 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno 00:02:36.333 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno:no functions found 00:02:36.333 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno 00:02:36.333 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno:no functions found 00:02:36.333 geninfo: WARNING: GCOV did not 
produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno 00:02:36.333 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno:no functions found 00:02:36.333 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno 00:02:36.333 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno:no functions found 00:02:36.333 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno 00:02:36.333 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno:no functions found 00:02:36.333 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno 00:02:36.333 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno:no functions found 00:02:36.333 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno 00:02:36.333 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno:no functions found 00:02:36.333 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno 00:02:36.333 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno:no functions found 00:02:36.333 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno 00:02:36.333 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno:no functions found 00:02:36.333 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno 00:02:36.333 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno:no functions found 00:02:36.333 geninfo: WARNING: 
GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno 00:02:36.333 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno:no functions found 00:02:36.333 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno 00:02:36.333 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno:no functions found 00:02:36.333 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno 00:02:36.333 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno:no functions found 00:02:36.333 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno 00:02:36.333 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno:no functions found 00:02:36.333 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno 00:02:44.456 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno:no functions found 00:02:44.456 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:02:51.028 08:16:02 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup 00:02:51.028 08:16:02 -- common/autotest_common.sh@722 -- # xtrace_disable 00:02:51.028 08:16:02 -- common/autotest_common.sh@10 -- # set +x 00:02:51.028 08:16:02 -- spdk/autotest.sh@91 -- # rm -f 00:02:51.028 08:16:02 -- spdk/autotest.sh@94 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:02:53.575 0000:60:00.0 (8086 0a54): Already using the nvme driver 00:02:53.575 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:02:53.575 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:02:53.575 0000:00:04.5 (8086 
2021): Already using the ioatdma driver 00:02:53.575 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:02:53.575 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:02:53.575 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:02:53.575 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:02:53.575 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:02:53.575 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:02:53.575 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:02:53.575 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:02:53.575 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:02:53.575 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:02:53.575 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:02:53.864 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:02:53.864 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:02:53.864 08:16:06 -- spdk/autotest.sh@96 -- # get_zoned_devs 00:02:53.864 08:16:06 -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:02:53.864 08:16:06 -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:02:53.864 08:16:06 -- common/autotest_common.sh@1670 -- # local nvme bdf 00:02:53.864 08:16:06 -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:02:53.864 08:16:06 -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:02:53.865 08:16:06 -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:02:53.865 08:16:06 -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:02:53.865 08:16:06 -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:02:53.865 08:16:06 -- spdk/autotest.sh@98 -- # (( 0 > 0 )) 00:02:53.865 08:16:06 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:02:53.865 08:16:06 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:02:53.865 08:16:06 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1 
00:02:53.865 08:16:06 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt 00:02:53.865 08:16:06 -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:02:53.865 No valid GPT data, bailing 00:02:53.865 08:16:06 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:02:53.865 08:16:06 -- scripts/common.sh@391 -- # pt= 00:02:53.865 08:16:06 -- scripts/common.sh@392 -- # return 1 00:02:53.865 08:16:06 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:02:53.865 1+0 records in 00:02:53.865 1+0 records out 00:02:53.865 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00637461 s, 164 MB/s 00:02:53.865 08:16:06 -- spdk/autotest.sh@118 -- # sync 00:02:53.865 08:16:06 -- spdk/autotest.sh@120 -- # xtrace_disable_per_cmd reap_spdk_processes 00:02:53.865 08:16:06 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:02:53.865 08:16:06 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:02:59.137 08:16:11 -- spdk/autotest.sh@124 -- # uname -s 00:02:59.137 08:16:11 -- spdk/autotest.sh@124 -- # '[' Linux = Linux ']' 00:02:59.137 08:16:11 -- spdk/autotest.sh@125 -- # run_test setup.sh /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:02:59.137 08:16:11 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:02:59.138 08:16:11 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:02:59.138 08:16:11 -- common/autotest_common.sh@10 -- # set +x 00:02:59.138 ************************************ 00:02:59.138 START TEST setup.sh 00:02:59.138 ************************************ 00:02:59.138 08:16:11 setup.sh -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:02:59.138 * Looking for test storage... 
00:02:59.138 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:02:59.138 08:16:11 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:02:59.138 08:16:11 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:02:59.138 08:16:11 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:02:59.138 08:16:11 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:02:59.138 08:16:11 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:02:59.138 08:16:11 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:02:59.138 ************************************ 00:02:59.138 START TEST acl 00:02:59.138 ************************************ 00:02:59.138 08:16:11 setup.sh.acl -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:02:59.138 * Looking for test storage... 00:02:59.138 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:02:59.138 08:16:11 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:02:59.138 08:16:11 setup.sh.acl -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:02:59.138 08:16:11 setup.sh.acl -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:02:59.138 08:16:11 setup.sh.acl -- common/autotest_common.sh@1670 -- # local nvme bdf 00:02:59.138 08:16:11 setup.sh.acl -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:02:59.138 08:16:11 setup.sh.acl -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:02:59.138 08:16:11 setup.sh.acl -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:02:59.138 08:16:11 setup.sh.acl -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:02:59.138 08:16:11 setup.sh.acl -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:02:59.138 08:16:11 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:02:59.138 08:16:11 setup.sh.acl -- 
setup/acl.sh@12 -- # declare -a devs 00:02:59.138 08:16:11 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:02:59.138 08:16:11 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:02:59.138 08:16:11 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:02:59.138 08:16:11 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:02:59.138 08:16:11 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:02.426 08:16:14 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:03:02.426 08:16:14 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:03:02.426 08:16:14 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:02.426 08:16:14 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:03:02.426 08:16:14 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:03:02.426 08:16:14 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:03:04.960 Hugepages 00:03:04.960 node hugesize free / total 00:03:04.960 08:16:17 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:04.960 08:16:17 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:04.960 08:16:17 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:04.960 08:16:17 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:04.960 08:16:17 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:04.960 08:16:17 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:04.960 08:16:17 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:04.960 08:16:17 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:04.960 08:16:17 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:04.960 00:03:04.960 Type BDF Vendor Device NUMA Driver Device Block devices 00:03:04.960 08:16:17 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:04.960 08:16:17 setup.sh.acl -- 
setup/acl.sh@19 -- # continue 00:03:04.960 08:16:17 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:04.960 08:16:17 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:03:04.960 08:16:17 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:04.960 08:16:17 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:04.960 08:16:17 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:04.961 
08:16:17 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:60:00.0 == *:*:*.* ]] 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\6\0\:\0\0\.\0* ]] 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:03:04.961 08:16:17 
setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:04.961 08:16:17 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:05.220 08:16:17 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:03:05.220 08:16:17 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:05.220 08:16:17 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:05.220 08:16:17 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:05.220 08:16:17 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:03:05.220 08:16:17 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:05.220 08:16:17 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:05.220 08:16:17 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:05.220 08:16:17 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:03:05.220 08:16:17 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:05.220 08:16:17 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:05.220 08:16:17 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:05.220 08:16:17 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:03:05.220 08:16:17 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:05.220 08:16:17 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:05.220 08:16:17 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:05.220 08:16:17 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:03:05.220 08:16:17 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:05.220 08:16:17 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:05.220 08:16:17 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:05.220 08:16:17 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:03:05.220 08:16:17 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 
00:03:05.220 08:16:17 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:05.220 08:16:17 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:05.220 08:16:17 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:03:05.220 ************************************ 00:03:05.220 START TEST denied 00:03:05.220 ************************************ 00:03:05.220 08:16:17 setup.sh.acl.denied -- common/autotest_common.sh@1123 -- # denied 00:03:05.220 08:16:17 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:60:00.0' 00:03:05.220 08:16:17 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:03:05.220 08:16:17 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:60:00.0' 00:03:05.220 08:16:17 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:03:05.220 08:16:17 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:03:08.509 0000:60:00.0 (8086 0a54): Skipping denied controller at 0000:60:00.0 00:03:08.509 08:16:20 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:60:00.0 00:03:08.509 08:16:20 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:03:08.509 08:16:20 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:03:08.510 08:16:20 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:60:00.0 ]] 00:03:08.510 08:16:20 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:60:00.0/driver 00:03:08.510 08:16:20 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:03:08.510 08:16:20 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:03:08.510 08:16:20 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset 00:03:08.510 08:16:20 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:08.510 08:16:20 setup.sh.acl.denied -- setup/common.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:12.712 00:03:12.712 real 0m6.807s 00:03:12.712 user 0m2.147s 00:03:12.712 sys 0m3.911s 00:03:12.712 08:16:24 setup.sh.acl.denied -- common/autotest_common.sh@1124 -- # xtrace_disable 00:03:12.712 08:16:24 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:03:12.712 ************************************ 00:03:12.712 END TEST denied 00:03:12.712 ************************************ 00:03:12.712 08:16:24 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0 00:03:12.712 08:16:24 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:03:12.712 08:16:24 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:12.712 08:16:24 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:12.712 08:16:24 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:03:12.712 ************************************ 00:03:12.712 START TEST allowed 00:03:12.712 ************************************ 00:03:12.712 08:16:24 setup.sh.acl.allowed -- common/autotest_common.sh@1123 -- # allowed 00:03:12.712 08:16:24 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:60:00.0 00:03:12.712 08:16:24 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:03:12.712 08:16:24 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:60:00.0 .*: nvme -> .*' 00:03:12.712 08:16:24 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:03:12.712 08:16:24 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:03:19.294 0000:60:00.0 (8086 0a54): nvme -> vfio-pci 00:03:19.294 08:16:30 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:03:19.294 08:16:30 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:03:19.294 08:16:30 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:03:19.294 08:16:30 setup.sh.acl.allowed -- setup/common.sh@9 -- 
# [[ reset == output ]] 00:03:19.294 08:16:30 setup.sh.acl.allowed -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:21.199 00:03:21.199 real 0m9.271s 00:03:21.199 user 0m2.129s 00:03:21.199 sys 0m3.837s 00:03:21.199 08:16:33 setup.sh.acl.allowed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:03:21.199 08:16:33 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:03:21.199 ************************************ 00:03:21.199 END TEST allowed 00:03:21.199 ************************************ 00:03:21.199 08:16:33 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0 00:03:21.199 00:03:21.199 real 0m22.293s 00:03:21.199 user 0m6.508s 00:03:21.199 sys 0m11.880s 00:03:21.460 08:16:33 setup.sh.acl -- common/autotest_common.sh@1124 -- # xtrace_disable 00:03:21.460 08:16:33 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:03:21.460 ************************************ 00:03:21.460 END TEST acl 00:03:21.460 ************************************ 00:03:21.460 08:16:33 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:03:21.460 08:16:33 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:03:21.460 08:16:33 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:21.460 08:16:33 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:21.460 08:16:33 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:03:21.460 ************************************ 00:03:21.460 START TEST hugepages 00:03:21.460 ************************************ 00:03:21.460 08:16:33 setup.sh.hugepages -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:03:21.460 * Looking for test storage... 
00:03:21.460 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:21.460 08:16:33 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:03:21.460 08:16:33 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:03:21.460 08:16:33 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:03:21.460 08:16:33 setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:03:21.460 08:16:33 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:03:21.460 08:16:33 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:03:21.460 08:16:33 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:03:21.460 08:16:33 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:03:21.460 08:16:33 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:03:21.460 08:16:33 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:03:21.460 08:16:33 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:21.460 08:16:33 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:21.460 08:16:33 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:21.460 08:16:33 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:03:21.460 08:16:33 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:21.460 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:21.460 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:21.460 08:16:33 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 59260676 kB' 'MemFree: 38074288 kB' 'MemAvailable: 43097872 kB' 'Buffers: 7316 kB' 'Cached: 11871692 kB' 'SwapCached: 0 kB' 'Active: 8875408 kB' 'Inactive: 4668692 kB' 'Active(anon): 8489984 kB' 'Inactive(anon): 0 kB' 'Active(file): 385424 kB' 'Inactive(file): 4668692 kB' 
'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 1668404 kB' 'Mapped: 198280 kB' 'Shmem: 6824892 kB' 'KReclaimable: 608780 kB' 'Slab: 1129052 kB' 'SReclaimable: 608780 kB' 'SUnreclaim: 520272 kB' 'KernelStack: 19952 kB' 'PageTables: 16048 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 35921788 kB' 'Committed_AS: 13448836 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216964 kB' 'VmallocChunk: 0 kB' 'Percpu: 87168 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 2356180 kB' 'DirectMap2M: 16197632 kB' 'DirectMap1G: 49283072 kB' 00:03:21.460 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:21.460 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:21.460 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:21.461 08:16:33 setup.sh.hugepages 
-- setup/common.sh@31 -- # read -r var val _ 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # continue 
00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 
-- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:21.461 08:16:33 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:21.461 
08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:21.461 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:21.462 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:21.462 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:21.462 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:21.462 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:21.462 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:21.462 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:21.462 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:21.462 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:21.462 08:16:33 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:21.462 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce 
== \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:21.462 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # continue [... setup/common.sh@31-32 read/continue scan: 20 more non-matching /proc/meminfo keys (WritebackTmp through HugePages_Surp) skipped ...] 00:03:21.462 08:16:33 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:21.462 08:16:33 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:03:21.462 08:16:33 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:03:21.462 08:16:33 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:03:21.462 08:16:33 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:03:21.462 08:16:33 setup.sh.hugepages -- setup/hugepages.sh@18 -- # 
global_huge_nr=/proc/sys/vm/nr_hugepages 00:03:21.462 08:16:33 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:03:21.462 08:16:33 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:03:21.462 08:16:33 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:03:21.462 08:16:33 setup.sh.hugepages -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:03:21.462 08:16:33 setup.sh.hugepages -- setup/hugepages.sh@207 -- # get_nodes 00:03:21.462 08:16:33 setup.sh.hugepages -- setup/hugepages.sh@27 -- # local node 00:03:21.462 08:16:33 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:21.462 08:16:33 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:21.462 08:16:33 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:21.463 08:16:33 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:21.463 08:16:33 setup.sh.hugepages -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:21.463 08:16:33 setup.sh.hugepages -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:21.463 08:16:33 setup.sh.hugepages -- setup/hugepages.sh@208 -- # clear_hp 00:03:21.463 08:16:33 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:03:21.463 08:16:33 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:21.463 08:16:33 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:21.463 08:16:33 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:21.463 08:16:33 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:21.463 08:16:33 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:21.463 08:16:33 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 
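The get_nodes/clear_hp trace above enumerates /sys/devices/system/node/node* and writes 0 to every per-node hugepage pool before the test allocates its own pages. A minimal standalone sketch of that clearing step follows; the ROOT parameter is a hypothetical addition so the sketch can be exercised against a fake sysfs tree, whereas the traced script writes to the real sysfs (and therefore needs root):

```shell
#!/usr/bin/env bash
# extglob must be enabled before the +([0-9]) pattern below is parsed.
shopt -s extglob nullglob

# clear_hp_sketch ROOT: zero nr_hugepages for every hugepage pool on every
# NUMA node under ROOT/sys. ROOT is a hypothetical knob for testability;
# the traced clear_hp walks the real /sys tree directly.
clear_hp_sketch() {
    local root=$1 node hp
    for node in "$root"/sys/devices/system/node/node+([0-9]); do
        for hp in "$node"/hugepages/hugepages-*; do
            echo 0 > "$hp/nr_hugepages"
        done
    done
    # Mirrors the trace: mark that pools were cleared for later steps.
    export CLEAR_HUGE=yes
}
```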
00:03:21.463 08:16:33 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:21.463 08:16:33 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:21.463 08:16:33 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:21.463 08:16:33 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:21.463 08:16:33 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:21.463 08:16:33 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:21.463 08:16:33 setup.sh.hugepages -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:03:21.463 08:16:33 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:21.463 08:16:33 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:21.463 08:16:33 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:21.463 ************************************ 00:03:21.463 START TEST default_setup 00:03:21.463 ************************************ 00:03:21.463 08:16:33 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1123 -- # default_setup 00:03:21.463 08:16:33 setup.sh.hugepages.default_setup -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:03:21.463 08:16:33 setup.sh.hugepages.default_setup -- setup/hugepages.sh@49 -- # local size=2097152 00:03:21.463 08:16:33 setup.sh.hugepages.default_setup -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:03:21.463 08:16:33 setup.sh.hugepages.default_setup -- setup/hugepages.sh@51 -- # shift 00:03:21.463 08:16:33 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # node_ids=('0') 00:03:21.463 08:16:33 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # local node_ids 00:03:21.463 08:16:33 setup.sh.hugepages.default_setup -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:21.463 08:16:33 
setup.sh.hugepages.default_setup -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:21.463 08:16:33 setup.sh.hugepages.default_setup -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:21.463 08:16:33 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:21.463 08:16:33 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # local user_nodes 00:03:21.463 08:16:33 setup.sh.hugepages.default_setup -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:21.463 08:16:33 setup.sh.hugepages.default_setup -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:21.463 08:16:33 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:21.463 08:16:33 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:21.463 08:16:33 setup.sh.hugepages.default_setup -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:03:21.463 08:16:33 setup.sh.hugepages.default_setup -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:21.463 08:16:33 setup.sh.hugepages.default_setup -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:03:21.463 08:16:33 setup.sh.hugepages.default_setup -- setup/hugepages.sh@73 -- # return 0 00:03:21.463 08:16:33 setup.sh.hugepages.default_setup -- setup/hugepages.sh@137 -- # setup output 00:03:21.463 08:16:33 setup.sh.hugepages.default_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:03:21.463 08:16:33 setup.sh.hugepages.default_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:24.752 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:24.752 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:24.752 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:24.752 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:24.752 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:24.752 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:24.752 0000:00:04.1 (8086 2021): ioatdma -> 
vfio-pci 00:03:24.752 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:24.752 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:24.752 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:24.752 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:24.752 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:24.752 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:24.752 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:24.752 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:24.752 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:28.046 0000:60:00.0 (8086 0a54): nvme -> vfio-pci 00:03:28.046 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:03:28.046 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@89 -- # local node 00:03:28.046 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@90 -- # local sorted_t 00:03:28.046 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@91 -- # local sorted_s 00:03:28.046 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@92 -- # local surp 00:03:28.046 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@93 -- # local resv 00:03:28.046 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@94 -- # local anon 00:03:28.046 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:28.046 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:28.046 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:28.046 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:03:28.046 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:28.046 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:28.046 08:16:40 setup.sh.hugepages.default_setup -- 
setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:28.046 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:28.046 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:28.046 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:28.046 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:28.046 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.046 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.046 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 59260676 kB' 'MemFree: 40227640 kB' 'MemAvailable: 45250400 kB' 'Buffers: 7316 kB' 'Cached: 11880020 kB' 'SwapCached: 0 kB' 'Active: 8901756 kB' 'Inactive: 4668692 kB' 'Active(anon): 8516332 kB' 'Inactive(anon): 0 kB' 'Active(file): 385424 kB' 'Inactive(file): 4668692 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 1686512 kB' 'Mapped: 198264 kB' 'Shmem: 6833220 kB' 'KReclaimable: 607956 kB' 'Slab: 1126760 kB' 'SReclaimable: 607956 kB' 'SUnreclaim: 518804 kB' 'KernelStack: 20160 kB' 'PageTables: 13836 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36970364 kB' 'Committed_AS: 11107116 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217204 kB' 'VmallocChunk: 0 kB' 'Percpu: 87168 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2356180 kB' 'DirectMap2M: 
16197632 kB' 'DirectMap1G: 49283072 kB' [... setup/common.sh@31-32 read/continue scan: 39 non-matching /proc/meminfo keys (MemTotal through Percpu) skipped while looking for AnonHugePages ...] 00:03:28.047 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 
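The long read/continue scan traced above is setup/common.sh's way of plucking a single key out of /proc/meminfo: split each line on ':' and spaces, skip until the key matches, echo the value, return. A self-contained sketch of that pattern follows; the optional FILE argument is a hypothetical addition for testability, since the traced helper always reads /proc/meminfo (or a per-node meminfo):

```shell
#!/usr/bin/env bash
# get_meminfo_sketch KEY [FILE]: echo the value of KEY from a meminfo-style
# file, mirroring the traced loop (IFS=': ' read -r var val _, continue on
# non-matching keys, echo and return 0 on the first match).
get_meminfo_sketch() {
    local get=$1 mem_f=${2:-/proc/meminfo} var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue   # skip non-matching keys
        echo "$val"                        # value only, unit discarded
        return 0
    done < "$mem_f"
    return 1                               # key not present
}
```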
00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # anon=0 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:28.048 08:16:40 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 59260676 kB' 'MemFree: 40223808 kB' 'MemAvailable: 45246568 kB' 'Buffers: 7316 kB' 'Cached: 11880020 kB' 'SwapCached: 0 kB' 'Active: 8903648 kB' 'Inactive: 4668692 kB' 'Active(anon): 8518224 kB' 'Inactive(anon): 0 kB' 'Active(file): 385424 kB' 'Inactive(file): 4668692 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 1688480 kB' 'Mapped: 198204 kB' 'Shmem: 6833220 kB' 'KReclaimable: 607956 kB' 'Slab: 1126760 kB' 'SReclaimable: 607956 kB' 'SUnreclaim: 518804 kB' 'KernelStack: 20336 kB' 'PageTables: 16932 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36970364 kB' 'Committed_AS: 12293040 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217252 kB' 'VmallocChunk: 0 kB' 'Percpu: 87168 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2356180 kB' 'DirectMap2M: 16197632 kB' 'DirectMap1G: 49283072 kB' 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 08:16:40 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # 
IFS=': ' 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.049 
08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 08:16:40 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 08:16:40 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 08:16:40 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.049 08:16:40 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 08:16:40 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ 
HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # surp=0 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 
00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 59260676 kB' 'MemFree: 40228408 kB' 'MemAvailable: 45251168 kB' 'Buffers: 7316 kB' 'Cached: 11880024 kB' 'SwapCached: 0 kB' 'Active: 8903684 kB' 'Inactive: 4668692 kB' 'Active(anon): 8518260 kB' 'Inactive(anon): 0 kB' 'Active(file): 385424 kB' 'Inactive(file): 4668692 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 1688452 kB' 'Mapped: 198272 kB' 'Shmem: 6833224 kB' 'KReclaimable: 607956 kB' 'Slab: 1126556 kB' 'SReclaimable: 607956 kB' 'SUnreclaim: 518600 kB' 'KernelStack: 20352 kB' 'PageTables: 14936 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36970364 kB' 'Committed_AS: 11108028 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217076 kB' 'VmallocChunk: 0 kB' 'Percpu: 87168 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2356180 kB' 'DirectMap2M: 16197632 kB' 'DirectMap1G: 49283072 kB' 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 
00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.050 08:16:40 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ 
Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _
[identical continue/IFS/read trace repeats for every remaining /proc/meminfo key until HugePages_Rsvd is reached]
00:03:28.052 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:28.052 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0
00:03:28.052 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0
00:03:28.052 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # resv=0
00:03:28.052 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:28.052 nr_hugepages=1024
00:03:28.052 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:28.052 resv_hugepages=0
00:03:28.052 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:28.052 surplus_hugepages=0
00:03:28.052 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:28.052 anon_hugepages=0
00:03:28.052 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:28.052 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:03:28.052 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:28.052 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:28.052 08:16:40 setup.sh.hugepages.default_setup --
setup/common.sh@18 -- # local node= 00:03:28.052 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:28.052 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:28.052 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:28.052 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:28.052 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:28.052 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:28.052 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:28.052 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 59260676 kB' 'MemFree: 40230260 kB' 'MemAvailable: 45253020 kB' 'Buffers: 7316 kB' 'Cached: 11880060 kB' 'SwapCached: 0 kB' 'Active: 8901532 kB' 'Inactive: 4668692 kB' 'Active(anon): 8516108 kB' 'Inactive(anon): 0 kB' 'Active(file): 385424 kB' 'Inactive(file): 4668692 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 1686120 kB' 'Mapped: 198196 kB' 'Shmem: 6833260 kB' 'KReclaimable: 607956 kB' 'Slab: 1126536 kB' 'SReclaimable: 607956 kB' 'SUnreclaim: 518580 kB' 'KernelStack: 20160 kB' 'PageTables: 13112 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36970364 kB' 'Committed_AS: 11108052 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217268 kB' 'VmallocChunk: 0 kB' 'Percpu: 87168 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 
'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2356180 kB' 'DirectMap2M: 16197632 kB' 'DirectMap1G: 49283072 kB' 00:03:28.052 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.052 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.052 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.052 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue
[identical continue/IFS/read trace repeats for every remaining /proc/meminfo key until HugePages_Total is reached]
00:03:28.054 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:28.054 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 1024
00:03:28.054 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0
00:03:28.054 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:28.054 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@112 -- # get_nodes
00:03:28.054 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@27 -- # local node
00:03:28.054 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:28.054 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:03:28.054 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:28.054 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:03:28.054 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:28.054 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:28.054 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:28.054 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:28.054 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # 
get_meminfo HugePages_Surp 0 00:03:28.054 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:28.054 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=0 00:03:28.054 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:28.054 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:28.054 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:28.054 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:28.054 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:28.054 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:28.054 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:28.054 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:28.054 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 31555380 kB' 'MemFree: 16477512 kB' 'MemUsed: 15077868 kB' 'SwapCached: 0 kB' 'Active: 6328184 kB' 'Inactive: 4345156 kB' 'Active(anon): 6115104 kB' 'Inactive(anon): 0 kB' 'Active(file): 213080 kB' 'Inactive(file): 4345156 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10007672 kB' 'Mapped: 124940 kB' 'AnonPages: 668908 kB' 'Shmem: 5449436 kB' 'KernelStack: 11768 kB' 'PageTables: 13828 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 234904 kB' 'Slab: 514504 kB' 'SReclaimable: 234904 kB' 'SUnreclaim: 279600 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 
'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:28.054 08:16:40 [xtrace trimmed: setup/common.sh@31-32 scans each node0 meminfo field above (MemTotal through HugePages_Free), issuing IFS=': ' / read -r var val _ / continue for each key that is not HugePages_Surp] 00:03:28.055 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.055 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:03:28.055 08:16:40 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:03:28.055 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:28.055 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:28.055 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:28.055 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:28.055 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 
00:03:28.055 node0=1024 expecting 1024 00:03:28.055 08:16:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:28.055 00:03:28.055 real 0m6.535s 00:03:28.055 user 0m1.403s 00:03:28.055 sys 0m2.025s 00:03:28.055 08:16:40 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1124 -- # xtrace_disable 00:03:28.055 08:16:40 setup.sh.hugepages.default_setup -- common/autotest_common.sh@10 -- # set +x 00:03:28.055 ************************************ 00:03:28.055 END TEST default_setup 00:03:28.055 ************************************ 00:03:28.055 08:16:40 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:03:28.055 08:16:40 setup.sh.hugepages -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc 00:03:28.055 08:16:40 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:28.055 08:16:40 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:28.055 08:16:40 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:28.315 ************************************ 00:03:28.315 START TEST per_node_1G_alloc 00:03:28.315 ************************************ 00:03:28.315 08:16:40 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1123 -- # per_node_1G_alloc 00:03:28.315 08:16:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@143 -- # local IFS=, 00:03:28.315 08:16:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1 00:03:28.315 08:16:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:03:28.315 08:16:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@50 -- # (( 3 > 1 )) 00:03:28.315 08:16:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@51 -- # shift 00:03:28.315 08:16:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # node_ids=('0' '1') 00:03:28.315 08:16:40 
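The long field-by-field xtrace runs above come from setup/common.sh's get_meminfo, which walks a meminfo-style file with IFS=': ' until the requested key matches. A minimal sketch of that scan, under a hypothetical helper name (`get_mem_field` is not the SPDK function itself):

```shell
# get_mem_field KEY [FILE] - print the value of KEY from a meminfo-style
# file (default /proc/meminfo). Mirrors the IFS=': ' / read -r var val _ /
# continue loop visible in the xtrace: non-matching keys are skipped, and
# the first matching key's value is echoed.
get_mem_field() {
    local get=$1 mem_f=${2:-/proc/meminfo} var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done < "$mem_f"
    return 1
}
```

For per-node queries the real script reads /sys/devices/system/node/nodeN/meminfo instead, whose lines carry a `Node N ` prefix that it strips before the same scan.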
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:03:28.315 08:16:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:28.315 08:16:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:03:28.315 08:16:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1 00:03:28.315 08:16:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0' '1') 00:03:28.315 08:16:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:28.315 08:16:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:03:28.315 08:16:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:28.315 08:16:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:28.315 08:16:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:28.315 08:16:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@69 -- # (( 2 > 0 )) 00:03:28.315 08:16:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:28.315 08:16:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:03:28.315 08:16:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:28.315 08:16:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:03:28.315 08:16:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@73 -- # return 0 00:03:28.315 08:16:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # NRHUGE=512 00:03:28.315 08:16:40 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # HUGENODE=0,1 00:03:28.315 08:16:40 
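The per_node_1G_alloc preamble above turns a 1048576 kB (1 GiB) request into 512 default-sized (2048 kB) hugepages on each node listed in HUGENODE=0,1, for 1024 pages total. A rough sketch of that arithmetic, with a hypothetical function name (the real logic is get_test_nr_hugepages_per_node in setup/hugepages.sh):

```shell
# per_node_pages SIZE_KB HUGEPAGESIZE_KB NODE... - compute how many
# default-sized hugepages each listed node should receive, as in the
# trace: 1048576 kB / 2048 kB per page = 512 pages on node 0 and node 1.
per_node_pages() {
    local size_kb=$1 hugepagesize_kb=$2; shift 2
    local -a nodes_test=()
    local node pages=$(( size_kb / hugepagesize_kb ))
    for node in "$@"; do
        nodes_test[node]=$pages   # one per-node entry, like nodes_test[]
    done
    echo "${nodes_test[@]}"
}
```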
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # setup output 00:03:28.315 08:16:40 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:28.315 08:16:40 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:31.627 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:31.628 0000:60:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:31.628 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:31.628 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:31.628 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:31.628 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:31.628 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:31.628 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:31.628 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:31.628 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:31.628 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:31.628 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:31.628 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:31.628 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:31.628 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:31.628 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:31.628 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:31.628 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # nr_hugepages=1024 00:03:31.628 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # verify_nr_hugepages 00:03:31.628 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@89 -- # local node 00:03:31.628 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@90 -- # local 
sorted_t 00:03:31.628 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:03:31.628 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@92 -- # local surp 00:03:31.628 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@93 -- # local resv 00:03:31.628 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@94 -- # local anon 00:03:31.628 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:31.628 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:31.628 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:31.628 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:03:31.628 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:31.628 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:31.628 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:31.628 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:31.628 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:31.628 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:31.628 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:31.628 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.628 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.628 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 59260676 kB' 'MemFree: 40248760 kB' 'MemAvailable: 
45271520 kB' 'Buffers: 7316 kB' 'Cached: 11880160 kB' 'SwapCached: 0 kB' 'Active: 8899340 kB' 'Inactive: 4668692 kB' 'Active(anon): 8513916 kB' 'Inactive(anon): 0 kB' 'Active(file): 385424 kB' 'Inactive(file): 4668692 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 1683360 kB' 'Mapped: 197460 kB' 'Shmem: 6833360 kB' 'KReclaimable: 607956 kB' 'Slab: 1125672 kB' 'SReclaimable: 607956 kB' 'SUnreclaim: 517716 kB' 'KernelStack: 19760 kB' 'PageTables: 14324 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36970364 kB' 'Committed_AS: 12279164 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216868 kB' 'VmallocChunk: 0 kB' 'Percpu: 87168 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2356180 kB' 'DirectMap2M: 16197632 kB' 'DirectMap1G: 49283072 kB' 00:03:31.628 08:16:43 [xtrace trimmed: setup/common.sh@31-32 compares each field above (MemTotal, MemFree, MemAvailable, Buffers, Cached, SwapCached, Active, Inactive, Active(anon), Inactive(anon), Active(file), Inactive(file), Unevictable, Mlocked, SwapTotal, SwapFree, ...) against AnonHugePages, continuing past each non-matching key; the scan resumes below] 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.629 08:16:43 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.629 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:31.629 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.629 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.629 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.629 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:31.629 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.629 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.629 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.629 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:31.629 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.629 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.629 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.629 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # 
continue 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.630 08:16:43 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.630 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc 
-- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.631 08:16:43 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 59260676 kB' 'MemFree: 40251312 kB' 'MemAvailable: 45274072 kB' 'Buffers: 7316 kB' 'Cached: 11880164 kB' 'SwapCached: 0 kB' 'Active: 8900620 kB' 'Inactive: 4668692 kB' 'Active(anon): 8515196 kB' 'Inactive(anon): 0 kB' 'Active(file): 385424 kB' 'Inactive(file): 4668692 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 1684688 kB' 'Mapped: 197540 kB' 'Shmem: 6833364 kB' 'KReclaimable: 607956 kB' 'Slab: 1125712 kB' 'SReclaimable: 607956 kB' 'SUnreclaim: 517756 kB' 'KernelStack: 19808 kB' 'PageTables: 13560 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36970364 kB' 'Committed_AS: 11094648 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216836 kB' 'VmallocChunk: 0 kB' 'Percpu: 87168 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2356180 kB' 'DirectMap2M: 16197632 kB' 'DirectMap1G: 49283072 kB' 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.631 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 
00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- 
# [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.632 08:16:43 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 
-- # continue 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.632 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.633 
08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.633 08:16:43 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.633 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.634 08:16:43 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # 
get_meminfo HugePages_Rsvd 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 59260676 kB' 'MemFree: 40248800 kB' 'MemAvailable: 45271560 kB' 'Buffers: 7316 kB' 'Cached: 11880180 kB' 'SwapCached: 0 kB' 'Active: 8899888 kB' 'Inactive: 4668692 kB' 'Active(anon): 8514464 kB' 'Inactive(anon): 0 kB' 'Active(file): 385424 kB' 'Inactive(file): 4668692 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 1684392 kB' 'Mapped: 197448 kB' 'Shmem: 6833380 kB' 'KReclaimable: 607956 kB' 'Slab: 1125708 kB' 'SReclaimable: 607956 kB' 'SUnreclaim: 517752 kB' 'KernelStack: 19760 kB' 'PageTables: 12804 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 
'WritebackTmp: 0 kB' 'CommitLimit: 36970364 kB' 'Committed_AS: 12279204 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216852 kB' 'VmallocChunk: 0 kB' 'Percpu: 87168 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2356180 kB' 'DirectMap2M: 16197632 kB' 'DirectMap1G: 49283072 kB' 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.634 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.635 08:16:43 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.635 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.636 
08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.636 08:16:43 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.636 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.637 08:16:43 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val 
_ 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.637 08:16:43 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:31.637 nr_hugepages=1024 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:31.637 resv_hugepages=0 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:31.637 surplus_hugepages=0 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:31.637 anon_hugepages=0 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:31.637 08:16:43 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 59260676 kB' 'MemFree: 40248044 kB' 'MemAvailable: 45270804 kB' 'Buffers: 7316 kB' 'Cached: 11880204 kB' 'SwapCached: 0 kB' 'Active: 8900460 kB' 'Inactive: 4668692 kB' 'Active(anon): 8515036 kB' 'Inactive(anon): 0 kB' 'Active(file): 385424 kB' 'Inactive(file): 4668692 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 1684976 kB' 'Mapped: 197448 kB' 'Shmem: 6833404 kB' 'KReclaimable: 607956 kB' 'Slab: 1125708 kB' 'SReclaimable: 607956 kB' 'SUnreclaim: 517752 kB' 'KernelStack: 19792 kB' 'PageTables: 17196 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36970364 kB' 'Committed_AS: 13464008 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216868 kB' 'VmallocChunk: 0 kB' 'Percpu: 87168 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 
kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2356180 kB' 'DirectMap2M: 16197632 kB' 'DirectMap1G: 49283072 kB' 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.637 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.638 08:16:43 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.638 08:16:43 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.638 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 
00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.639 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.640 08:16:43 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 1024 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@27 -- # local node 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- 
setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:31.640 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=0 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 31555380 kB' 'MemFree: 17517856 kB' 'MemUsed: 14037524 kB' 'SwapCached: 0 kB' 'Active: 6324112 kB' 'Inactive: 4345156 kB' 'Active(anon): 6111032 kB' 'Inactive(anon): 0 kB' 'Active(file): 213080 kB' 'Inactive(file): 4345156 kB' 'Unevictable: 3072 kB' 
'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10007676 kB' 'Mapped: 124512 kB' 'AnonPages: 664824 kB' 'Shmem: 5449440 kB' 'KernelStack: 11032 kB' 'PageTables: 10116 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 234904 kB' 'Slab: 513820 kB' 'SReclaimable: 234904 kB' 'SUnreclaim: 278916 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 
00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 
-- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.641 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.642 08:16:43 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.642 08:16:43 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.642 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p 
]] 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=1 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 
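The xtrace records above show `setup/common.sh`'s `get_meminfo` walking a per-node meminfo file field by field until it hits the requested key (`HugePages_Surp`). A minimal sketch of that pattern, reconstructed from the trace rather than copied from the actual script (function body and variable handling here are assumptions):

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo pattern visible in the trace: read one field
# from /proc/meminfo, or from a per-node meminfo file when a node is given.
shopt -s extglob   # needed for the "Node N " prefix-stripping pattern below

get_meminfo() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo var val _
    # Per-node files prefix every line with "Node N "; prefer them if present.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local -a mem
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")   # strip the per-node prefix, if any
    local line
    for line in "${mem[@]}"; do
        # Split "Key:   value kB" on colon/space runs, keep key and value.
        IFS=': ' read -r var val _ <<< "$line"
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done
    return 1
}
```

This mirrors why the trace repeats `IFS=': '`, `read -r var val _`, and `continue` once per meminfo field: each non-matching key is skipped until `HugePages_Surp` matches and its value (`0` here) is echoed.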
00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27705296 kB' 'MemFree: 22732588 kB' 'MemUsed: 4972708 kB' 'SwapCached: 0 kB' 'Active: 2576452 kB' 'Inactive: 323536 kB' 'Active(anon): 2404108 kB' 'Inactive(anon): 0 kB' 'Active(file): 172344 kB' 'Inactive(file): 323536 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1879884 kB' 'Mapped: 72936 kB' 'AnonPages: 1020204 kB' 'Shmem: 1384004 kB' 'KernelStack: 8760 kB' 'PageTables: 5396 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 373052 kB' 'Slab: 611888 kB' 'SReclaimable: 373052 kB' 'SUnreclaim: 238836 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.643 08:16:43 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.643 08:16:43 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.643 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.644 08:16:43 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.644 08:16:43 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:31.644 node0=512 expecting 512 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:31.644 
08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:03:31.644 node1=512 expecting 512 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:03:31.644 00:03:31.644 real 0m3.357s 00:03:31.644 user 0m1.414s 00:03:31.644 sys 0m2.015s 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:03:31.644 08:16:43 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@10 -- # set +x 00:03:31.644 ************************************ 00:03:31.644 END TEST per_node_1G_alloc 00:03:31.644 ************************************ 00:03:31.644 08:16:43 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:03:31.644 08:16:43 setup.sh.hugepages -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc 00:03:31.644 08:16:43 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:31.644 08:16:43 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:31.644 08:16:43 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:31.644 ************************************ 00:03:31.644 START TEST even_2G_alloc 00:03:31.644 ************************************ 00:03:31.644 08:16:43 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1123 -- # even_2G_alloc 00:03:31.644 08:16:43 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:03:31.644 08:16:43 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:03:31.644 08:16:43 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:31.644 08:16:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:31.644 08:16:44 
setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:31.644 08:16:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:31.644 08:16:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:31.644 08:16:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:31.644 08:16:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:31.644 08:16:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:31.644 08:16:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:31.644 08:16:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:31.644 08:16:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:31.644 08:16:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:03:31.644 08:16:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:31.644 08:16:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:03:31.644 08:16:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 512 00:03:31.644 08:16:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 1 00:03:31.644 08:16:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:31.644 08:16:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:03:31.644 08:16:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0 00:03:31.644 08:16:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 0 00:03:31.644 08:16:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:31.644 08:16:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # 
NRHUGE=1024
00:03:31.644 08:16:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes
00:03:31.644 08:16:44 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # setup output
00:03:31.644 08:16:44 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:03:31.644 08:16:44 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh
00:03:34.943 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:34.943 0000:60:00.0 (8086 0a54): Already using the vfio-pci driver
00:03:34.943 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:34.943 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:34.943 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:34.943 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:34.943 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:34.943 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:34.943 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:34.943 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:34.943 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:34.943 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:34.943 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:34.943 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:34.943 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:34.943 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:34.943 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:34.943 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@154 -- # verify_nr_hugepages
00:03:34.943 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local node
00:03:34.943 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_t
00:03:34.943 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local sorted_s
00:03:34.943 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local surp
00:03:34.943 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local resv
00:03:34.943 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@94 -- # local anon
00:03:34.943 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:34.943 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:34.943 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:34.943 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:03:34.943 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:03:34.944 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:03:34.944 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:34.944 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:34.944 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:34.944 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:03:34.944 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:34.944 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:34.944 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:34.944 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 59260676 kB' 'MemFree: 40252424 kB' 'MemAvailable: 45275184 kB' 'Buffers: 7316 kB' 'Cached: 11880320 kB' 'SwapCached: 0 kB' 'Active: 8901856 kB' 'Inactive: 4668692 kB' 'Active(anon): 8516432 kB' 'Inactive(anon): 0 kB' 'Active(file): 385424 kB' 'Inactive(file): 4668692 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 1685704 kB' 'Mapped: 197464 kB' 'Shmem: 6833520 kB' 'KReclaimable: 607956 kB' 'Slab: 1126664 kB' 'SReclaimable: 607956 kB' 'SUnreclaim: 518708 kB' 'KernelStack: 19936 kB' 'PageTables: 18200 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36970364 kB' 'Committed_AS: 13464892 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216964 kB' 'VmallocChunk: 0 kB' 'Percpu: 87168 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2356180 kB' 'DirectMap2M: 16197632 kB' 'DirectMap1G: 49283072 kB'
00:03:34.944 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:34.944 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue
[identical common.sh@31/@32 "[[ <key> == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]" / "continue" trace entries repeat verbatim for every remaining /proc/meminfo key, MemFree through HardwareCorrupted]
00:03:34.945 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:34.945 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:03:34.945 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:03:34.945 08:16:47 
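The get_meminfo trace above amounts to a small /proc/meminfo parser: split each "Key: value [unit]" line on `': '` and print the value for the requested key (here AnonHugePages, yielding the `echo 0` / `return 0` seen in the trace). A minimal standalone sketch of that pattern, with illustrative names and a snapshot taken from this log instead of the live /proc/meminfo, so it is not SPDK's exact setup/common.sh code:

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo pattern: read "Key: value [unit]" lines,
# splitting on ':' and spaces, and emit the value for one key.
# Reads stdin instead of /proc/meminfo so it runs anywhere.
get_meminfo() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] && { printf '%s\n' "$val"; return 0; }
    done
    return 1
}

# A few fields as reported in the snapshot above
sample='MemTotal: 59260676 kB
AnonHugePages: 0 kB
HugePages_Total: 1024
HugePages_Surp: 0'

get_meminfo AnonHugePages <<<"$sample"    # prints 0, matching the trace
get_meminfo HugePages_Total <<<"$sample"  # prints 1024
```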
setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # anon=0
00:03:34.945 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:34.945 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:34.945 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:03:34.945 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:03:34.945 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:03:34.945 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:34.945 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:34.945 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:34.945 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:03:34.945 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:34.945 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:34.945 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:34.945 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 59260676 kB' 'MemFree: 40254848 kB' 'MemAvailable: 45277608 kB' 'Buffers: 7316 kB' 'Cached: 11880324 kB' 'SwapCached: 0 kB' 'Active: 8901020 kB' 'Inactive: 4668692 kB' 'Active(anon): 8515596 kB' 'Inactive(anon): 0 kB' 'Active(file): 385424 kB' 'Inactive(file): 4668692 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 1685408 kB' 'Mapped: 197460 kB' 'Shmem: 6833524 kB' 'KReclaimable: 607956 kB' 'Slab: 1126788 kB' 'SReclaimable: 607956 kB' 'SUnreclaim: 518832 kB' 'KernelStack: 19856 kB' 'PageTables: 15912 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36970364 kB' 'Committed_AS: 12280136 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216916 kB' 'VmallocChunk: 0 kB' 'Percpu: 87168 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2356180 kB' 'DirectMap2M: 16197632 kB' 'DirectMap1G: 49283072 kB'
00:03:34.945 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:34.945 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue
[identical common.sh@31/@32 "[[ <key> == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]" / "continue" trace entries repeat verbatim for the subsequent /proc/meminfo keys; this excerpt breaks off partway through the scan, after the ShmemHugePages check]
00:03:34.947 08:16:47 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.947 08:16:47 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:34.947 08:16:47 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 59260676 kB' 'MemFree: 40254972 kB' 'MemAvailable: 45277732 kB' 'Buffers: 7316 kB' 'Cached: 11880340 kB' 'SwapCached: 0 kB' 'Active: 8901016 kB' 'Inactive: 4668692 kB' 'Active(anon): 8515592 kB' 'Inactive(anon): 0 kB' 'Active(file): 385424 kB' 'Inactive(file): 4668692 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 
'Writeback: 0 kB' 'AnonPages: 1685416 kB' 'Mapped: 197460 kB' 'Shmem: 6833540 kB' 'KReclaimable: 607956 kB' 'Slab: 1126788 kB' 'SReclaimable: 607956 kB' 'SUnreclaim: 518832 kB' 'KernelStack: 19856 kB' 'PageTables: 13596 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36970364 kB' 'Committed_AS: 11095372 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216900 kB' 'VmallocChunk: 0 kB' 'Percpu: 87168 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2356180 kB' 'DirectMap2M: 16197632 kB' 'DirectMap1G: 49283072 kB' 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:03:34.947 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.948 08:16:47 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.948 08:16:47 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.948 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.949 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.949 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.949 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.949 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.949 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.949 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.949 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.949 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.949 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.949 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.949 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.949 08:16:47 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.949 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.949 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.949 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.949 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.949 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.949 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.949 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.949 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.949 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.949 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.949 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.950 
08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- 
# resv=0 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:34.950 nr_hugepages=1024 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:34.950 resv_hugepages=0 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:34.950 surplus_hugepages=0 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:34.950 anon_hugepages=0 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:34.950 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.951 08:16:47 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 59260676 kB' 'MemFree: 40253712 kB' 'MemAvailable: 45276472 kB' 'Buffers: 7316 kB' 'Cached: 11880364 kB' 'SwapCached: 0 kB' 'Active: 8900060 kB' 'Inactive: 4668692 kB' 'Active(anon): 8514636 kB' 'Inactive(anon): 0 kB' 'Active(file): 385424 kB' 'Inactive(file): 4668692 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 1684400 kB' 'Mapped: 197460 kB' 'Shmem: 6833564 kB' 'KReclaimable: 607956 kB' 'Slab: 1126788 kB' 'SReclaimable: 607956 kB' 'SUnreclaim: 518832 kB' 'KernelStack: 19808 kB' 'PageTables: 12856 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36970364 kB' 'Committed_AS: 12279932 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216916 kB' 'VmallocChunk: 0 kB' 'Percpu: 87168 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2356180 kB' 'DirectMap2M: 16197632 kB' 'DirectMap1G: 49283072 kB' 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.951 08:16:47 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.951 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.952 08:16:47 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- 
# continue 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:03:34.952 
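The long `[[ field == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] / continue` runs above are one iteration per meminfo line: the script splits each line on `": "` with `IFS` and `read -r var val _`, skipping fields until the requested one matches. A self-contained sketch of that pattern (the `get_field` helper name is hypothetical; the real script's `get_meminfo` reads `/proc/meminfo` directly):

```shell
#!/usr/bin/env bash
# Scan meminfo-style input line by line, splitting "Field: value unit" on
# ": ", and print the value of the requested field -- the same loop the
# trace shows unrolled for every field in /proc/meminfo.
get_field() {
  local get=$1 var val _
  while IFS=': ' read -r var val _; do
    if [[ $var == "$get" ]]; then
      echo "$val"
      return 0
    fi
  done
  return 1
}

# Feed a meminfo-style snippet; on a live system this would be /proc/meminfo.
printf '%s\n' 'MemTotal: 59260676 kB' 'HugePages_Total: 1024' |
  get_field HugePages_Total
# prints: 1024
```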
08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@27 -- # local node 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:34.952 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:34.953 
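The `get_nodes` trace above iterates `/sys/devices/system/node/node+([0-9])` and indexes `nodes_sys` with `${node##*node}`, which strips everything through the last `node` in the path to leave the bare NUMA node number; `get_meminfo` then switches `mem_f` to the per-node meminfo file when one exists. A sketch of those two steps (array name and the 512-pages-per-node value taken from the trace; paths are only probed, not required to exist):

```shell
#!/usr/bin/env bash
# Derive NUMA node indices from sysfs paths and pick the per-node meminfo
# file when a node is specified, as in the get_nodes/get_meminfo trace.
shopt -s extglob nullglob

declare -A nodes_sys
for node in /sys/devices/system/node/node+([0-9]); do
  nodes_sys[${node##*node}]=512   # 512 x 2 MiB pages per node, per the trace
done

# ${node##*node} strips the longest prefix ending in "node", leaving the index.
node=/sys/devices/system/node/node0
echo "${node##*node}"   # prints: 0

# Prefer the per-node meminfo file; fall back to the system-wide one.
mem_f=/proc/meminfo
[[ -e /sys/devices/system/node/node0/meminfo ]] &&
  mem_f=/sys/devices/system/node/node0/meminfo
```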
08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 31555380 kB' 'MemFree: 17531380 kB' 'MemUsed: 14024000 kB' 'SwapCached: 0 kB' 'Active: 6323132 kB' 'Inactive: 4345156 kB' 'Active(anon): 6110052 kB' 'Inactive(anon): 0 kB' 'Active(file): 213080 kB' 'Inactive(file): 4345156 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10007676 kB' 'Mapped: 124516 kB' 'AnonPages: 663788 kB' 'Shmem: 5449440 kB' 'KernelStack: 11032 kB' 'PageTables: 9420 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 234904 kB' 'Slab: 514480 kB' 'SReclaimable: 234904 kB' 'SUnreclaim: 279576 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.953 
08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.953 08:16:47 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:34.953 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.953 08:16:47
[xtrace elided: the same IFS=': ' / read -r var val _ / [[ <key> == HugePages_Surp ]] / continue cycle repeats for every remaining node0 meminfo key (Writeback .. HugePages_Free), timestamps 00:03:34.953-00:03:34.954]
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.954 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.954 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:34.954 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:03:34.954 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:03:34.954 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:34.954 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:34.954 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:34.954 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:34.954 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:34.954 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:03:34.954 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:03:34.954 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:34.954 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:34.954 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:34.954 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:34.954 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:34.954 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:34.954 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:03:34.954 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.954 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27705296 kB' 'MemFree: 22719056 kB' 'MemUsed: 4986240 kB' 'SwapCached: 0 kB' 'Active: 2577432 kB' 'Inactive: 323536 kB' 'Active(anon): 2405088 kB' 'Inactive(anon): 0 kB' 'Active(file): 172344 kB' 'Inactive(file): 323536 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1880004 kB' 'Mapped: 72944 kB' 'AnonPages: 1021116 kB' 'Shmem: 1384124 kB' 'KernelStack: 8792 kB' 'PageTables: 5452 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 373052 kB' 'Slab: 612308 kB' 'SReclaimable: 373052 kB' 'SUnreclaim: 239256 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:34.954 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:34.954 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.954 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.954 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.954 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:34.954 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.954 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:34.954 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:34.954 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:34.954 08:16:47 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:34.954 08:16:47
[xtrace elided: the same IFS=': ' / read -r var val _ / [[ <key> == HugePages_Surp ]] / continue cycle repeats for every remaining node1 meminfo key (SwapCached .. HugePages_Free), timestamps 00:03:34.954-00:03:34.956]
08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:34.956 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:03:34.956 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:03:34.956 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:34.956 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:34.956 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:34.956 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:34.956 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:34.956 node0=512 expecting 512 00:03:34.956 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:34.956 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:34.956 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:34.956 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:03:34.956 node1=512 expecting 512 00:03:34.956 08:16:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:03:34.956 00:03:34.956 real 0m3.341s 00:03:34.956 user 0m1.411s 00:03:34.956 sys 0m2.004s 00:03:34.956 08:16:47 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:03:34.956 08:16:47 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x 00:03:34.956 ************************************ 00:03:34.956 END TEST even_2G_alloc 00:03:34.956 ************************************ 00:03:34.956 
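The loop traced above is setup/common.sh's get_meminfo helper: it mapfile-reads a (global or per-node) meminfo file, strips the "Node <id> " prefix that per-node files carry, then scans each "key: value" line with IFS=': ' until the requested key matches. A minimal sketch of that technique, modeled on the traced commands — the sample file and its values below are illustrative, not taken from the real run:

```shell
#!/usr/bin/env bash
# extglob must be on before the function is parsed, for the +([0-9]) pattern.
shopt -s extglob

# Look up one key in a meminfo-style file; echoes the value, as
# setup/common.sh@17-33 does in the trace above.
get_meminfo() {
    local get=$1 mem_f=$2 var val _ line
    local -a mem
    mapfile -t mem < "$mem_f"
    # Per-node files prefix each line with "Node <id> "; strip it,
    # mirroring mem=("${mem[@]#Node +([0-9]) }") in common.sh.
    mem=("${mem[@]#Node +([0-9]) }")
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] || continue   # skip keys until the match
        echo "$val"
        return 0
    done
    return 1
}

# Illustrative sample mimicking /sys/devices/system/node/node1/meminfo
sample=$(mktemp)
printf '%s\n' 'Node 1 HugePages_Total: 512' \
              'Node 1 HugePages_Free: 512' \
              'Node 1 HugePages_Surp: 0' > "$sample"
get_meminfo HugePages_Surp "$sample"   # prints 0
rm -f "$sample"
```

The IFS=': ' split treats ':' as a delimiter and absorbs the following space, so "HugePages_Surp: 0" lands in var/val exactly as the trace shows; every non-matching key produces one `continue` iteration, which is why the log repeats that cycle for each meminfo line.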
08:16:47 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:03:34.956 08:16:47 setup.sh.hugepages -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:03:34.956 08:16:47 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:34.956 08:16:47 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:34.956 08:16:47 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:34.956 ************************************ 00:03:34.956 START TEST odd_alloc 00:03:34.956 ************************************ 00:03:34.956 08:16:47 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1123 -- # odd_alloc 00:03:34.956 08:16:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:03:34.956 08:16:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # local size=2098176 00:03:34.956 08:16:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:34.956 08:16:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:34.956 08:16:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:03:34.956 08:16:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:34.956 08:16:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:34.956 08:16:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:34.956 08:16:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:03:34.956 08:16:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:34.956 08:16:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:34.956 08:16:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:34.956 08:16:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 
)) 00:03:34.956 08:16:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:03:34.956 08:16:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:34.956 08:16:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:03:34.956 08:16:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 513 00:03:34.956 08:16:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 1 00:03:34.956 08:16:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:34.956 08:16:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:03:34.956 08:16:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0 00:03:34.956 08:16:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 0 00:03:34.956 08:16:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:34.956 08:16:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:03:34.956 08:16:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:03:34.956 08:16:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # setup output 00:03:34.956 08:16:47 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:34.956 08:16:47 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:38.258 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:38.258 0000:60:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:38.258 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:38.258 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:38.258 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:38.258 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:38.258 0000:00:04.2 (8086 2021): Already using the 
vfio-pci driver 00:03:38.258 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:38.258 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:38.258 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:38.258 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:38.258 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:38.258 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:38.258 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:38.258 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:38.258 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:38.258 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local node 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local surp 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local resv 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@94 -- # local anon 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@20 -- # local mem_f mem 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 59260676 kB' 'MemFree: 40211064 kB' 'MemAvailable: 45233824 kB' 'Buffers: 7316 kB' 'Cached: 11880476 kB' 'SwapCached: 0 kB' 'Active: 8901940 kB' 'Inactive: 4668692 kB' 'Active(anon): 8516516 kB' 'Inactive(anon): 0 kB' 'Active(file): 385424 kB' 'Inactive(file): 4668692 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 1685796 kB' 'Mapped: 197596 kB' 'Shmem: 6833676 kB' 'KReclaimable: 607956 kB' 'Slab: 1126296 kB' 'SReclaimable: 607956 kB' 'SUnreclaim: 518340 kB' 'KernelStack: 19872 kB' 'PageTables: 16880 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36969340 kB' 'Committed_AS: 13465320 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216964 kB' 'VmallocChunk: 0 kB' 'Percpu: 87168 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 
kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 2356180 kB' 'DirectMap2M: 16197632 kB' 'DirectMap1G: 49283072 kB' 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.258 08:16:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.258 08:16:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.258 08:16:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.258 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.259 
08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.259 08:16:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.259 08:16:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # anon=0 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.259 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 59260676 kB' 'MemFree: 40211512 kB' 'MemAvailable: 45234272 kB' 'Buffers: 7316 kB' 'Cached: 11880480 kB' 'SwapCached: 0 kB' 'Active: 8902536 kB' 'Inactive: 4668692 kB' 'Active(anon): 8517112 kB' 'Inactive(anon): 0 kB' 'Active(file): 385424 kB' 'Inactive(file): 4668692 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 
'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 1686484 kB' 'Mapped: 197592 kB' 'Shmem: 6833680 kB' 'KReclaimable: 607956 kB' 'Slab: 1126352 kB' 'SReclaimable: 607956 kB' 'SUnreclaim: 518396 kB' 'KernelStack: 19872 kB' 'PageTables: 15956 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36969340 kB' 'Committed_AS: 12280808 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216900 kB' 'VmallocChunk: 0 kB' 'Percpu: 87168 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 2356180 kB' 'DirectMap2M: 16197632 kB' 'DirectMap1G: 49283072 kB' 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.260 
08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.260 08:16:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.260 08:16:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.260 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.261 08:16:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.261 08:16:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.261 08:16:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.261 08:16:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # surp=0 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:03:38.261 08:16:50 
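The long runs of `IFS=': '` / `read -r var val _` / `continue` entries in this trace all come from one helper: `get_meminfo` in setup/common.sh walks /proc/meminfo line by line until the requested key matches, then echoes its value. A minimal, self-contained sketch of that pattern (an assumption reconstructed from the xtrace output above, not the actual SPDK source; the `mem_f` file argument is added here for illustration):

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo scan visible in this trace (reconstructed,
# not the real setup/common.sh). Splits each "Key: value [kB]" line on
# ': ' and prints the value for the requested key.
get_meminfo() {
    local get=$1 mem_f=${2:-/proc/meminfo}
    local var val _
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "$val"   # mirrors the trace's "echo 0" on a key hit
            return 0
        fi
    done <"$mem_f"
    return 1
}
```

On the rig traced here, `get_meminfo HugePages_Surp` would print 0. The per-node variant in the trace additionally strips a `Node <n>` prefix (the `mem=("${mem[@]#Node +([0-9]) }")` expansion) when reading /sys/devices/system/node/node<N>/meminfo instead of the global file.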
setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 59260676 kB' 'MemFree: 40212124 kB' 'MemAvailable: 45234884 kB' 'Buffers: 7316 kB' 'Cached: 11880480 kB' 'SwapCached: 0 kB' 'Active: 8902004 kB' 'Inactive: 4668692 kB' 'Active(anon): 8516580 kB' 'Inactive(anon): 0 kB' 'Active(file): 385424 kB' 'Inactive(file): 4668692 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 1686428 kB' 'Mapped: 197516 kB' 'Shmem: 6833680 kB' 'KReclaimable: 607956 kB' 'Slab: 1126348 kB' 'SReclaimable: 607956 kB' 'SUnreclaim: 518392 kB' 'KernelStack: 19856 kB' 'PageTables: 13592 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36969340 kB' 'Committed_AS: 11096044 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216884 kB' 'VmallocChunk: 0 kB' 'Percpu: 87168 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 
'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 2356180 kB' 'DirectMap2M: 16197632 kB' 'DirectMap1G: 49283072 kB' 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.261 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # 
continue 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.262 
08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.262 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.263 08:16:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.263 08:16:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.263 08:16:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.263 
08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.263 08:16:50 
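After the surplus and reserved scans complete, hugepages.sh reconciles the counts: the trace shows `surp=0`, `resv=0`, `nr_hugepages=1025`, and the check `(( 1025 == nr_hugepages + surp + resv ))`. A hedged sketch of that consistency check (an assumption modeled on the traced logic, not the actual SPDK script; the `mem_f` argument is illustrative):

```shell
#!/usr/bin/env bash
# Sketch only: verify the kernel's hugepage accounting reconciles, i.e.
# HugePages_Total equals the requested count plus surplus and reserved
# pages, as the hugepages.sh trace entries in this log do.
hugepages_consistent() {
    local nr_hugepages=$1 mem_f=${2:-/proc/meminfo}
    local total surp resv
    total=$(awk '/^HugePages_Total:/ {print $2}' "$mem_f")
    surp=$(awk '/^HugePages_Surp:/ {print $2}' "$mem_f")
    resv=$(awk '/^HugePages_Rsvd:/ {print $2}' "$mem_f")
    # Mirrors the traced check: (( 1025 == nr_hugepages + surp + resv ))
    (( total == nr_hugepages + surp + resv ))
}
```

With the values in this log (total 1025, surplus 0, reserved 0), `hugepages_consistent 1025` succeeds, so the test proceeds rather than failing the allocation step.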
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # resv=0 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 00:03:38.263 nr_hugepages=1025 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:38.263 resv_hugepages=0 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:38.263 surplus_hugepages=0 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:38.263 anon_hugepages=0 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ 
-e /sys/devices/system/node/node/meminfo ]] 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.263 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 59260676 kB' 'MemFree: 40211760 kB' 'MemAvailable: 45234520 kB' 'Buffers: 7316 kB' 'Cached: 11880480 kB' 'SwapCached: 0 kB' 'Active: 8901916 kB' 'Inactive: 4668692 kB' 'Active(anon): 8516492 kB' 'Inactive(anon): 0 kB' 'Active(file): 385424 kB' 'Inactive(file): 4668692 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 1686292 kB' 'Mapped: 197516 kB' 'Shmem: 6833680 kB' 'KReclaimable: 607956 kB' 'Slab: 1126344 kB' 'SReclaimable: 607956 kB' 'SUnreclaim: 518388 kB' 'KernelStack: 19824 kB' 'PageTables: 13200 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36969340 kB' 'Committed_AS: 11096064 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216900 kB' 'VmallocChunk: 0 kB' 'Percpu: 87168 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 2356180 kB' 'DirectMap2M: 16197632 kB' 'DirectMap1G: 49283072 kB' 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable 
== \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.264 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal 
== \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ 
AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@27 -- # local node 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 
00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 31555380 
kB' 'MemFree: 17515220 kB' 'MemUsed: 14040160 kB' 'SwapCached: 0 kB' 'Active: 6324084 kB' 'Inactive: 4345156 kB' 'Active(anon): 6111004 kB' 'Inactive(anon): 0 kB' 'Active(file): 213080 kB' 'Inactive(file): 4345156 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10007676 kB' 'Mapped: 124512 kB' 'AnonPages: 664892 kB' 'Shmem: 5449440 kB' 'KernelStack: 11016 kB' 'PageTables: 8920 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 234904 kB' 'Slab: 514028 kB' 'SReclaimable: 234904 kB' 'SUnreclaim: 279124 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.265 08:16:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.265 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.266 08:16:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.266 08:16:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.266 08:16:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.266 08:16:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.266 08:16:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.266 
08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.266 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@17 -- # local get=HugePages_Surp 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27705296 kB' 'MemFree: 22695612 kB' 'MemUsed: 5009684 kB' 'SwapCached: 0 kB' 'Active: 2577592 kB' 'Inactive: 323536 kB' 'Active(anon): 2405248 kB' 'Inactive(anon): 0 kB' 'Active(file): 172344 kB' 'Inactive(file): 323536 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1880172 kB' 'Mapped: 73004 kB' 'AnonPages: 1021156 kB' 'Shmem: 1384292 kB' 'KernelStack: 8760 kB' 'PageTables: 5368 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 373052 kB' 'Slab: 612316 kB' 'SReclaimable: 373052 kB' 'SUnreclaim: 239264 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:03:38.267 08:16:50 
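The `get_meminfo` trace above (setup/common.sh) picks a memory file (`/proc/meminfo`, or a per-node `/sys/devices/system/node/nodeN/meminfo` when a node is given), then splits each `Key: value` line on `': '` and stops at the requested key. A minimal standalone sketch of that scan pattern, reading from stdin instead of a file; the helper name is illustrative, not the SPDK function itself:

```shell
# Scan a meminfo-format stream (stdin) for one key and print its value.
# Mirrors the IFS=': ' read -r var val _ loop seen in the trace above.
get_meminfo_value() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        # val is the number; the trailing "kB" unit, if any, lands in _
        [[ $var == "$get" ]] && { printf '%s\n' "$val"; return 0; }
    done
    return 1
}

# Example against a small sample in the node-meminfo format:
printf '%s\n' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' |
    get_meminfo_value HugePages_Surp    # prints 0
```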
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.267 08:16:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.267 08:16:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.267 08:16:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.267 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.268 08:16:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.268 08:16:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.268 08:16:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.268 08:16:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:03:38.268 node0=512 expecting 513 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:03:38.268 node1=513 expecting 512 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:03:38.268 00:03:38.268 real 0m3.267s 00:03:38.268 user 0m1.318s 00:03:38.268 sys 0m2.014s 00:03:38.268 08:16:50 
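The odd_alloc test ends above with `node0=512 expecting 513`, `node1=513 expecting 512`, yet still passes its `[[ 512 513 == \5\1\2\ \5\1\3 ]]` check. The trick in the `sorted_t`/`sorted_s` lines is that each distinct per-node count is used as an *index* of a scratch array, and indexed bash arrays list their indices in ascending order, so comparing the joined index lists succeeds whenever the same set of counts appears on the system, regardless of which node got which. A condensed sketch with the sample values from the log:

```shell
# Per-node counts the test assigned, and the counts reported back
# (deliberately swapped across nodes, as in the log above).
declare -a nodes_test=(512 513)
declare -a nodes_sys=(513 512)
declare -a sorted_t=() sorted_s=()
for node in "${!nodes_test[@]}"; do
    sorted_t[nodes_test[node]]=1   # count value becomes an array index
    sorted_s[nodes_sys[node]]=1    # indices are listed in ascending order
done
# Both index lists expand to "512 513", so the comparison passes.
[[ "${!sorted_t[*]}" == "${!sorted_s[*]}" ]] && echo PASS   # prints PASS
```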
setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:03:38.268 08:16:50 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:03:38.268 ************************************ 00:03:38.268 END TEST odd_alloc 00:03:38.268 ************************************ 00:03:38.268 08:16:50 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:03:38.268 08:16:50 setup.sh.hugepages -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:03:38.268 08:16:50 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:38.268 08:16:50 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:38.268 08:16:50 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:38.268 ************************************ 00:03:38.268 START TEST custom_alloc 00:03:38.268 ************************************ 00:03:38.268 08:16:50 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1123 -- # custom_alloc 00:03:38.268 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # local IFS=, 00:03:38.268 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@169 -- # local node 00:03:38.268 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # nodes_hp=() 00:03:38.268 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # local nodes_hp 00:03:38.268 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:03:38.268 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:03:38.268 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:03:38.268 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:38.268 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:38.268 08:16:50 
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:03:38.268 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:38.268 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:38.268 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:38.268 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:03:38.268 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:38.268 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:38.268 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:38.268 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:38.268 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:03:38.268 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:38.268 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:03:38.268 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 256 00:03:38.268 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 1 00:03:38.268 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:38.268 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:03:38.268 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:03:38.268 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 0 00:03:38.268 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:38.268 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 
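The `get_test_nr_hugepages_per_node` trace above walks `_no_nodes` down from 2 and assigns `nodes_test[_no_nodes - 1]=256` twice, splitting the 512 requested pages evenly across both NUMA nodes. A simplified sketch of that even split; the function name and the give-remainder-to-node-0 policy are illustrative simplifications, not the SPDK helper:

```shell
# Divide a requested hugepage count across NUMA nodes as evenly as
# possible (remainder handling is a simplification for this sketch).
split_hugepages() {
    local nr=$1 nodes=$2 i
    local -a per_node=()
    for ((i = 0; i < nodes; i++)); do
        per_node[i]=$((nr / nodes))
    done
    # Park any remainder on node 0 so the totals still add up.
    per_node[0]=$((per_node[0] + nr % nodes))
    echo "${per_node[@]}"
}

split_hugepages 512 2   # prints "256 256"
```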
00:03:38.268 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:03:38.268 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:03:38.268 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:03:38.268 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:03:38.269 08:16:50 
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 
00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # setup output 00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:38.269 08:16:50 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:41.565 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:41.565 0000:60:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:41.565 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:41.565 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:41.565 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:41.565 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:41.565 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:41.565 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:41.565 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:41.565 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:41.565 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:41.565 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:41.565 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:41.565 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:41.565 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:41.565 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:41.565 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:41.565 08:16:53 setup.sh.hugepages.custom_alloc -- 
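Before invoking `setup.sh`, the custom_alloc trace above builds `HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'` by appending one `nodes_hp[N]=count` entry per node and joining them with `local IFS=,`. A standalone sketch of that construction (array contents taken from the log; the join helper name is illustrative):

```shell
# Per-node hugepage targets, as set earlier in the custom_alloc trace.
declare -a nodes_hp=([0]=512 [1]=1024)
declare -a HUGENODE=()
for node in "${!nodes_hp[@]}"; do
    HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
done

join_hugenode() {
    local IFS=','          # "${arr[*]}" joins on the first char of IFS
    printf '%s\n' "${HUGENODE[*]}"
}

join_hugenode   # prints nodes_hp[0]=512,nodes_hp[1]=1024
```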
setup/hugepages.sh@188 -- # nr_hugepages=1536 00:03:41.565 08:16:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:03:41.565 08:16:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local node 00:03:41.565 08:16:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:03:41.565 08:16:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:03:41.565 08:16:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local surp 00:03:41.565 08:16:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local resv 00:03:41.565 08:16:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@94 -- # local anon 00:03:41.565 08:16:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:41.565 08:16:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:41.565 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:41.565 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:03:41.565 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:03:41.565 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:41.565 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:41.565 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:41.565 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:41.565 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:41.565 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:41.565 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.565 
08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.565 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 59260676 kB' 'MemFree: 39172044 kB' 'MemAvailable: 44194804 kB' 'Buffers: 7316 kB' 'Cached: 11880636 kB' 'SwapCached: 0 kB' 'Active: 8902852 kB' 'Inactive: 4668692 kB' 'Active(anon): 8517428 kB' 'Inactive(anon): 0 kB' 'Active(file): 385424 kB' 'Inactive(file): 4668692 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 1686552 kB' 'Mapped: 197576 kB' 'Shmem: 6833836 kB' 'KReclaimable: 607956 kB' 'Slab: 1125764 kB' 'SReclaimable: 607956 kB' 'SUnreclaim: 517808 kB' 'KernelStack: 19952 kB' 'PageTables: 14056 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36446076 kB' 'Committed_AS: 11096548 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216804 kB' 'VmallocChunk: 0 kB' 'Percpu: 87168 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 2356180 kB' 'DirectMap2M: 16197632 kB' 'DirectMap1G: 49283072 kB' 00:03:41.565 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.565 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.565 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.565 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.565 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.565 
[... trace condensed: the same four xtrace lines (`continue`; `IFS=': '`; `read -r var val _`; `[[ <key> == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]`) repeat for every remaining /proc/meminfo key (Buffers, Cached, SwapCached, ..., HardwareCorrupted) until the scan reaches AnonHugePages ...]
00:03:41.567 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:41.567 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:03:41.567
08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:03:41.567 08:16:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # anon=0
00:03:41.567 08:16:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:41.567 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:41.567 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=
00:03:41.567 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:03:41.567 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:03:41.567 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:41.567 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:41.567 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:41.567 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:03:41.567 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:41.567 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:41.567 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:41.567 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 59260676 kB' 'MemFree: 39171684 kB' 'MemAvailable: 44194444 kB' 'Buffers: 7316 kB' 'Cached: 11880640 kB' 'SwapCached: 0 kB' 'Active: 8901236 kB' 'Inactive: 4668692 kB' 'Active(anon): 8515812 kB' 'Inactive(anon): 0 kB' 'Active(file): 385424 kB' 'Inactive(file): 4668692 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 1685336 kB' 'Mapped: 197496 kB' 'Shmem: 6833840 kB' 'KReclaimable: 607956 kB' 'Slab: 1125768 kB' 'SReclaimable: 607956 kB' 'SUnreclaim: 517812 kB' 'KernelStack: 19936 kB' 'PageTables: 11052 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36446076 kB' 'Committed_AS: 11096324 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216820 kB' 'VmallocChunk: 0 kB' 'Percpu: 87168 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 2356180 kB' 'DirectMap2M: 16197632 kB' 'DirectMap1G: 49283072 kB'
00:03:41.567 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:41.567 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue
00:03:41.567 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:41.567 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:41.567 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:41.567 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue
00:03:41.567 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:41.567 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:41.567 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:41.567 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue
00:03:41.567 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:41.567 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:41.567
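The xtrace above repeatedly shows `get_meminfo` splitting each `/proc/meminfo` line with `IFS=': '` and `read -r var val _`, skipping keys until the requested one matches. A minimal sketch of that parsing pattern (the function name `get_meminfo_sketch` and the sample file path are illustrative, not the actual `setup/common.sh` source):

```shell
#!/usr/bin/env bash
# Sketch of the key-lookup pattern visible in the trace: split each
# "Key: value unit" line on ':' and spaces, print the value for one key.
get_meminfo_sketch() {
    local get=$1 file=$2 var val _
    while IFS=': ' read -r var val _; do
        # e.g. var=HugePages_Total val=1536, or var=MemTotal val=59260676 _=kB
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done < "$file"
    return 1
}

# Usage with a meminfo-style sample (values taken from the dump above):
printf '%s\n' 'MemTotal: 59260676 kB' 'HugePages_Total: 1536' 'HugePages_Surp: 0' \
    > /tmp/meminfo.sample
get_meminfo_sketch HugePages_Total /tmp/meminfo.sample   # -> 1536
```

The real helper additionally reads per-node files under `/sys/devices/system/node/` and strips the `Node N` prefix, as the `mem=("${mem[@]#Node +([0-9]) }")` line in the trace suggests.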
[... trace condensed: the HugePages_Surp scan repeats the same per-key xtrace pattern (`continue`; `IFS=': '`; `read -r var val _`; `[[ <key> == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]`) for the remaining /proc/meminfo keys (Buffers through ShmemHugePages in this excerpt) ...]
00:03:41.568 08:16:53
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.568 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.568 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.568 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.568 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.568 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.568 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.568 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.568 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.568 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.568 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.568 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.568 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.568 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.568 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.568 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.568 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.568 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.568 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.568 08:16:53 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:41.568 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.568 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.568 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.568 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.568 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.568 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.568 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.568 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.568 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.568 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.568 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # 
return 0 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # surp=0 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 59260676 kB' 'MemFree: 39169164 kB' 'MemAvailable: 44191924 kB' 'Buffers: 7316 kB' 'Cached: 11880640 kB' 'SwapCached: 0 kB' 'Active: 8902040 kB' 'Inactive: 4668692 kB' 'Active(anon): 8516616 kB' 'Inactive(anon): 0 kB' 'Active(file): 385424 kB' 'Inactive(file): 4668692 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 1686148 kB' 'Mapped: 197496 kB' 'Shmem: 6833840 kB' 'KReclaimable: 607956 kB' 'Slab: 1125768 kB' 'SReclaimable: 607956 kB' 'SUnreclaim: 
517812 kB' 'KernelStack: 19936 kB' 'PageTables: 13072 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36446076 kB' 'Committed_AS: 12281124 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216836 kB' 'VmallocChunk: 0 kB' 'Percpu: 87168 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 2356180 kB' 'DirectMap2M: 16197632 kB' 'DirectMap1G: 49283072 kB' 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ 
Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.569 08:16:53 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.569 08:16:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.569 08:16:54 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.569 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.570 08:16:54 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.570 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # resv=0 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:03:41.571 nr_hugepages=1536 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:41.571 resv_hugepages=0 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- 
# echo surplus_hugepages=0 00:03:41.571 surplus_hugepages=0 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:41.571 anon_hugepages=0 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 59260676 kB' 'MemFree: 39167572 kB' 'MemAvailable: 44190332 kB' 'Buffers: 7316 kB' 'Cached: 11880676 kB' 'SwapCached: 0 kB' 'Active: 8902432 kB' 'Inactive: 4668692 kB' 'Active(anon): 8517008 kB' 'Inactive(anon): 0 kB' 'Active(file): 
385424 kB' 'Inactive(file): 4668692 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 1686488 kB' 'Mapped: 197496 kB' 'Shmem: 6833876 kB' 'KReclaimable: 607956 kB' 'Slab: 1125768 kB' 'SReclaimable: 607956 kB' 'SUnreclaim: 517812 kB' 'KernelStack: 19952 kB' 'PageTables: 15748 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36446076 kB' 'Committed_AS: 12281392 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216820 kB' 'VmallocChunk: 0 kB' 'Percpu: 87168 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 2356180 kB' 'DirectMap2M: 16197632 kB' 'DirectMap1G: 49283072 kB' 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.571 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.573 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536 00:03:41.573 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:03:41.573 08:16:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:03:41.573 08:16:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:03:41.573 08:16:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@27 -- # local node 00:03:41.573 08:16:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:41.573 08:16:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:41.573 08:16:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:41.573 08:16:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:41.573 08:16:54 setup.sh.hugepages.custom_alloc --
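The repeated `[[ key == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]` / `continue` lines in the trace are the xtrace of a keyed meminfo lookup. Below is a minimal sketch of that lookup, not a reproduction of SPDK's actual `setup/common.sh`; the `MEMINFO_FILE` override is an illustrative addition for testing, and the real helper selects the file from its `$node` argument.

```shell
#!/usr/bin/env bash
# Sketch of the lookup traced above: fetch one key from /proc/meminfo
# (or a node's sysfs meminfo), skipping non-matching keys with "continue".
# MEMINFO_FILE is a test-only override not present in the real helper.
get_meminfo() {
    local get=$1 node=${2:-}
    local mem_f=${MEMINFO_FILE:-/proc/meminfo}
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local var val _
    # Per-node files prefix every line with "Node <id> "; strip that so
    # the key lands in $var just as it does for /proc/meminfo.
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done < <(sed 's/^Node [0-9]* //' "$mem_f")
    return 1
}
```

Called as `get_meminfo HugePages_Total`, a helper of this shape prints `1536` on the machine in this log; `get_meminfo HugePages_Surp 0` reads node 0's sysfs copy instead, which is the second lookup in the trace.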
setup/hugepages.sh@32 -- # no_nodes=2 00:03:41.573 08:16:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:41.573 08:16:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:41.573 08:16:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:41.573 08:16:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:41.573 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:41.573 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0 00:03:41.573 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:03:41.573 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:41.573 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:41.573 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:41.573 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:41.573 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:41.573 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:41.573 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.573 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.573 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 31555380 kB' 'MemFree: 17518200 kB' 'MemUsed: 14037180 kB' 'SwapCached: 0 kB' 'Active: 6324076 kB' 'Inactive: 4345156 kB' 'Active(anon): 6110996 kB' 'Inactive(anon): 0 kB' 'Active(file): 213080 kB' 'Inactive(file): 4345156 kB' 'Unevictable: 3072 kB' 
'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10007676 kB' 'Mapped: 124512 kB' 'AnonPages: 664820 kB' 'Shmem: 5449440 kB' 'KernelStack: 11144 kB' 'PageTables: 7924 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 234904 kB' 'Slab: 513592 kB' 'SReclaimable: 234904 kB' 'SUnreclaim: 278688 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:41.573 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.573 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.573 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.573 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.573
IFS=': ' 00:03:41.574 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.574 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.574 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.574 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.574 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.575 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.575 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.575 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.575 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.575 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.575 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.575 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.575 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.575 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.575 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:03:41.575 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:03:41.575 08:16:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:41.575 08:16:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:41.575 08:16:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv 
)) 00:03:41.575 08:16:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:41.575 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27705296 kB' 'MemFree: 21653068 kB' 'MemUsed: 6052228 kB' 'SwapCached: 0 kB' 'Active: 2577904 kB' 'Inactive: 323536 kB' 'Active(anon): 2405560 kB' 'Inactive(anon): 0 kB' 'Active(file): 172344 kB' 'Inactive(file): 323536 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1880340 kB' 'Mapped: 72984 kB' 'AnonPages: 1021204 kB' 'Shmem: 1384460 kB' 'KernelStack: 8760 kB' 'PageTables: 5360 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 373052 kB' 'Slab: 612176 kB' 'SReclaimable: 373052 kB' 'SUnreclaim: 239124 kB' 'AnonHugePages: 0 kB' 
'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ 
FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.836 08:16:54 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.836 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.837 08:16:54 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:41.837 node0=512 expecting 512 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- 
# for node in "${!nodes_test[@]}" 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024' 00:03:41.837 node1=1024 expecting 1024 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:03:41.837 00:03:41.837 real 0m3.365s 00:03:41.837 user 0m1.340s 00:03:41.837 sys 0m2.100s 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:03:41.837 08:16:54 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x 00:03:41.837 ************************************ 00:03:41.837 END TEST custom_alloc 00:03:41.837 ************************************ 00:03:41.837 08:16:54 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:03:41.837 08:16:54 setup.sh.hugepages -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc 00:03:41.837 08:16:54 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:41.837 08:16:54 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:41.837 08:16:54 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:41.837 ************************************ 00:03:41.837 START TEST no_shrink_alloc 00:03:41.837 ************************************ 00:03:41.837 08:16:54 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1123 -- # no_shrink_alloc 00:03:41.837 08:16:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0 00:03:41.837 08:16:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:03:41.837 08:16:54 setup.sh.hugepages.no_shrink_alloc -- 
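The trace above is dominated by xtrace output of a meminfo lookup helper. As a hedged sketch (not the actual SPDK `setup/common.sh` source), the pattern visible in the log can be reconstructed as follows; the file paths, the `Node <id> ` prefix stripping, and the `IFS=': '` splitting are taken directly from the trace, while local variable names and control flow are assumptions:

```shell
#!/usr/bin/env bash
# Reconstruction of the get_meminfo helper whose xtrace records appear
# above. Returns the value of one /proc/meminfo key, optionally scoped
# to a NUMA node.
get_meminfo() {
    local get=$1 node=$2
    local var val _ line
    local -a mem
    local mem_f=/proc/meminfo
    # Per-NUMA-node counters live under /sys when a node id is supplied
    # (common.sh@23-24 in the trace).
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    # Node meminfo files prefix every line with "Node <id> "; strip it so
    # the key sits first (common.sh@29 does this with an extglob pattern).
    mem=("${mem[@]#Node $node }")
    for line in "${mem[@]}"; do
        # common.sh@31: split "Key:   value kB" on ':' and spaces.
        IFS=': ' read -r var val _ <<< "$line"
        if [[ $var == "$get" ]]; then
            echo "${val:-0}"
            return 0
        fi
    done
    echo 0  # key absent: report 0, mirroring common.sh@33
}
```

Each `[[ Key == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] ... continue` pair in the log is one non-matching iteration of that loop under xtrace, which is why a single `get_meminfo HugePages_Surp` call emits dozens of trace lines.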
setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:03:41.837 08:16:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # shift 00:03:41.837 08:16:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # node_ids=('0') 00:03:41.837 08:16:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:03:41.837 08:16:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:41.837 08:16:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:41.837 08:16:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:41.837 08:16:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:41.837 08:16:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:41.837 08:16:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:41.837 08:16:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:41.837 08:16:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:41.837 08:16:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:41.837 08:16:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:03:41.837 08:16:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:41.837 08:16:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:03:41.837 08:16:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@73 -- # return 0 00:03:41.837 08:16:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@198 -- # setup output 00:03:41.837 08:16:54 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:41.837 08:16:54 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:45.132 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:45.132 0000:60:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:45.132 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:45.132 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:45.132 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:45.132 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:45.132 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:45.132 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:45.132 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:45.132 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:45.132 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:45.132 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:45.132 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:45.132 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:45.132 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:45.132 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:45.132 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:45.132 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@199 -- # verify_nr_hugepages 00:03:45.132 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:03:45.132 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:03:45.132 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:03:45.132 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:03:45.132 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:03:45.132 08:16:57 
setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:03:45.132 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 59260676 kB' 'MemFree: 40238464 kB' 'MemAvailable: 45261224 kB' 'Buffers: 7316 kB' 'Cached: 11880788 kB' 'SwapCached: 0 kB' 'Active: 8903024 kB' 'Inactive: 4668692 kB' 'Active(anon): 8517600 kB' 'Inactive(anon): 0 kB' 'Active(file): 385424 kB' 'Inactive(file): 4668692 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 
'AnonPages: 1686484 kB' 'Mapped: 197624 kB' 'Shmem: 6833988 kB' 'KReclaimable: 607956 kB' 'Slab: 1125912 kB' 'SReclaimable: 607956 kB' 'SUnreclaim: 517956 kB' 'KernelStack: 19856 kB' 'PageTables: 15888 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36970364 kB' 'Committed_AS: 12282160 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217028 kB' 'VmallocChunk: 0 kB' 'Percpu: 87168 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2356180 kB' 'DirectMap2M: 16197632 kB' 'DirectMap1G: 49283072 kB' 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.133 
08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.133 08:16:57 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.133 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.134 08:16:57 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.134 
08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.134 08:16:57 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:45.134 08:16:57 
setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 59260676 kB' 'MemFree: 40240200 kB' 'MemAvailable: 45262960 kB' 'Buffers: 7316 kB' 'Cached: 11880788 kB' 'SwapCached: 0 kB' 'Active: 8904156 kB' 'Inactive: 4668692 kB' 'Active(anon): 8518732 kB' 'Inactive(anon): 0 kB' 'Active(file): 385424 kB' 'Inactive(file): 4668692 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 1687764 kB' 'Mapped: 197580 kB' 'Shmem: 6833988 kB' 'KReclaimable: 607956 kB' 'Slab: 1125896 kB' 'SReclaimable: 607956 kB' 
'SUnreclaim: 517940 kB' 'KernelStack: 19920 kB' 'PageTables: 14036 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36970364 kB' 'Committed_AS: 11100008 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217012 kB' 'VmallocChunk: 0 kB' 'Percpu: 87168 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2356180 kB' 'DirectMap2M: 16197632 kB' 'DirectMap1G: 49283072 kB' 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.134 08:16:57 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.134 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.135 08:16:57 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 
-- # continue 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.135 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed 
== \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.136 08:16:57 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 59260676 kB' 'MemFree: 40240140 kB' 'MemAvailable: 45262900 kB' 'Buffers: 7316 kB' 'Cached: 11880788 kB' 'SwapCached: 0 kB' 'Active: 8902828 kB' 'Inactive: 4668692 kB' 'Active(anon): 8517404 kB' 'Inactive(anon): 0 kB' 'Active(file): 385424 kB' 'Inactive(file): 4668692 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 1686948 kB' 'Mapped: 197504 kB' 'Shmem: 6833988 kB' 'KReclaimable: 607956 kB' 'Slab: 1125920 kB' 'SReclaimable: 607956 kB' 'SUnreclaim: 517964 kB' 'KernelStack: 19968 kB' 'PageTables: 11136 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36970364 kB' 'Committed_AS: 11098296 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217108 kB' 'VmallocChunk: 0 kB' 'Percpu: 87168 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2356180 kB' 'DirectMap2M: 16197632 kB' 'DirectMap1G: 49283072 kB' 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.136 08:16:57 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.136 
08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.136 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.137 08:16:57 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.137 
08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.137 
08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.137 08:16:57 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.137 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.138 
08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:45.138 nr_hugepages=1024 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:45.138 resv_hugepages=0 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:45.138 surplus_hugepages=0 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:45.138 anon_hugepages=0 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 
00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 59260676 kB' 'MemFree: 40235448 kB' 'MemAvailable: 45258208 kB' 'Buffers: 7316 kB' 'Cached: 11880788 kB' 'SwapCached: 0 kB' 'Active: 8905220 kB' 'Inactive: 4668692 kB' 'Active(anon): 8519796 kB' 'Inactive(anon): 0 kB' 'Active(file): 385424 kB' 'Inactive(file): 4668692 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 1689388 kB' 'Mapped: 197504 kB' 'Shmem: 6833988 kB' 'KReclaimable: 607956 kB' 'Slab: 1125920 kB' 'SReclaimable: 607956 kB' 'SUnreclaim: 517964 kB' 'KernelStack: 20464 kB' 'PageTables: 16136 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36970364 kB' 'Committed_AS: 12284588 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217684 kB' 'VmallocChunk: 0 kB' 'Percpu: 87168 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 
kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2356180 kB' 'DirectMap2M: 16197632 kB' 'DirectMap1G: 49283072 kB' 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.138 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.139 08:16:57 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.139 08:16:57 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.139 08:16:57 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.139 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.140 08:16:57 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # 
no_nodes=2 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 31555380 kB' 'MemFree: 16493136 kB' 'MemUsed: 15062244 kB' 'SwapCached: 0 kB' 'Active: 6326848 kB' 'Inactive: 4345156 kB' 'Active(anon): 6113768 kB' 'Inactive(anon): 0 kB' 'Active(file): 213080 kB' 'Inactive(file): 4345156 kB' 
'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10007680 kB' 'Mapped: 124512 kB' 'AnonPages: 667728 kB' 'Shmem: 5449444 kB' 'KernelStack: 11992 kB' 'PageTables: 10292 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 234904 kB' 'Slab: 514328 kB' 'SReclaimable: 234904 kB' 'SUnreclaim: 279424 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.140 08:16:57 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.140 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.141 08:16:57 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.141 
08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.141 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:45.141 
08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [ repetitive trace elided: setup/common.sh@31-32 compares each remaining /proc/meminfo field (KernelStack through HugePages_Free) against HugePages_Surp and skips it with continue; the matching HugePages_Surp line terminates the loop ] 00:03:45.142 08:16:57 setup.sh.hugepages.no_shrink_alloc --
setup/common.sh@33 -- # echo 0 00:03:45.142 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:45.142 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:45.142 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:45.142 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:45.142 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:45.142 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:03:45.142 node0=1024 expecting 1024 00:03:45.142 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:45.142 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no 00:03:45.142 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # NRHUGE=512 00:03:45.142 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # setup output 00:03:45.142 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:45.142 08:16:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:48.433 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:48.433 0000:60:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:48.433 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:48.433 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:48.433 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:48.433 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:48.433 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:48.433 0000:00:04.1 (8086 2021): Already using the vfio-pci 
driver 00:03:48.433 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:48.433 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:48.433 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:48.433 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:48.433 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:48.433 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:48.433 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:48.433 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:48.433 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:48.433 INFO: Requested 512 hugepages but 1024 already allocated on node0 00:03:48.433 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@204 -- # verify_nr_hugepages 00:03:48.433 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:03:48.433 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:03:48.433 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:03:48.433 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:03:48.433 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:03:48.433 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:03:48.433 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:48.433 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:48.433 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:48.433 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:03:48.433 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 
00:03:48.433 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:48.433 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:48.433 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:48.433 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:48.433 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:48.433 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:48.433 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.433 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.433 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 59260676 kB' 'MemFree: 40252168 kB' 'MemAvailable: 45274928 kB' 'Buffers: 7316 kB' 'Cached: 11880928 kB' 'SwapCached: 0 kB' 'Active: 8902988 kB' 'Inactive: 4668692 kB' 'Active(anon): 8517564 kB' 'Inactive(anon): 0 kB' 'Active(file): 385424 kB' 'Inactive(file): 4668692 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 1686328 kB' 'Mapped: 197564 kB' 'Shmem: 6834128 kB' 'KReclaimable: 607956 kB' 'Slab: 1126064 kB' 'SReclaimable: 607956 kB' 'SUnreclaim: 518108 kB' 'KernelStack: 19888 kB' 'PageTables: 17188 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36970364 kB' 'Committed_AS: 13467352 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216980 kB' 'VmallocChunk: 0 kB' 'Percpu: 87168 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 
'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2356180 kB' 'DirectMap2M: 16197632 kB' 'DirectMap1G: 49283072 kB' 00:03:48.433 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [ repetitive trace elided: each /proc/meminfo field (MemTotal through HardwareCorrupted) is compared against AnonHugePages and skipped with continue ] 00:03:48.435 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.435 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:03:48.435 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:48.435 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:03:48.435 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:48.435 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:48.435 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:03:48.435 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:48.435 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:48.435 08:17:00
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:48.435 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:48.435 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:48.435 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:48.435 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:48.435 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.435 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.435 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 59260676 kB' 'MemFree: 40252464 kB' 'MemAvailable: 45275224 kB' 'Buffers: 7316 kB' 'Cached: 11880936 kB' 'SwapCached: 0 kB' 'Active: 8903208 kB' 'Inactive: 4668692 kB' 'Active(anon): 8517784 kB' 'Inactive(anon): 0 kB' 'Active(file): 385424 kB' 'Inactive(file): 4668692 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 1686572 kB' 'Mapped: 197600 kB' 'Shmem: 6834136 kB' 'KReclaimable: 607956 kB' 'Slab: 1126112 kB' 'SReclaimable: 607956 kB' 'SUnreclaim: 518156 kB' 'KernelStack: 19856 kB' 'PageTables: 15868 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36970364 kB' 'Committed_AS: 12282836 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216964 kB' 'VmallocChunk: 0 kB' 'Percpu: 87168 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 
2097152 kB' 'DirectMap4k: 2356180 kB' 'DirectMap2M: 16197632 kB' 'DirectMap1G: 49283072 kB' 00:03:48.435 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [ repetitive trace elided: each /proc/meminfo field is compared against HugePages_Surp and skipped with continue ] 00:03:48.435 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.436 08:17:00 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.436 08:17:00 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.436 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 
00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 59260676 kB' 'MemFree: 40252644 kB' 'MemAvailable: 45275404 kB' 'Buffers: 7316 kB' 'Cached: 11880956 kB' 'SwapCached: 0 kB' 'Active: 8902220 kB' 'Inactive: 4668692 kB' 'Active(anon): 8516796 kB' 'Inactive(anon): 0 kB' 'Active(file): 385424 kB' 'Inactive(file): 4668692 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 1686008 kB' 'Mapped: 197524 kB' 'Shmem: 6834156 kB' 'KReclaimable: 607956 kB' 'Slab: 1126080 kB' 
'SReclaimable: 607956 kB' 'SUnreclaim: 518124 kB' 'KernelStack: 19824 kB' 'PageTables: 13156 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36970364 kB' 'Committed_AS: 11098076 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216932 kB' 'VmallocChunk: 0 kB' 'Percpu: 87168 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2356180 kB' 'DirectMap2M: 16197632 kB' 'DirectMap1G: 49283072 kB' 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.437 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 
-- # continue 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed 
== \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.438 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.439 08:17:00 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:48.439 nr_hugepages=1024 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:48.439 resv_hugepages=0 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:48.439 surplus_hugepages=0 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:48.439 anon_hugepages=0 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 
-- # mapfile -t mem 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 59260676 kB' 'MemFree: 40246596 kB' 'MemAvailable: 45269356 kB' 'Buffers: 7316 kB' 'Cached: 11880976 kB' 'SwapCached: 0 kB' 'Active: 8902048 kB' 'Inactive: 4668692 kB' 'Active(anon): 8516624 kB' 'Inactive(anon): 0 kB' 'Active(file): 385424 kB' 'Inactive(file): 4668692 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 1685808 kB' 'Mapped: 197524 kB' 'Shmem: 6834176 kB' 'KReclaimable: 607956 kB' 'Slab: 1126080 kB' 'SReclaimable: 607956 kB' 'SUnreclaim: 518124 kB' 'KernelStack: 19808 kB' 'PageTables: 13312 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36970364 kB' 'Committed_AS: 12282632 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216948 kB' 'VmallocChunk: 0 kB' 'Percpu: 87168 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2356180 kB' 'DirectMap2M: 16197632 kB' 'DirectMap1G: 49283072 kB' 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.439 08:17:00 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.439 08:17:00 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:03:48.439 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.440 08:17:00 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.440 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.441 08:17:00 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@18 -- # local node=0 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 31555380 kB' 'MemFree: 16494744 kB' 'MemUsed: 15060636 kB' 'SwapCached: 0 kB' 'Active: 6322788 kB' 'Inactive: 4345156 kB' 'Active(anon): 6109708 kB' 'Inactive(anon): 0 kB' 'Active(file): 213080 kB' 'Inactive(file): 4345156 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10007684 kB' 'Mapped: 124516 kB' 'AnonPages: 663524 kB' 'Shmem: 5449448 kB' 'KernelStack: 11048 kB' 'PageTables: 10064 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 234904 kB' 'Slab: 513888 kB' 'SReclaimable: 234904 kB' 'SUnreclaim: 278984 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.441 08:17:00 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.441 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.442 
08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.442 08:17:00 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.442 
08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.442 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.443 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.443 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.443 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:48.443 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.443 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.443 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.443 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:03:48.443 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:48.443 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:48.443 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:48.443 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:48.443 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:48.443 08:17:00 
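The trace above shows `setup/common.sh` scanning a meminfo file one `Key: value` pair at a time until the requested key matches, then echoing the value. A minimal standalone sketch of that loop (simplified from the traced script; it handles the system-wide `/proc/meminfo` format, whereas the real helper also strips the `Node <n> ` prefix that per-node sysfs meminfo files carry):

```shell
#!/usr/bin/env bash
# Sketch of the key scan traced above: walk "Key: value" lines and print
# the value for the requested key. Modeled on setup/common.sh's get_meminfo,
# but simplified; the file argument lets it run against any meminfo-shaped file.
get_meminfo() {
    local get=$1 mem_f=${2:-/proc/meminfo} var val _
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "$val"     # value only; units column (e.g. "kB") is dropped
            return 0
        fi
    done < "$mem_f"
    return 1                # key not present
}
```

Because the loop reads with `IFS=': '`, trailing fields such as the `kB` unit land in `_` and are discarded, which is why the traced comparisons like `(( 1024 == nr_hugepages + surp + resv ))` work on bare numbers.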
setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:03:48.443 node0=1024 expecting 1024 00:03:48.443 08:17:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:48.443 00:03:48.443 real 0m6.452s 00:03:48.443 user 0m2.524s 00:03:48.443 sys 0m4.011s 00:03:48.443 08:17:00 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:03:48.443 08:17:00 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:03:48.443 ************************************ 00:03:48.443 END TEST no_shrink_alloc 00:03:48.443 ************************************ 00:03:48.443 08:17:00 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:03:48.443 08:17:00 setup.sh.hugepages -- setup/hugepages.sh@217 -- # clear_hp 00:03:48.443 08:17:00 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:03:48.443 08:17:00 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:48.443 08:17:00 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:48.443 08:17:00 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:48.443 08:17:00 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:48.443 08:17:00 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:48.443 08:17:00 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:48.443 08:17:00 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:48.443 08:17:00 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:48.443 08:17:00 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:48.443 08:17:00 setup.sh.hugepages -- 
setup/hugepages.sh@41 -- # echo 0 00:03:48.443 08:17:00 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:48.443 08:17:00 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:48.443 00:03:48.443 real 0m26.884s 00:03:48.443 user 0m9.660s 00:03:48.443 sys 0m14.523s 00:03:48.443 08:17:00 setup.sh.hugepages -- common/autotest_common.sh@1124 -- # xtrace_disable 00:03:48.443 08:17:00 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:48.443 ************************************ 00:03:48.443 END TEST hugepages 00:03:48.443 ************************************ 00:03:48.443 08:17:00 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:03:48.443 08:17:00 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:03:48.443 08:17:00 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:48.443 08:17:00 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:48.443 08:17:00 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:03:48.443 ************************************ 00:03:48.443 START TEST driver 00:03:48.443 ************************************ 00:03:48.443 08:17:00 setup.sh.driver -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:03:48.443 * Looking for test storage... 
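The `clear_hp` calls traced at the end of the hugepages test write `0` into every per-node hugepage pool. A sketch of that reset, assuming the standard sysfs layout; the root-directory parameter is an addition for safe testing and is not in the original script:

```shell
#!/usr/bin/env bash
# Sketch of the per-node hugepage reset seen in setup/hugepages.sh's clear_hp:
# for every NUMA node, zero each hugepage pool size. $1 overrides the sysfs
# root (illustrative parameter; the real script always uses the live path).
clear_hp() {
    local root=${1:-/sys/devices/system/node} node hp
    for node in "$root"/node*; do
        for hp in "$node"/hugepages/hugepages-*; do
            # Guard against unmatched globs expanding to their literal pattern.
            [[ -d $hp ]] && echo 0 > "$hp/nr_hugepages"
        done
    done
}
```

Writing `0` to `nr_hugepages` asks the kernel to shrink that pool to nothing, which is why the test suite exports `CLEAR_HUGE=yes` afterwards: subsequent setup runs know the pools start empty.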
00:03:48.443 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:48.443 08:17:00 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:03:48.443 08:17:00 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:48.443 08:17:00 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:52.632 08:17:04 setup.sh.driver -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:03:52.632 08:17:04 setup.sh.driver -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:52.632 08:17:04 setup.sh.driver -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:52.632 08:17:04 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:03:52.632 ************************************ 00:03:52.632 START TEST guess_driver 00:03:52.632 ************************************ 00:03:52.632 08:17:04 setup.sh.driver.guess_driver -- common/autotest_common.sh@1123 -- # guess_driver 00:03:52.632 08:17:04 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:03:52.632 08:17:04 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:03:52.632 08:17:04 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:03:52.632 08:17:04 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:03:52.632 08:17:04 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:03:52.632 08:17:04 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:03:52.632 08:17:04 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:03:52.632 08:17:04 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:03:52.632 08:17:04 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:03:52.632 08:17:04 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- 
# (( 221 > 0 )) 00:03:52.632 08:17:04 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:03:52.632 08:17:04 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:03:52.632 08:17:04 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:03:52.632 08:17:04 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:03:52.632 08:17:04 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:03:52.632 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:03:52.632 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:03:52.632 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:03:52.632 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:03:52.632 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:03:52.632 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:03:52.632 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:03:52.632 08:17:04 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:03:52.632 08:17:04 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:03:52.632 08:17:04 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:03:52.632 08:17:04 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:03:52.632 08:17:04 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:03:52.632 Looking for driver=vfio-pci 00:03:52.632 08:17:04 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:52.632 08:17:04 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- # setup output 
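The driver guess traced above boils down to: if the IOMMU is active (non-empty `/sys/kernel/iommu_groups`, 221 groups here) and `vfio_pci` resolves to loadable modules, pick `vfio-pci`. A condensed sketch of that decision (the real `setup/driver.sh` also checks the unsafe-noiommu parameter and falls back to uio drivers; the directory parameter here is illustrative, added so the logic can be exercised without a live IOMMU):

```shell
#!/usr/bin/env bash
# Sketch of the vfio choice in setup/driver.sh: prefer vfio-pci when IOMMU
# groups exist and the module's dependency chain resolves. modprobe output
# is only inspected, nothing is loaded.
pick_driver() {
    local dir=${1:-/sys/kernel/iommu_groups}
    local groups=("$dir"/*)
    if [[ -e ${groups[0]} ]] &&
       modprobe --show-depends vfio_pci 2>/dev/null | grep -q '\.ko'; then
        echo vfio-pci
    else
        echo 'No valid driver found'
    fi
}
```

The `grep -q '\.ko'` mirrors the traced `== *\.\k\o*` glob match against the `insmod .../vfio-pci.ko.xz` lines that `modprobe --show-depends` prints.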
config 00:03:52.632 08:17:04 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:03:52.632 08:17:04 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # 
[[ vfio-pci == vfio-pci ]] 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ 
_ _ _ marker setup_driver 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:55.919 08:17:08 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:59.212 08:17:11 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:59.212 08:17:11 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:59.212 08:17:11 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:59.212 08:17:11 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 00:03:59.212 08:17:11 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:03:59.212 08:17:11 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:59.212 08:17:11 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:03.401 00:04:03.401 real 0m10.587s 00:04:03.401 user 0m2.455s 00:04:03.401 sys 0m4.287s 00:04:03.401 08:17:15 setup.sh.driver.guess_driver -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:03.401 08:17:15 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:04:03.401 ************************************ 00:04:03.401 END TEST guess_driver 00:04:03.401 ************************************ 00:04:03.401 08:17:15 setup.sh.driver -- common/autotest_common.sh@1142 -- # return 0 00:04:03.401 00:04:03.401 real 0m14.853s 00:04:03.401 user 0m3.725s 00:04:03.401 sys 0m6.519s 00:04:03.401 08:17:15 setup.sh.driver -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:03.401 08:17:15 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:04:03.401 ************************************ 00:04:03.401 END TEST driver 00:04:03.401 ************************************ 00:04:03.401 08:17:15 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:04:03.401 08:17:15 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:04:03.401 08:17:15 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:03.401 08:17:15 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:03.401 08:17:15 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:03.402 ************************************ 00:04:03.402 START TEST devices 00:04:03.402 ************************************ 00:04:03.402 08:17:15 setup.sh.devices -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:04:03.402 * Looking for test storage... 
00:04:03.402 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:03.402 08:17:15 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:04:03.402 08:17:15 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:04:03.402 08:17:15 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:03.402 08:17:15 setup.sh.devices -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:06.690 08:17:18 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:04:06.690 08:17:18 setup.sh.devices -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:04:06.690 08:17:18 setup.sh.devices -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:04:06.690 08:17:18 setup.sh.devices -- common/autotest_common.sh@1670 -- # local nvme bdf 00:04:06.690 08:17:18 setup.sh.devices -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:04:06.690 08:17:18 setup.sh.devices -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:04:06.690 08:17:18 setup.sh.devices -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:04:06.690 08:17:18 setup.sh.devices -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:06.690 08:17:18 setup.sh.devices -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:04:06.690 08:17:18 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:04:06.690 08:17:18 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:04:06.690 08:17:18 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:04:06.690 08:17:18 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:04:06.690 08:17:18 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:04:06.690 08:17:18 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:04:06.690 08:17:18 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 
00:04:06.690 08:17:18 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:04:06.690 08:17:18 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:60:00.0 00:04:06.690 08:17:18 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\6\0\:\0\0\.\0* ]] 00:04:06.690 08:17:18 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:04:06.690 08:17:18 setup.sh.devices -- scripts/common.sh@378 -- # local block=nvme0n1 pt 00:04:06.690 08:17:18 setup.sh.devices -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:04:06.690 No valid GPT data, bailing 00:04:06.690 08:17:18 setup.sh.devices -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:06.690 08:17:18 setup.sh.devices -- scripts/common.sh@391 -- # pt= 00:04:06.690 08:17:18 setup.sh.devices -- scripts/common.sh@392 -- # return 1 00:04:06.690 08:17:18 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:04:06.690 08:17:18 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:04:06.690 08:17:18 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:04:06.690 08:17:18 setup.sh.devices -- setup/common.sh@80 -- # echo 4000787030016 00:04:06.690 08:17:18 setup.sh.devices -- setup/devices.sh@204 -- # (( 4000787030016 >= min_disk_size )) 00:04:06.690 08:17:18 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:04:06.690 08:17:18 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:60:00.0 00:04:06.690 08:17:18 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:04:06.690 08:17:18 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:04:06.690 08:17:18 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:04:06.690 08:17:18 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:06.690 08:17:18 setup.sh.devices -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:04:06.690 08:17:18 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:04:06.690 ************************************ 00:04:06.690 START TEST nvme_mount 00:04:06.690 ************************************ 00:04:06.690 08:17:18 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1123 -- # nvme_mount 00:04:06.690 08:17:18 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:04:06.690 08:17:18 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:04:06.690 08:17:18 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:06.690 08:17:18 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:06.690 08:17:18 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:04:06.690 08:17:18 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:06.690 08:17:18 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:04:06.690 08:17:18 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:04:06.690 08:17:18 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:06.690 08:17:18 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:04:06.690 08:17:18 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:04:06.690 08:17:18 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:04:06.690 08:17:18 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:06.690 08:17:18 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:06.690 08:17:18 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:04:06.690 08:17:18 
setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:06.690 08:17:18 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:06.690 08:17:18 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:06.690 08:17:18 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:04:07.258 Creating new GPT entries in memory. 00:04:07.258 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:07.258 other utilities. 00:04:07.258 08:17:19 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:04:07.258 08:17:19 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:07.258 08:17:19 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:07.258 08:17:19 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:07.258 08:17:19 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:08.195 Creating new GPT entries in memory. 00:04:08.196 The operation has completed successfully. 
00:04:08.196 08:17:20 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:04:08.196 08:17:20 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:08.196 08:17:20 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 1302254 00:04:08.196 08:17:20 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:08.196 08:17:20 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size= 00:04:08.196 08:17:20 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:08.196 08:17:20 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:04:08.196 08:17:20 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:04:08.455 08:17:20 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:08.455 08:17:20 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:60:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:08.455 08:17:20 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:60:00.0 00:04:08.455 08:17:20 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:04:08.455 08:17:20 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:08.455 08:17:20 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:08.455 
08:17:20 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:04:08.455 08:17:20 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:08.455 08:17:20 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:04:08.455 08:17:20 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:04:08.455 08:17:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:08.455 08:17:20 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:60:00.0 00:04:08.455 08:17:20 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:04:08.455 08:17:20 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:08.455 08:17:20 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:60:00.0 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:11.751 08:17:23 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r 
pci _ _ status 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:04:11.751 
08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:11.751 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:11.751 08:17:23 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:11.751 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:11.751 /dev/nvme0n1: 8 bytes were erased at offset 0x3a3817d5e00 (gpt): 45 46 49 20 50 41 52 54 00:04:11.751 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:04:11.751 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:11.751 08:17:24 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:04:11.751 08:17:24 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:04:11.751 08:17:24 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:11.751 08:17:24 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:04:11.751 08:17:24 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:04:11.751 08:17:24 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:11.751 08:17:24 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:60:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:11.751 08:17:24 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:60:00.0 00:04:11.751 08:17:24 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:04:11.751 08:17:24 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:11.751 08:17:24 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:11.751 08:17:24 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:04:11.751 08:17:24 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:11.751 08:17:24 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:04:11.751 08:17:24 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:04:11.751 08:17:24 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:11.751 08:17:24 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:60:00.0 00:04:11.751 08:17:24 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:04:11.751 08:17:24 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:11.751 08:17:24 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:15.041 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:60:00.0 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 
00:04:15.041 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:04:15.041 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:04:15.041 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.041 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:15.041 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.041 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:15.041 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.041 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:15.041 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.041 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:15.041 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.041 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:15.041 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.041 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:15.041 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.041 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:15.041 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.041 
08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:15.041 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.042 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:15.042 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.042 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:15.042 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.042 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:15.042 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.042 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:15.042 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.042 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:15.042 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.042 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:15.042 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.042 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:15.042 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.042 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:15.042 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # 
read -r pci _ _ status 00:04:15.042 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:15.042 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:15.042 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:15.042 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:15.042 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:15.042 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:15.042 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:60:00.0 data@nvme0n1 '' '' 00:04:15.042 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:60:00.0 00:04:15.042 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:04:15.042 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:04:15.042 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:04:15.042 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:04:15.042 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:15.042 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:04:15.042 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.042 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:60:00.0 00:04:15.042 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # 
setup output config 00:04:15.042 08:17:27 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:15.042 08:17:27 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:60:00.0 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 
0000:00:04.2 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:18.368 08:17:30 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:18.368 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:18.368 00:04:18.368 real 0m11.721s 00:04:18.368 user 0m3.505s 00:04:18.368 sys 0m6.069s 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:18.368 08:17:30 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:04:18.368 ************************************ 00:04:18.368 END TEST nvme_mount 00:04:18.368 ************************************ 00:04:18.368 08:17:30 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0 00:04:18.368 08:17:30 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:04:18.368 08:17:30 setup.sh.devices -- 
common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:18.368 08:17:30 setup.sh.devices -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:18.368 08:17:30 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:04:18.368 ************************************ 00:04:18.368 START TEST dm_mount 00:04:18.368 ************************************ 00:04:18.368 08:17:30 setup.sh.devices.dm_mount -- common/autotest_common.sh@1123 -- # dm_mount 00:04:18.368 08:17:30 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:04:18.368 08:17:30 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:04:18.368 08:17:30 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:04:18.368 08:17:30 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:04:18.368 08:17:30 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:18.368 08:17:30 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:04:18.368 08:17:30 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:04:18.368 08:17:30 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:18.368 08:17:30 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:04:18.368 08:17:30 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:04:18.368 08:17:30 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:04:18.368 08:17:30 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:18.368 08:17:30 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:18.368 08:17:30 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:04:18.368 08:17:30 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:18.368 08:17:30 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:18.368 08:17:30 
setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:04:18.368 08:17:30 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:18.368 08:17:30 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:18.368 08:17:30 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:18.368 08:17:30 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:04:18.936 Creating new GPT entries in memory. 00:04:18.936 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:18.936 other utilities. 00:04:18.937 08:17:31 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:04:18.937 08:17:31 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:18.937 08:17:31 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:18.937 08:17:31 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:18.937 08:17:31 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:20.317 Creating new GPT entries in memory. 00:04:20.317 The operation has completed successfully. 00:04:20.317 08:17:32 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:04:20.317 08:17:32 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:20.317 08:17:32 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 
2048 : part_end + 1 )) 00:04:20.317 08:17:32 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:20.317 08:17:32 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:04:21.263 The operation has completed successfully. 00:04:21.263 08:17:33 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:04:21.263 08:17:33 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:21.263 08:17:33 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 1307100 00:04:21.263 08:17:33 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:04:21.263 08:17:33 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:21.263 08:17:33 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:21.263 08:17:33 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:04:21.263 08:17:33 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:04:21.263 08:17:33 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:21.263 08:17:33 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:04:21.263 08:17:33 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:21.263 08:17:33 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:04:21.263 08:17:33 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:04:21.263 08:17:33 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:04:21.263 08:17:33 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:04:21.263 08:17:33 setup.sh.devices.dm_mount -- 
setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:04:21.263 08:17:33 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:21.263 08:17:33 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount size= 00:04:21.263 08:17:33 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:21.263 08:17:33 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:21.263 08:17:33 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:04:21.263 08:17:33 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:21.263 08:17:33 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:60:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:21.263 08:17:33 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:60:00.0 00:04:21.263 08:17:33 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:04:21.263 08:17:33 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:21.263 08:17:33 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:21.263 08:17:33 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:04:21.263 08:17:33 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:21.263 08:17:33 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:04:21.263 08:17:33 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:04:21.263 08:17:33 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.263 08:17:33 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:60:00.0 00:04:21.263 08:17:33 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:04:21.263 08:17:33 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:21.263 08:17:33 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:60:00.0 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount ]] 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:24.552 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:24.553 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:24.553 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:24.553 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:60:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:04:24.553 08:17:36 
setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:60:00.0 00:04:24.553 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:04:24.553 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:04:24.553 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:04:24.553 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:04:24.553 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:24.553 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:04:24.553 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.553 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:60:00.0 00:04:24.553 08:17:36 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:04:24.553 08:17:36 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:24.553 08:17:36 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:27.843 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:60:00.0 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:27.843 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:04:27.843 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:04:27.843 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.843 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:27.843 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:04:27.843 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:27.843 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.843 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:27.843 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.843 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:27.843 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.843 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:27.843 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.843 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:27.843 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.843 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:27.843 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.843 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:27.843 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.843 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:27.843 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.843 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:27.844 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:04:27.844 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:27.844 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.844 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:27.844 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.844 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:27.844 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.844 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:27.844 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.844 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:27.844 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.844 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\6\0\:\0\0\.\0 ]] 00:04:27.844 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.844 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:27.844 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:27.844 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:04:27.844 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:04:27.844 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:27.844 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:04:27.844 
08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:04:27.844 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:27.844 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:04:27.844 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:27.844 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:04:27.844 08:17:39 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:04:27.844 00:04:27.844 real 0m9.528s 00:04:27.844 user 0m2.505s 00:04:27.844 sys 0m4.040s 00:04:27.844 08:17:39 setup.sh.devices.dm_mount -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:27.844 08:17:39 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:04:27.844 ************************************ 00:04:27.844 END TEST dm_mount 00:04:27.844 ************************************ 00:04:27.844 08:17:39 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0 00:04:27.844 08:17:39 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:04:27.844 08:17:39 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:04:27.844 08:17:39 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:27.844 08:17:39 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:27.844 08:17:39 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:27.844 08:17:39 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:27.844 08:17:39 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:27.844 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:27.844 /dev/nvme0n1: 8 bytes were erased at offset 0x3a3817d5e00 (gpt): 45 46 49 20 50 41 52 54 00:04:27.844 /dev/nvme0n1: 2 
bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:04:27.844 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:27.844 08:17:40 setup.sh.devices -- setup/devices.sh@12 -- # cleanup_dm 00:04:27.844 08:17:40 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:27.844 08:17:40 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:04:27.844 08:17:40 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:27.844 08:17:40 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:04:27.844 08:17:40 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:04:27.844 08:17:40 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:04:27.844 00:04:27.844 real 0m24.619s 00:04:27.844 user 0m7.110s 00:04:27.844 sys 0m12.114s 00:04:27.844 08:17:40 setup.sh.devices -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:27.844 08:17:40 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:04:27.844 ************************************ 00:04:27.844 END TEST devices 00:04:27.844 ************************************ 00:04:27.844 08:17:40 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:04:27.844 00:04:27.844 real 1m29.012s 00:04:27.844 user 0m27.146s 00:04:27.844 sys 0m45.283s 00:04:27.844 08:17:40 setup.sh -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:27.844 08:17:40 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:27.844 ************************************ 00:04:27.844 END TEST setup.sh 00:04:27.844 ************************************ 00:04:27.844 08:17:40 -- common/autotest_common.sh@1142 -- # return 0 00:04:27.844 08:17:40 -- spdk/autotest.sh@128 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:04:31.131 Hugepages 00:04:31.131 node hugesize free / total 00:04:31.131 node0 1048576kB 0 / 0 00:04:31.131 node0 
2048kB 1024 / 1024 00:04:31.131 node1 1048576kB 0 / 0 00:04:31.131 node1 2048kB 1024 / 1024 00:04:31.131 00:04:31.131 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:31.131 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:04:31.131 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:04:31.131 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:04:31.131 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:04:31.131 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:04:31.131 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:04:31.131 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:04:31.131 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:04:31.131 NVMe 0000:60:00.0 8086 0a54 0 nvme nvme0 nvme0n1 00:04:31.131 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:04:31.131 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:04:31.131 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:04:31.131 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:04:31.131 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:04:31.131 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:04:31.131 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:04:31.131 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:04:31.131 08:17:43 -- spdk/autotest.sh@130 -- # uname -s 00:04:31.131 08:17:43 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]] 00:04:31.131 08:17:43 -- spdk/autotest.sh@132 -- # nvme_namespace_revert 00:04:31.131 08:17:43 -- common/autotest_common.sh@1531 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:33.665 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:33.665 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:33.665 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:33.665 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:33.665 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:33.665 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:33.665 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:33.665 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:33.665 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 
00:04:33.665 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:33.665 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:33.665 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:33.665 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:33.665 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:33.665 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:33.665 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:36.991 0000:60:00.0 (8086 0a54): nvme -> vfio-pci 00:04:36.991 08:17:49 -- common/autotest_common.sh@1532 -- # sleep 1 00:04:37.923 08:17:50 -- common/autotest_common.sh@1533 -- # bdfs=() 00:04:37.923 08:17:50 -- common/autotest_common.sh@1533 -- # local bdfs 00:04:37.923 08:17:50 -- common/autotest_common.sh@1534 -- # bdfs=($(get_nvme_bdfs)) 00:04:37.923 08:17:50 -- common/autotest_common.sh@1534 -- # get_nvme_bdfs 00:04:37.923 08:17:50 -- common/autotest_common.sh@1513 -- # bdfs=() 00:04:37.923 08:17:50 -- common/autotest_common.sh@1513 -- # local bdfs 00:04:37.923 08:17:50 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:04:37.923 08:17:50 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:04:37.923 08:17:50 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:04:38.181 08:17:50 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:04:38.181 08:17:50 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:60:00.0 00:04:38.181 08:17:50 -- common/autotest_common.sh@1536 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:41.488 Waiting for block devices as requested 00:04:41.488 0000:60:00.0 (8086 0a54): vfio-pci -> nvme 00:04:41.488 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:04:41.488 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:04:41.488 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:04:41.488 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:04:41.488 
0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:04:41.488 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:04:41.747 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:04:41.747 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:04:41.747 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:04:42.006 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:04:42.006 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:04:42.006 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:04:42.006 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:04:42.264 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:04:42.264 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:04:42.264 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:04:42.521 08:17:54 -- common/autotest_common.sh@1538 -- # for bdf in "${bdfs[@]}" 00:04:42.521 08:17:54 -- common/autotest_common.sh@1539 -- # get_nvme_ctrlr_from_bdf 0000:60:00.0 00:04:42.521 08:17:54 -- common/autotest_common.sh@1502 -- # grep 0000:60:00.0/nvme/nvme 00:04:42.521 08:17:54 -- common/autotest_common.sh@1502 -- # readlink -f /sys/class/nvme/nvme0 00:04:42.521 08:17:54 -- common/autotest_common.sh@1502 -- # bdf_sysfs_path=/sys/devices/pci0000:5d/0000:5d:03.0/0000:60:00.0/nvme/nvme0 00:04:42.521 08:17:54 -- common/autotest_common.sh@1503 -- # [[ -z /sys/devices/pci0000:5d/0000:5d:03.0/0000:60:00.0/nvme/nvme0 ]] 00:04:42.521 08:17:54 -- common/autotest_common.sh@1507 -- # basename /sys/devices/pci0000:5d/0000:5d:03.0/0000:60:00.0/nvme/nvme0 00:04:42.521 08:17:54 -- common/autotest_common.sh@1507 -- # printf '%s\n' nvme0 00:04:42.521 08:17:54 -- common/autotest_common.sh@1539 -- # nvme_ctrlr=/dev/nvme0 00:04:42.521 08:17:54 -- common/autotest_common.sh@1540 -- # [[ -z /dev/nvme0 ]] 00:04:42.521 08:17:54 -- common/autotest_common.sh@1545 -- # grep oacs 00:04:42.521 08:17:54 -- common/autotest_common.sh@1545 -- # nvme id-ctrl /dev/nvme0 00:04:42.521 08:17:54 -- common/autotest_common.sh@1545 -- # cut -d: -f2 00:04:42.521 08:17:54 -- common/autotest_common.sh@1545 -- # 
oacs=' 0xe' 00:04:42.521 08:17:54 -- common/autotest_common.sh@1546 -- # oacs_ns_manage=8 00:04:42.521 08:17:54 -- common/autotest_common.sh@1548 -- # [[ 8 -ne 0 ]] 00:04:42.521 08:17:54 -- common/autotest_common.sh@1554 -- # nvme id-ctrl /dev/nvme0 00:04:42.521 08:17:54 -- common/autotest_common.sh@1554 -- # grep unvmcap 00:04:42.521 08:17:54 -- common/autotest_common.sh@1554 -- # cut -d: -f2 00:04:42.521 08:17:54 -- common/autotest_common.sh@1554 -- # unvmcap=' 0' 00:04:42.521 08:17:54 -- common/autotest_common.sh@1555 -- # [[ 0 -eq 0 ]] 00:04:42.521 08:17:54 -- common/autotest_common.sh@1557 -- # continue 00:04:42.521 08:17:54 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup 00:04:42.521 08:17:54 -- common/autotest_common.sh@728 -- # xtrace_disable 00:04:42.521 08:17:54 -- common/autotest_common.sh@10 -- # set +x 00:04:42.521 08:17:54 -- spdk/autotest.sh@138 -- # timing_enter afterboot 00:04:42.521 08:17:54 -- common/autotest_common.sh@722 -- # xtrace_disable 00:04:42.521 08:17:54 -- common/autotest_common.sh@10 -- # set +x 00:04:42.521 08:17:54 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:45.054 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:45.054 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:45.054 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:45.054 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:45.054 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:45.054 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:45.054 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:45.054 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:45.054 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:45.054 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:45.054 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:45.054 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:45.054 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:45.054 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 
00:04:45.054 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:45.314 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:48.606 0000:60:00.0 (8086 0a54): nvme -> vfio-pci 00:04:48.606 08:18:00 -- spdk/autotest.sh@140 -- # timing_exit afterboot 00:04:48.606 08:18:00 -- common/autotest_common.sh@728 -- # xtrace_disable 00:04:48.606 08:18:00 -- common/autotest_common.sh@10 -- # set +x 00:04:48.606 08:18:00 -- spdk/autotest.sh@144 -- # opal_revert_cleanup 00:04:48.606 08:18:00 -- common/autotest_common.sh@1591 -- # mapfile -t bdfs 00:04:48.606 08:18:00 -- common/autotest_common.sh@1591 -- # get_nvme_bdfs_by_id 0x0a54 00:04:48.606 08:18:00 -- common/autotest_common.sh@1577 -- # bdfs=() 00:04:48.606 08:18:00 -- common/autotest_common.sh@1577 -- # local bdfs 00:04:48.606 08:18:00 -- common/autotest_common.sh@1579 -- # get_nvme_bdfs 00:04:48.606 08:18:00 -- common/autotest_common.sh@1513 -- # bdfs=() 00:04:48.606 08:18:00 -- common/autotest_common.sh@1513 -- # local bdfs 00:04:48.606 08:18:00 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:04:48.606 08:18:00 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:04:48.606 08:18:00 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:04:48.606 08:18:01 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:04:48.606 08:18:01 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:60:00.0 00:04:48.606 08:18:01 -- common/autotest_common.sh@1579 -- # for bdf in $(get_nvme_bdfs) 00:04:48.606 08:18:01 -- common/autotest_common.sh@1580 -- # cat /sys/bus/pci/devices/0000:60:00.0/device 00:04:48.606 08:18:01 -- common/autotest_common.sh@1580 -- # device=0x0a54 00:04:48.606 08:18:01 -- common/autotest_common.sh@1581 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:04:48.606 08:18:01 -- common/autotest_common.sh@1582 -- # bdfs+=($bdf) 00:04:48.606 08:18:01 -- common/autotest_common.sh@1586 -- # 
printf '%s\n' 0000:60:00.0 00:04:48.606 08:18:01 -- common/autotest_common.sh@1592 -- # [[ -z 0000:60:00.0 ]] 00:04:48.606 08:18:01 -- common/autotest_common.sh@1597 -- # spdk_tgt_pid=1317518 00:04:48.606 08:18:01 -- common/autotest_common.sh@1598 -- # waitforlisten 1317518 00:04:48.606 08:18:01 -- common/autotest_common.sh@1596 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:04:48.606 08:18:01 -- common/autotest_common.sh@829 -- # '[' -z 1317518 ']' 00:04:48.606 08:18:01 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:48.606 08:18:01 -- common/autotest_common.sh@834 -- # local max_retries=100 00:04:48.606 08:18:01 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:48.606 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:04:48.606 08:18:01 -- common/autotest_common.sh@838 -- # xtrace_disable 00:04:48.606 08:18:01 -- common/autotest_common.sh@10 -- # set +x 00:04:48.865 [2024-07-23 08:18:01.149291] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:04:48.865 [2024-07-23 08:18:01.149380] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1317518 ] 00:04:48.865 [2024-07-23 08:18:01.263597] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:49.124 [2024-07-23 08:18:01.480987] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:04:50.064 08:18:02 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:04:50.064 08:18:02 -- common/autotest_common.sh@862 -- # return 0 00:04:50.064 08:18:02 -- common/autotest_common.sh@1600 -- # bdf_id=0 00:04:50.064 08:18:02 -- common/autotest_common.sh@1601 -- # for bdf in "${bdfs[@]}" 00:04:50.064 08:18:02 -- common/autotest_common.sh@1602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:60:00.0 00:04:53.355 nvme0n1 00:04:53.355 08:18:05 -- common/autotest_common.sh@1604 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:04:53.355 [2024-07-23 08:18:05.593121] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:04:53.355 request: 00:04:53.355 { 00:04:53.355 "nvme_ctrlr_name": "nvme0", 00:04:53.355 "password": "test", 00:04:53.355 "method": "bdev_nvme_opal_revert", 00:04:53.355 "req_id": 1 00:04:53.355 } 00:04:53.355 Got JSON-RPC error response 00:04:53.355 response: 00:04:53.355 { 00:04:53.355 "code": -32602, 00:04:53.355 "message": "Invalid parameters" 00:04:53.355 } 00:04:53.355 08:18:05 -- common/autotest_common.sh@1604 -- # true 00:04:53.355 08:18:05 -- common/autotest_common.sh@1605 -- # (( ++bdf_id )) 00:04:53.355 08:18:05 -- common/autotest_common.sh@1608 -- # killprocess 1317518 00:04:53.355 08:18:05 -- common/autotest_common.sh@948 -- # '[' -z 1317518 ']' 00:04:53.355 08:18:05 -- 
common/autotest_common.sh@952 -- # kill -0 1317518 00:04:53.355 08:18:05 -- common/autotest_common.sh@953 -- # uname 00:04:53.355 08:18:05 -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:04:53.355 08:18:05 -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1317518 00:04:53.355 08:18:05 -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:04:53.355 08:18:05 -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:04:53.355 08:18:05 -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1317518' 00:04:53.355 killing process with pid 1317518 00:04:53.355 08:18:05 -- common/autotest_common.sh@967 -- # kill 1317518 00:04:53.355 08:18:05 -- common/autotest_common.sh@972 -- # wait 1317518 00:04:59.966 08:18:11 -- spdk/autotest.sh@150 -- # '[' 0 -eq 1 ']' 00:04:59.966 08:18:11 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']' 00:04:59.966 08:18:11 -- spdk/autotest.sh@155 -- # [[ 1 -eq 1 ]] 00:04:59.966 08:18:11 -- spdk/autotest.sh@156 -- # [[ 0 -eq 1 ]] 00:04:59.966 08:18:11 -- spdk/autotest.sh@159 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/qat_setup.sh 00:04:59.966 Restarting all devices. 00:05:03.269 lstat() error: No such file or directory 00:05:03.269 QAT Error: No GENERAL section found 00:05:03.269 Failed to configure qat_dev0 00:05:03.269 lstat() error: No such file or directory 00:05:03.269 QAT Error: No GENERAL section found 00:05:03.269 Failed to configure qat_dev1 00:05:03.269 lstat() error: No such file or directory 00:05:03.269 QAT Error: No GENERAL section found 00:05:03.269 Failed to configure qat_dev2 00:05:03.269 enable sriov 00:05:03.269 Checking status of all devices. 
00:05:03.269 There is 3 QAT acceleration device(s) in the system: 00:05:03.269 qat_dev0 - type: c6xx, inst_id: 0, node_id: 1, bsf: 0000:b1:00.0, #accel: 5 #engines: 10 state: down 00:05:03.269 qat_dev1 - type: c6xx, inst_id: 1, node_id: 1, bsf: 0000:b3:00.0, #accel: 5 #engines: 10 state: down 00:05:03.269 qat_dev2 - type: c6xx, inst_id: 2, node_id: 1, bsf: 0000:b5:00.0, #accel: 5 #engines: 10 state: down 00:05:03.837 0000:b1:00.0 set to 16 VFs 00:05:04.406 0000:b3:00.0 set to 16 VFs 00:05:05.343 0000:b5:00.0 set to 16 VFs 00:05:06.721 Properly configured the qat device with driver uio_pci_generic. 00:05:06.721 08:18:19 -- spdk/autotest.sh@162 -- # timing_enter lib 00:05:06.721 08:18:19 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:06.721 08:18:19 -- common/autotest_common.sh@10 -- # set +x 00:05:06.721 08:18:19 -- spdk/autotest.sh@164 -- # [[ 0 -eq 1 ]] 00:05:06.721 08:18:19 -- spdk/autotest.sh@168 -- # run_test env /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:05:06.721 08:18:19 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:06.721 08:18:19 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:06.721 08:18:19 -- common/autotest_common.sh@10 -- # set +x 00:05:06.721 ************************************ 00:05:06.721 START TEST env 00:05:06.721 ************************************ 00:05:06.721 08:18:19 env -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:05:06.721 * Looking for test storage... 
00:05:06.721 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env 00:05:06.721 08:18:19 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:05:06.721 08:18:19 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:06.721 08:18:19 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:06.721 08:18:19 env -- common/autotest_common.sh@10 -- # set +x 00:05:06.721 ************************************ 00:05:06.721 START TEST env_memory 00:05:06.721 ************************************ 00:05:06.721 08:18:19 env.env_memory -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:05:06.721 00:05:06.721 00:05:06.721 CUnit - A unit testing framework for C - Version 2.1-3 00:05:06.721 http://cunit.sourceforge.net/ 00:05:06.721 00:05:06.721 00:05:06.721 Suite: memory 00:05:06.721 Test: alloc and free memory map ...[2024-07-23 08:18:19.214258] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:05:06.981 passed 00:05:06.981 Test: mem map translation ...[2024-07-23 08:18:19.254025] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:05:06.981 [2024-07-23 08:18:19.254053] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:05:06.981 [2024-07-23 08:18:19.254104] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:05:06.981 [2024-07-23 08:18:19.254120] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 
600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:05:06.981 passed 00:05:06.981 Test: mem map registration ...[2024-07-23 08:18:19.316468] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:05:06.981 [2024-07-23 08:18:19.316502] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:05:06.981 passed 00:05:06.981 Test: mem map adjacent registrations ...passed 00:05:06.981 00:05:06.981 Run Summary: Type Total Ran Passed Failed Inactive 00:05:06.981 suites 1 1 n/a 0 0 00:05:06.981 tests 4 4 4 0 0 00:05:06.981 asserts 152 152 152 0 n/a 00:05:06.981 00:05:06.981 Elapsed time = 0.225 seconds 00:05:06.981 00:05:06.981 real 0m0.259s 00:05:06.981 user 0m0.240s 00:05:06.981 sys 0m0.018s 00:05:06.981 08:18:19 env.env_memory -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:06.981 08:18:19 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:05:06.981 ************************************ 00:05:06.981 END TEST env_memory 00:05:06.981 ************************************ 00:05:06.981 08:18:19 env -- common/autotest_common.sh@1142 -- # return 0 00:05:06.981 08:18:19 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:06.981 08:18:19 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:06.981 08:18:19 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:06.981 08:18:19 env -- common/autotest_common.sh@10 -- # set +x 00:05:06.981 ************************************ 00:05:06.981 START TEST env_vtophys 00:05:06.981 ************************************ 00:05:06.981 08:18:19 env.env_vtophys -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:07.241 
EAL: lib.eal log level changed from notice to debug 00:05:07.241 EAL: Detected lcore 0 as core 0 on socket 0 00:05:07.241 EAL: Detected lcore 1 as core 1 on socket 0 00:05:07.241 EAL: Detected lcore 2 as core 2 on socket 0 00:05:07.241 EAL: Detected lcore 3 as core 3 on socket 0 00:05:07.241 EAL: Detected lcore 4 as core 4 on socket 0 00:05:07.241 EAL: Detected lcore 5 as core 5 on socket 0 00:05:07.241 EAL: Detected lcore 6 as core 6 on socket 0 00:05:07.241 EAL: Detected lcore 7 as core 8 on socket 0 00:05:07.241 EAL: Detected lcore 8 as core 9 on socket 0 00:05:07.241 EAL: Detected lcore 9 as core 10 on socket 0 00:05:07.241 EAL: Detected lcore 10 as core 11 on socket 0 00:05:07.241 EAL: Detected lcore 11 as core 12 on socket 0 00:05:07.241 EAL: Detected lcore 12 as core 13 on socket 0 00:05:07.241 EAL: Detected lcore 13 as core 16 on socket 0 00:05:07.241 EAL: Detected lcore 14 as core 17 on socket 0 00:05:07.241 EAL: Detected lcore 15 as core 18 on socket 0 00:05:07.241 EAL: Detected lcore 16 as core 19 on socket 0 00:05:07.241 EAL: Detected lcore 17 as core 20 on socket 0 00:05:07.241 EAL: Detected lcore 18 as core 21 on socket 0 00:05:07.241 EAL: Detected lcore 19 as core 25 on socket 0 00:05:07.241 EAL: Detected lcore 20 as core 26 on socket 0 00:05:07.241 EAL: Detected lcore 21 as core 27 on socket 0 00:05:07.241 EAL: Detected lcore 22 as core 28 on socket 0 00:05:07.241 EAL: Detected lcore 23 as core 29 on socket 0 00:05:07.241 EAL: Detected lcore 24 as core 0 on socket 1 00:05:07.241 EAL: Detected lcore 25 as core 1 on socket 1 00:05:07.241 EAL: Detected lcore 26 as core 2 on socket 1 00:05:07.241 EAL: Detected lcore 27 as core 3 on socket 1 00:05:07.241 EAL: Detected lcore 28 as core 4 on socket 1 00:05:07.241 EAL: Detected lcore 29 as core 5 on socket 1 00:05:07.241 EAL: Detected lcore 30 as core 6 on socket 1 00:05:07.241 EAL: Detected lcore 31 as core 9 on socket 1 00:05:07.241 EAL: Detected lcore 32 as core 10 on socket 1 00:05:07.241 EAL: Detected 
lcore 33 as core 11 on socket 1 00:05:07.241 EAL: Detected lcore 34 as core 12 on socket 1 00:05:07.241 EAL: Detected lcore 35 as core 13 on socket 1 00:05:07.241 EAL: Detected lcore 36 as core 16 on socket 1 00:05:07.241 EAL: Detected lcore 37 as core 17 on socket 1 00:05:07.241 EAL: Detected lcore 38 as core 18 on socket 1 00:05:07.241 EAL: Detected lcore 39 as core 19 on socket 1 00:05:07.241 EAL: Detected lcore 40 as core 20 on socket 1 00:05:07.241 EAL: Detected lcore 41 as core 21 on socket 1 00:05:07.241 EAL: Detected lcore 42 as core 24 on socket 1 00:05:07.241 EAL: Detected lcore 43 as core 25 on socket 1 00:05:07.241 EAL: Detected lcore 44 as core 26 on socket 1 00:05:07.241 EAL: Detected lcore 45 as core 27 on socket 1 00:05:07.241 EAL: Detected lcore 46 as core 28 on socket 1 00:05:07.241 EAL: Detected lcore 47 as core 29 on socket 1 00:05:07.241 EAL: Detected lcore 48 as core 0 on socket 0 00:05:07.241 EAL: Detected lcore 49 as core 1 on socket 0 00:05:07.241 EAL: Detected lcore 50 as core 2 on socket 0 00:05:07.241 EAL: Detected lcore 51 as core 3 on socket 0 00:05:07.241 EAL: Detected lcore 52 as core 4 on socket 0 00:05:07.241 EAL: Detected lcore 53 as core 5 on socket 0 00:05:07.241 EAL: Detected lcore 54 as core 6 on socket 0 00:05:07.241 EAL: Detected lcore 55 as core 8 on socket 0 00:05:07.241 EAL: Detected lcore 56 as core 9 on socket 0 00:05:07.241 EAL: Detected lcore 57 as core 10 on socket 0 00:05:07.241 EAL: Detected lcore 58 as core 11 on socket 0 00:05:07.241 EAL: Detected lcore 59 as core 12 on socket 0 00:05:07.241 EAL: Detected lcore 60 as core 13 on socket 0 00:05:07.241 EAL: Detected lcore 61 as core 16 on socket 0 00:05:07.241 EAL: Detected lcore 62 as core 17 on socket 0 00:05:07.241 EAL: Detected lcore 63 as core 18 on socket 0 00:05:07.242 EAL: Detected lcore 64 as core 19 on socket 0 00:05:07.242 EAL: Detected lcore 65 as core 20 on socket 0 00:05:07.242 EAL: Detected lcore 66 as core 21 on socket 0 00:05:07.242 EAL: Detected 
lcore 67 as core 25 on socket 0 00:05:07.242 EAL: Detected lcore 68 as core 26 on socket 0 00:05:07.242 EAL: Detected lcore 69 as core 27 on socket 0 00:05:07.242 EAL: Detected lcore 70 as core 28 on socket 0 00:05:07.242 EAL: Detected lcore 71 as core 29 on socket 0 00:05:07.242 EAL: Detected lcore 72 as core 0 on socket 1 00:05:07.242 EAL: Detected lcore 73 as core 1 on socket 1 00:05:07.242 EAL: Detected lcore 74 as core 2 on socket 1 00:05:07.242 EAL: Detected lcore 75 as core 3 on socket 1 00:05:07.242 EAL: Detected lcore 76 as core 4 on socket 1 00:05:07.242 EAL: Detected lcore 77 as core 5 on socket 1 00:05:07.242 EAL: Detected lcore 78 as core 6 on socket 1 00:05:07.242 EAL: Detected lcore 79 as core 9 on socket 1 00:05:07.242 EAL: Detected lcore 80 as core 10 on socket 1 00:05:07.242 EAL: Detected lcore 81 as core 11 on socket 1 00:05:07.242 EAL: Detected lcore 82 as core 12 on socket 1 00:05:07.242 EAL: Detected lcore 83 as core 13 on socket 1 00:05:07.242 EAL: Detected lcore 84 as core 16 on socket 1 00:05:07.242 EAL: Detected lcore 85 as core 17 on socket 1 00:05:07.242 EAL: Detected lcore 86 as core 18 on socket 1 00:05:07.242 EAL: Detected lcore 87 as core 19 on socket 1 00:05:07.242 EAL: Detected lcore 88 as core 20 on socket 1 00:05:07.242 EAL: Detected lcore 89 as core 21 on socket 1 00:05:07.242 EAL: Detected lcore 90 as core 24 on socket 1 00:05:07.242 EAL: Detected lcore 91 as core 25 on socket 1 00:05:07.242 EAL: Detected lcore 92 as core 26 on socket 1 00:05:07.242 EAL: Detected lcore 93 as core 27 on socket 1 00:05:07.242 EAL: Detected lcore 94 as core 28 on socket 1 00:05:07.242 EAL: Detected lcore 95 as core 29 on socket 1 00:05:07.242 EAL: Maximum logical cores by configuration: 128 00:05:07.242 EAL: Detected CPU lcores: 96 00:05:07.242 EAL: Detected NUMA nodes: 2 00:05:07.242 EAL: Checking presence of .so 'librte_eal.so.24.1' 00:05:07.242 EAL: Detected shared linkage of DPDK 00:05:07.242 EAL: No shared files mode enabled, IPC will be 
disabled 00:05:07.242 EAL: No shared files mode enabled, IPC is disabled 00:05:07.242 EAL: PCI driver qat for device 0000:b1:01.0 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b1:01.1 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b1:01.2 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b1:01.3 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b1:01.4 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b1:01.5 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b1:01.6 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b1:01.7 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b1:02.0 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b1:02.1 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b1:02.2 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b1:02.3 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b1:02.4 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b1:02.5 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b1:02.6 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b1:02.7 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b3:01.0 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b3:01.1 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b3:01.2 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b3:01.3 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b3:01.4 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b3:01.5 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b3:01.6 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b3:01.7 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b3:02.0 wants IOVA as 'PA' 00:05:07.242 EAL: PCI 
driver qat for device 0000:b3:02.1 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b3:02.2 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b3:02.3 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b3:02.4 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b3:02.5 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b3:02.6 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b3:02.7 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b5:01.0 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b5:01.1 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b5:01.2 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b5:01.3 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b5:01.4 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b5:01.5 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b5:01.6 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b5:01.7 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b5:02.0 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b5:02.1 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b5:02.2 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b5:02.3 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b5:02.4 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b5:02.5 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b5:02.6 wants IOVA as 'PA' 00:05:07.242 EAL: PCI driver qat for device 0000:b5:02.7 wants IOVA as 'PA' 00:05:07.242 EAL: Bus pci wants IOVA as 'PA' 00:05:07.242 EAL: Bus auxiliary wants IOVA as 'DC' 00:05:07.242 EAL: Bus vdev wants IOVA as 'DC' 00:05:07.242 EAL: Selected IOVA mode 'PA' 00:05:07.242 EAL: Probing VFIO support... 
00:05:07.242 EAL: IOMMU type 1 (Type 1) is supported 00:05:07.242 EAL: IOMMU type 7 (sPAPR) is not supported 00:05:07.242 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:05:07.242 EAL: VFIO support initialized 00:05:07.242 EAL: Ask a virtual area of 0x2e000 bytes 00:05:07.242 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:05:07.242 EAL: Setting up physically contiguous memory... 00:05:07.242 EAL: Setting maximum number of open files to 524288 00:05:07.242 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:05:07.242 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:05:07.242 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:05:07.242 EAL: Ask a virtual area of 0x61000 bytes 00:05:07.242 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:05:07.242 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:07.242 EAL: Ask a virtual area of 0x400000000 bytes 00:05:07.242 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:05:07.242 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:05:07.242 EAL: Ask a virtual area of 0x61000 bytes 00:05:07.242 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:05:07.242 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:07.242 EAL: Ask a virtual area of 0x400000000 bytes 00:05:07.242 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:05:07.242 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:05:07.242 EAL: Ask a virtual area of 0x61000 bytes 00:05:07.242 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:05:07.242 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:07.242 EAL: Ask a virtual area of 0x400000000 bytes 00:05:07.242 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:05:07.242 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:05:07.242 EAL: Ask a virtual area of 0x61000 bytes 00:05:07.242 EAL: 
Virtual area found at 0x200c00600000 (size = 0x61000) 00:05:07.242 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:07.242 EAL: Ask a virtual area of 0x400000000 bytes 00:05:07.242 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:05:07.242 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:05:07.242 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:05:07.242 EAL: Ask a virtual area of 0x61000 bytes 00:05:07.242 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:05:07.242 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:07.242 EAL: Ask a virtual area of 0x400000000 bytes 00:05:07.242 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:05:07.242 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:05:07.242 EAL: Ask a virtual area of 0x61000 bytes 00:05:07.242 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:05:07.242 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:07.242 EAL: Ask a virtual area of 0x400000000 bytes 00:05:07.242 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:05:07.242 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:05:07.242 EAL: Ask a virtual area of 0x61000 bytes 00:05:07.242 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:05:07.242 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:07.242 EAL: Ask a virtual area of 0x400000000 bytes 00:05:07.242 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:05:07.242 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:05:07.242 EAL: Ask a virtual area of 0x61000 bytes 00:05:07.242 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:05:07.242 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:07.242 EAL: Ask a virtual area of 0x400000000 bytes 00:05:07.242 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 
00:05:07.242 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:05:07.242 EAL: Hugepages will be freed exactly as allocated. 00:05:07.242 EAL: No shared files mode enabled, IPC is disabled 00:05:07.242 EAL: No shared files mode enabled, IPC is disabled 00:05:07.242 EAL: TSC frequency is ~2100000 KHz 00:05:07.242 EAL: Main lcore 0 is ready (tid=7fc56cb36b40;cpuset=[0]) 00:05:07.242 EAL: Trying to obtain current memory policy. 00:05:07.242 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:07.242 EAL: Restoring previous memory policy: 0 00:05:07.242 EAL: request: mp_malloc_sync 00:05:07.242 EAL: No shared files mode enabled, IPC is disabled 00:05:07.242 EAL: Heap on socket 0 was expanded by 2MB 00:05:07.242 EAL: PCI device 0000:b1:01.0 on NUMA socket 1 00:05:07.242 EAL: probe driver: 8086:37c9 qat 00:05:07.242 EAL: PCI memory mapped at 0x202001000000 00:05:07.242 EAL: PCI memory mapped at 0x202001001000 00:05:07.242 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:01.0 (socket 1) 00:05:07.242 EAL: Trying to obtain current memory policy. 
00:05:07.243 EAL: Setting policy MPOL_PREFERRED for socket 1 00:05:07.243 EAL: Restoring previous memory policy: 4 00:05:07.243 EAL: request: mp_malloc_sync 00:05:07.243 EAL: No shared files mode enabled, IPC is disabled 00:05:07.243 EAL: Heap on socket 1 was expanded by 2MB 00:05:07.243 EAL: PCI device 0000:b1:01.1 on NUMA socket 1 00:05:07.243 EAL: probe driver: 8086:37c9 qat 00:05:07.243 EAL: PCI memory mapped at 0x202001002000 00:05:07.243 EAL: PCI memory mapped at 0x202001003000 00:05:07.243 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:01.1 (socket 1) 00:05:07.243 EAL: PCI device 0000:b1:01.2 on NUMA socket 1 00:05:07.243 EAL: probe driver: 8086:37c9 qat 00:05:07.243 EAL: PCI memory mapped at 0x202001004000 00:05:07.243 EAL: PCI memory mapped at 0x202001005000 00:05:07.243 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:01.2 (socket 1) 00:05:07.243 EAL: PCI device 0000:b1:01.3 on NUMA socket 1 00:05:07.243 EAL: probe driver: 8086:37c9 qat 00:05:07.243 EAL: PCI memory mapped at 0x202001006000 00:05:07.243 EAL: PCI memory mapped at 0x202001007000 00:05:07.243 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:01.3 (socket 1) 00:05:07.243 EAL: PCI device 0000:b1:01.4 on NUMA socket 1 00:05:07.243 EAL: probe driver: 8086:37c9 qat 00:05:07.243 EAL: PCI memory mapped at 0x202001008000 00:05:07.243 EAL: PCI memory mapped at 0x202001009000 00:05:07.243 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:01.4 (socket 1) 00:05:07.243 EAL: PCI device 0000:b1:01.5 on NUMA socket 1 00:05:07.243 EAL: probe driver: 8086:37c9 qat 00:05:07.243 EAL: PCI memory mapped at 0x20200100a000 00:05:07.243 EAL: PCI memory mapped at 0x20200100b000 00:05:07.243 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:01.5 (socket 1) 00:05:07.243 EAL: PCI device 0000:b1:01.6 on NUMA socket 1 00:05:07.243 EAL: probe driver: 8086:37c9 qat 00:05:07.243 EAL: PCI memory mapped at 0x20200100c000 00:05:07.243 EAL: PCI memory mapped at 0x20200100d000 00:05:07.243 EAL: Probe 
PCI driver: qat (8086:37c9) device: 0000:b1:01.6 (socket 1) 00:05:07.243 EAL: PCI device 0000:b1:01.7 on NUMA socket 1 00:05:07.243 EAL: probe driver: 8086:37c9 qat 00:05:07.243 EAL: PCI memory mapped at 0x20200100e000 00:05:07.243 EAL: PCI memory mapped at 0x20200100f000 00:05:07.243 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:01.7 (socket 1) 00:05:07.243 EAL: PCI device 0000:b1:02.0 on NUMA socket 1 00:05:07.243 EAL: probe driver: 8086:37c9 qat 00:05:07.243 EAL: PCI memory mapped at 0x202001010000 00:05:07.243 EAL: PCI memory mapped at 0x202001011000 00:05:07.243 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:02.0 (socket 1) 00:05:07.243 EAL: PCI device 0000:b1:02.1 on NUMA socket 1 00:05:07.243 EAL: probe driver: 8086:37c9 qat 00:05:07.243 EAL: PCI memory mapped at 0x202001012000 00:05:07.243 EAL: PCI memory mapped at 0x202001013000 00:05:07.243 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:02.1 (socket 1) 00:05:07.243 EAL: PCI device 0000:b1:02.2 on NUMA socket 1 00:05:07.243 EAL: probe driver: 8086:37c9 qat 00:05:07.243 EAL: PCI memory mapped at 0x202001014000 00:05:07.243 EAL: PCI memory mapped at 0x202001015000 00:05:07.243 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:02.2 (socket 1) 00:05:07.243 EAL: PCI device 0000:b1:02.3 on NUMA socket 1 00:05:07.243 EAL: probe driver: 8086:37c9 qat 00:05:07.243 EAL: PCI memory mapped at 0x202001016000 00:05:07.243 EAL: PCI memory mapped at 0x202001017000 00:05:07.243 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:02.3 (socket 1) 00:05:07.243 EAL: PCI device 0000:b1:02.4 on NUMA socket 1 00:05:07.243 EAL: probe driver: 8086:37c9 qat 00:05:07.243 EAL: PCI memory mapped at 0x202001018000 00:05:07.243 EAL: PCI memory mapped at 0x202001019000 00:05:07.243 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:02.4 (socket 1) 00:05:07.243 EAL: PCI device 0000:b1:02.5 on NUMA socket 1 00:05:07.243 EAL: probe driver: 8086:37c9 qat 00:05:07.243 EAL: PCI memory mapped at 
0x20200101a000 00:05:07.243 EAL: PCI memory mapped at 0x20200101b000 00:05:07.243 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:02.5 (socket 1) 00:05:07.243 EAL: PCI device 0000:b1:02.6 on NUMA socket 1 00:05:07.243 EAL: probe driver: 8086:37c9 qat 00:05:07.243 EAL: PCI memory mapped at 0x20200101c000 00:05:07.243 EAL: PCI memory mapped at 0x20200101d000 00:05:07.243 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:02.6 (socket 1) 00:05:07.243 EAL: PCI device 0000:b1:02.7 on NUMA socket 1 00:05:07.243 EAL: probe driver: 8086:37c9 qat 00:05:07.243 EAL: PCI memory mapped at 0x20200101e000 00:05:07.243 EAL: PCI memory mapped at 0x20200101f000 00:05:07.243 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:02.7 (socket 1) 00:05:07.243 EAL: PCI device 0000:b3:01.0 on NUMA socket 1 00:05:07.243 EAL: probe driver: 8086:37c9 qat 00:05:07.243 EAL: PCI memory mapped at 0x202001020000 00:05:07.243 EAL: PCI memory mapped at 0x202001021000 00:05:07.243 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:01.0 (socket 1) 00:05:07.243 EAL: PCI device 0000:b3:01.1 on NUMA socket 1 00:05:07.243 EAL: probe driver: 8086:37c9 qat 00:05:07.243 EAL: PCI memory mapped at 0x202001022000 00:05:07.243 EAL: PCI memory mapped at 0x202001023000 00:05:07.243 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:01.1 (socket 1) 00:05:07.243 EAL: PCI device 0000:b3:01.2 on NUMA socket 1 00:05:07.243 EAL: probe driver: 8086:37c9 qat 00:05:07.243 EAL: PCI memory mapped at 0x202001024000 00:05:07.243 EAL: PCI memory mapped at 0x202001025000 00:05:07.243 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:01.2 (socket 1) 00:05:07.243 EAL: PCI device 0000:b3:01.3 on NUMA socket 1 00:05:07.243 EAL: probe driver: 8086:37c9 qat 00:05:07.243 EAL: PCI memory mapped at 0x202001026000 00:05:07.243 EAL: PCI memory mapped at 0x202001027000 00:05:07.243 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:01.3 (socket 1) 00:05:07.243 EAL: PCI device 0000:b3:01.4 on NUMA socket 1 
00:05:07.243 EAL: probe driver: 8086:37c9 qat 00:05:07.243 EAL: PCI memory mapped at 0x202001028000 00:05:07.243 EAL: PCI memory mapped at 0x202001029000 00:05:07.243 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:01.4 (socket 1) 00:05:07.243 EAL: PCI device 0000:b3:01.5 on NUMA socket 1 00:05:07.243 EAL: probe driver: 8086:37c9 qat 00:05:07.243 EAL: PCI memory mapped at 0x20200102a000 00:05:07.243 EAL: PCI memory mapped at 0x20200102b000 00:05:07.243 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:01.5 (socket 1) 00:05:07.243 EAL: PCI device 0000:b3:01.6 on NUMA socket 1 00:05:07.243 EAL: probe driver: 8086:37c9 qat 00:05:07.243 EAL: PCI memory mapped at 0x20200102c000 00:05:07.243 EAL: PCI memory mapped at 0x20200102d000 00:05:07.243 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:01.6 (socket 1) 00:05:07.243 EAL: PCI device 0000:b3:01.7 on NUMA socket 1 00:05:07.243 EAL: probe driver: 8086:37c9 qat 00:05:07.243 EAL: PCI memory mapped at 0x20200102e000 00:05:07.243 EAL: PCI memory mapped at 0x20200102f000 00:05:07.243 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:01.7 (socket 1) 00:05:07.243 EAL: PCI device 0000:b3:02.0 on NUMA socket 1 00:05:07.243 EAL: probe driver: 8086:37c9 qat 00:05:07.243 EAL: PCI memory mapped at 0x202001030000 00:05:07.243 EAL: PCI memory mapped at 0x202001031000 00:05:07.243 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:02.0 (socket 1) 00:05:07.243 EAL: PCI device 0000:b3:02.1 on NUMA socket 1 00:05:07.243 EAL: probe driver: 8086:37c9 qat 00:05:07.243 EAL: PCI memory mapped at 0x202001032000 00:05:07.243 EAL: PCI memory mapped at 0x202001033000 00:05:07.243 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:02.1 (socket 1) 00:05:07.243 EAL: PCI device 0000:b3:02.2 on NUMA socket 1 00:05:07.243 EAL: probe driver: 8086:37c9 qat 00:05:07.243 EAL: PCI memory mapped at 0x202001034000 00:05:07.243 EAL: PCI memory mapped at 0x202001035000 00:05:07.243 EAL: Probe PCI driver: qat (8086:37c9) device: 
0000:b3:02.2 (socket 1) 00:05:07.243 EAL: PCI device 0000:b3:02.3 on NUMA socket 1 00:05:07.243 EAL: probe driver: 8086:37c9 qat 00:05:07.243 EAL: PCI memory mapped at 0x202001036000 00:05:07.243 EAL: PCI memory mapped at 0x202001037000 00:05:07.243 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:02.3 (socket 1) 00:05:07.243 EAL: PCI device 0000:b3:02.4 on NUMA socket 1 00:05:07.243 EAL: probe driver: 8086:37c9 qat 00:05:07.243 EAL: PCI memory mapped at 0x202001038000 00:05:07.243 EAL: PCI memory mapped at 0x202001039000 00:05:07.243 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:02.4 (socket 1) 00:05:07.243 EAL: PCI device 0000:b3:02.5 on NUMA socket 1 00:05:07.243 EAL: probe driver: 8086:37c9 qat 00:05:07.243 EAL: PCI memory mapped at 0x20200103a000 00:05:07.243 EAL: PCI memory mapped at 0x20200103b000 00:05:07.243 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:02.5 (socket 1) 00:05:07.243 EAL: PCI device 0000:b3:02.6 on NUMA socket 1 00:05:07.243 EAL: probe driver: 8086:37c9 qat 00:05:07.243 EAL: PCI memory mapped at 0x20200103c000 00:05:07.243 EAL: PCI memory mapped at 0x20200103d000 00:05:07.243 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:02.6 (socket 1) 00:05:07.243 EAL: PCI device 0000:b3:02.7 on NUMA socket 1 00:05:07.243 EAL: probe driver: 8086:37c9 qat 00:05:07.243 EAL: PCI memory mapped at 0x20200103e000 00:05:07.243 EAL: PCI memory mapped at 0x20200103f000 00:05:07.243 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:02.7 (socket 1) 00:05:07.243 EAL: PCI device 0000:b5:01.0 on NUMA socket 1 00:05:07.243 EAL: probe driver: 8086:37c9 qat 00:05:07.243 EAL: PCI memory mapped at 0x202001040000 00:05:07.243 EAL: PCI memory mapped at 0x202001041000 00:05:07.243 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:01.0 (socket 1) 00:05:07.243 EAL: PCI device 0000:b5:01.1 on NUMA socket 1 00:05:07.243 EAL: probe driver: 8086:37c9 qat 00:05:07.243 EAL: PCI memory mapped at 0x202001042000 00:05:07.243 EAL: PCI memory 
mapped at 0x202001043000 00:05:07.243 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:01.1 (socket 1) 00:05:07.243 EAL: PCI device 0000:b5:01.2 on NUMA socket 1 00:05:07.243 EAL: probe driver: 8086:37c9 qat 00:05:07.243 EAL: PCI memory mapped at 0x202001044000 00:05:07.243 EAL: PCI memory mapped at 0x202001045000 00:05:07.243 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:01.2 (socket 1) 00:05:07.243 EAL: PCI device 0000:b5:01.3 on NUMA socket 1 00:05:07.243 EAL: probe driver: 8086:37c9 qat 00:05:07.243 EAL: PCI memory mapped at 0x202001046000 00:05:07.243 EAL: PCI memory mapped at 0x202001047000 00:05:07.243 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:01.3 (socket 1) 00:05:07.243 EAL: PCI device 0000:b5:01.4 on NUMA socket 1 00:05:07.243 EAL: probe driver: 8086:37c9 qat 00:05:07.243 EAL: PCI memory mapped at 0x202001048000 00:05:07.243 EAL: PCI memory mapped at 0x202001049000 00:05:07.244 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:01.4 (socket 1) 00:05:07.244 EAL: PCI device 0000:b5:01.5 on NUMA socket 1 00:05:07.244 EAL: probe driver: 8086:37c9 qat 00:05:07.244 EAL: PCI memory mapped at 0x20200104a000 00:05:07.244 EAL: PCI memory mapped at 0x20200104b000 00:05:07.244 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:01.5 (socket 1) 00:05:07.244 EAL: PCI device 0000:b5:01.6 on NUMA socket 1 00:05:07.244 EAL: probe driver: 8086:37c9 qat 00:05:07.244 EAL: PCI memory mapped at 0x20200104c000 00:05:07.244 EAL: PCI memory mapped at 0x20200104d000 00:05:07.244 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:01.6 (socket 1) 00:05:07.244 EAL: PCI device 0000:b5:01.7 on NUMA socket 1 00:05:07.244 EAL: probe driver: 8086:37c9 qat 00:05:07.244 EAL: PCI memory mapped at 0x20200104e000 00:05:07.244 EAL: PCI memory mapped at 0x20200104f000 00:05:07.244 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:01.7 (socket 1) 00:05:07.244 EAL: PCI device 0000:b5:02.0 on NUMA socket 1 00:05:07.244 EAL: probe driver: 8086:37c9 qat 
00:05:07.244 EAL: PCI memory mapped at 0x202001050000 00:05:07.244 EAL: PCI memory mapped at 0x202001051000 00:05:07.244 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:02.0 (socket 1) 00:05:07.244 EAL: PCI device 0000:b5:02.1 on NUMA socket 1 00:05:07.244 EAL: probe driver: 8086:37c9 qat 00:05:07.244 EAL: PCI memory mapped at 0x202001052000 00:05:07.244 EAL: PCI memory mapped at 0x202001053000 00:05:07.244 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:02.1 (socket 1) 00:05:07.244 EAL: PCI device 0000:b5:02.2 on NUMA socket 1 00:05:07.244 EAL: probe driver: 8086:37c9 qat 00:05:07.244 EAL: PCI memory mapped at 0x202001054000 00:05:07.244 EAL: PCI memory mapped at 0x202001055000 00:05:07.244 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:02.2 (socket 1) 00:05:07.244 EAL: PCI device 0000:b5:02.3 on NUMA socket 1 00:05:07.244 EAL: probe driver: 8086:37c9 qat 00:05:07.244 EAL: PCI memory mapped at 0x202001056000 00:05:07.244 EAL: PCI memory mapped at 0x202001057000 00:05:07.244 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:02.3 (socket 1) 00:05:07.244 EAL: PCI device 0000:b5:02.4 on NUMA socket 1 00:05:07.244 EAL: probe driver: 8086:37c9 qat 00:05:07.244 EAL: PCI memory mapped at 0x202001058000 00:05:07.244 EAL: PCI memory mapped at 0x202001059000 00:05:07.244 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:02.4 (socket 1) 00:05:07.244 EAL: PCI device 0000:b5:02.5 on NUMA socket 1 00:05:07.244 EAL: probe driver: 8086:37c9 qat 00:05:07.244 EAL: PCI memory mapped at 0x20200105a000 00:05:07.244 EAL: PCI memory mapped at 0x20200105b000 00:05:07.244 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:02.5 (socket 1) 00:05:07.244 EAL: PCI device 0000:b5:02.6 on NUMA socket 1 00:05:07.244 EAL: probe driver: 8086:37c9 qat 00:05:07.244 EAL: PCI memory mapped at 0x20200105c000 00:05:07.244 EAL: PCI memory mapped at 0x20200105d000 00:05:07.244 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:02.6 (socket 1) 00:05:07.244 EAL: PCI 
device 0000:b5:02.7 on NUMA socket 1
00:05:07.244 EAL: probe driver: 8086:37c9 qat
00:05:07.244 EAL: PCI memory mapped at 0x20200105e000
00:05:07.244 EAL: PCI memory mapped at 0x20200105f000
00:05:07.244 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:02.7 (socket 1)
00:05:07.244 EAL: No shared files mode enabled, IPC is disabled
00:05:07.244 EAL: No shared files mode enabled, IPC is disabled
00:05:07.244 EAL: No PCI address specified using 'addr=' in: bus=pci
00:05:07.244 EAL: Mem event callback 'spdk:(nil)' registered
00:05:07.244
00:05:07.244
00:05:07.244 CUnit - A unit testing framework for C - Version 2.1-3
00:05:07.244 http://cunit.sourceforge.net/
00:05:07.244
00:05:07.244
00:05:07.244 Suite: components_suite
00:05:07.503 Test: vtophys_malloc_test ...passed
00:05:07.503 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy.
00:05:07.503 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:07.503 EAL: Restoring previous memory policy: 4
00:05:07.503 EAL: Calling mem event callback 'spdk:(nil)'
00:05:07.503 EAL: request: mp_malloc_sync
00:05:07.503 EAL: No shared files mode enabled, IPC is disabled
00:05:07.503 EAL: Heap on socket 0 was expanded by 4MB
00:05:07.503 EAL: Calling mem event callback 'spdk:(nil)'
00:05:07.503 EAL: request: mp_malloc_sync
00:05:07.503 EAL: No shared files mode enabled, IPC is disabled
00:05:07.503 EAL: Heap on socket 0 was shrunk by 4MB
00:05:07.503 EAL: Trying to obtain current memory policy.
00:05:07.503 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:07.503 EAL: Restoring previous memory policy: 4
00:05:07.503 EAL: Calling mem event callback 'spdk:(nil)'
00:05:07.503 EAL: request: mp_malloc_sync
00:05:07.503 EAL: No shared files mode enabled, IPC is disabled
00:05:07.503 EAL: Heap on socket 0 was expanded by 6MB
00:05:07.503 EAL: Calling mem event callback 'spdk:(nil)'
00:05:07.503 EAL: request: mp_malloc_sync
00:05:07.503 EAL: No shared files mode enabled, IPC is disabled
00:05:07.503 EAL: Heap on socket 0 was shrunk by 6MB
00:05:07.503 EAL: Trying to obtain current memory policy.
00:05:07.503 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:07.503 EAL: Restoring previous memory policy: 4
00:05:07.503 EAL: Calling mem event callback 'spdk:(nil)'
00:05:07.503 EAL: request: mp_malloc_sync
00:05:07.503 EAL: No shared files mode enabled, IPC is disabled
00:05:07.503 EAL: Heap on socket 0 was expanded by 10MB
00:05:07.503 EAL: Calling mem event callback 'spdk:(nil)'
00:05:07.503 EAL: request: mp_malloc_sync
00:05:07.503 EAL: No shared files mode enabled, IPC is disabled
00:05:07.503 EAL: Heap on socket 0 was shrunk by 10MB
00:05:07.762 EAL: Trying to obtain current memory policy.
00:05:07.762 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:07.762 EAL: Restoring previous memory policy: 4
00:05:07.762 EAL: Calling mem event callback 'spdk:(nil)'
00:05:07.762 EAL: request: mp_malloc_sync
00:05:07.763 EAL: No shared files mode enabled, IPC is disabled
00:05:07.763 EAL: Heap on socket 0 was expanded by 18MB
00:05:07.763 EAL: Calling mem event callback 'spdk:(nil)'
00:05:07.763 EAL: request: mp_malloc_sync
00:05:07.763 EAL: No shared files mode enabled, IPC is disabled
00:05:07.763 EAL: Heap on socket 0 was shrunk by 18MB
00:05:07.763 EAL: Trying to obtain current memory policy.
00:05:07.763 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:07.763 EAL: Restoring previous memory policy: 4
00:05:07.763 EAL: Calling mem event callback 'spdk:(nil)'
00:05:07.763 EAL: request: mp_malloc_sync
00:05:07.763 EAL: No shared files mode enabled, IPC is disabled
00:05:07.763 EAL: Heap on socket 0 was expanded by 34MB
00:05:07.763 EAL: Calling mem event callback 'spdk:(nil)'
00:05:07.763 EAL: request: mp_malloc_sync
00:05:07.763 EAL: No shared files mode enabled, IPC is disabled
00:05:07.763 EAL: Heap on socket 0 was shrunk by 34MB
00:05:07.763 EAL: Trying to obtain current memory policy.
00:05:07.763 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:07.763 EAL: Restoring previous memory policy: 4
00:05:07.763 EAL: Calling mem event callback 'spdk:(nil)'
00:05:07.763 EAL: request: mp_malloc_sync
00:05:07.763 EAL: No shared files mode enabled, IPC is disabled
00:05:07.763 EAL: Heap on socket 0 was expanded by 66MB
00:05:08.022 EAL: Calling mem event callback 'spdk:(nil)'
00:05:08.022 EAL: request: mp_malloc_sync
00:05:08.022 EAL: No shared files mode enabled, IPC is disabled
00:05:08.022 EAL: Heap on socket 0 was shrunk by 66MB
00:05:08.022 EAL: Trying to obtain current memory policy.
00:05:08.022 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:08.022 EAL: Restoring previous memory policy: 4
00:05:08.022 EAL: Calling mem event callback 'spdk:(nil)'
00:05:08.022 EAL: request: mp_malloc_sync
00:05:08.022 EAL: No shared files mode enabled, IPC is disabled
00:05:08.022 EAL: Heap on socket 0 was expanded by 130MB
00:05:08.282 EAL: Calling mem event callback 'spdk:(nil)'
00:05:08.282 EAL: request: mp_malloc_sync
00:05:08.282 EAL: No shared files mode enabled, IPC is disabled
00:05:08.282 EAL: Heap on socket 0 was shrunk by 130MB
00:05:08.541 EAL: Trying to obtain current memory policy.
00:05:08.541 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:08.541 EAL: Restoring previous memory policy: 4
00:05:08.541 EAL: Calling mem event callback 'spdk:(nil)'
00:05:08.541 EAL: request: mp_malloc_sync
00:05:08.541 EAL: No shared files mode enabled, IPC is disabled
00:05:08.541 EAL: Heap on socket 0 was expanded by 258MB
00:05:09.111 EAL: Calling mem event callback 'spdk:(nil)'
00:05:09.111 EAL: request: mp_malloc_sync
00:05:09.111 EAL: No shared files mode enabled, IPC is disabled
00:05:09.111 EAL: Heap on socket 0 was shrunk by 258MB
00:05:09.679 EAL: Trying to obtain current memory policy.
00:05:09.679 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:09.679 EAL: Restoring previous memory policy: 4
00:05:09.679 EAL: Calling mem event callback 'spdk:(nil)'
00:05:09.679 EAL: request: mp_malloc_sync
00:05:09.679 EAL: No shared files mode enabled, IPC is disabled
00:05:09.679 EAL: Heap on socket 0 was expanded by 514MB
00:05:11.056 EAL: Calling mem event callback 'spdk:(nil)'
00:05:11.056 EAL: request: mp_malloc_sync
00:05:11.056 EAL: No shared files mode enabled, IPC is disabled
00:05:11.056 EAL: Heap on socket 0 was shrunk by 514MB
00:05:11.661 EAL: Trying to obtain current memory policy.
00:05:11.662 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:11.920 EAL: Restoring previous memory policy: 4
00:05:11.920 EAL: Calling mem event callback 'spdk:(nil)'
00:05:11.920 EAL: request: mp_malloc_sync
00:05:11.920 EAL: No shared files mode enabled, IPC is disabled
00:05:11.920 EAL: Heap on socket 0 was expanded by 1026MB
00:05:14.457 EAL: Calling mem event callback 'spdk:(nil)'
00:05:14.457 EAL: request: mp_malloc_sync
00:05:14.457 EAL: No shared files mode enabled, IPC is disabled
00:05:14.457 EAL: Heap on socket 0 was shrunk by 1026MB
00:05:15.831 passed
00:05:15.831
00:05:15.831 Run Summary:    Type  Total    Ran Passed Failed Inactive
00:05:15.831               suites      1      1    n/a      0        0
00:05:15.831                tests      2      2      2      0        0
00:05:15.831              asserts   6727   6727   6727      0      n/a
00:05:15.831
00:05:15.831 Elapsed time =    8.369 seconds
00:05:15.831 EAL: No shared files mode enabled, IPC is disabled
00:05:15.831 EAL: No shared files mode enabled, IPC is disabled
00:05:15.831 EAL: No shared files mode enabled, IPC is disabled
00:05:15.831
00:05:15.831 real	0m8.657s
00:05:15.831 user	0m7.763s
00:05:15.831 sys	0m0.820s
00:05:15.831 08:18:28 env.env_vtophys -- common/autotest_common.sh@1124 -- # xtrace_disable
00:05:15.831 08:18:28 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x
00:05:15.831 ************************************
00:05:15.831 END TEST env_vtophys
00:05:15.831 ************************************
00:05:15.831 08:18:28 env -- common/autotest_common.sh@1142 -- # return 0
00:05:15.831 08:18:28 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut
00:05:15.831 08:18:28 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:05:15.831 08:18:28 env -- common/autotest_common.sh@1105 -- # xtrace_disable
00:05:15.831 08:18:28 env -- common/autotest_common.sh@10 -- # set +x
00:05:15.831 ************************************
00:05:15.831 START TEST env_pci
00:05:15.831 ************************************
00:05:15.831 08:18:28 
env.env_pci -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut
00:05:15.831
00:05:15.831
00:05:15.831 CUnit - A unit testing framework for C - Version 2.1-3
00:05:15.831 http://cunit.sourceforge.net/
00:05:15.831
00:05:15.831
00:05:15.831 Suite: pci
00:05:15.831 Test: pci_hook ...[2024-07-23 08:18:28.224291] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/pci.c:1040:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 1322810 has claimed it
00:05:15.831 EAL: Cannot find device (10000:00:01.0)
00:05:15.831 EAL: Failed to attach device on primary process
00:05:15.831 passed
00:05:15.831
00:05:15.831 Run Summary:    Type  Total    Ran Passed Failed Inactive
00:05:15.831               suites      1      1    n/a      0        0
00:05:15.831                tests      1      1      1      0        0
00:05:15.831              asserts     25     25     25      0      n/a
00:05:15.831
00:05:15.831 Elapsed time =    0.040 seconds
00:05:15.831
00:05:15.831 real	0m0.092s
00:05:15.831 user	0m0.037s
00:05:15.831 sys	0m0.055s
08:18:28 env.env_pci -- common/autotest_common.sh@1124 -- # xtrace_disable
00:05:15.831 08:18:28 env.env_pci -- common/autotest_common.sh@10 -- # set +x
00:05:15.831 ************************************
00:05:15.831 END TEST env_pci
00:05:15.831 ************************************
00:05:15.831 08:18:28 env -- common/autotest_common.sh@1142 -- # return 0
00:05:15.831 08:18:28 env -- env/env.sh@14 -- # argv='-c 0x1 '
00:05:15.831 08:18:28 env -- env/env.sh@15 -- # uname
00:05:15.831 08:18:28 env -- env/env.sh@15 -- # '[' Linux = Linux ']'
00:05:15.831 08:18:28 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000
00:05:15.831 08:18:28 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000
00:05:15.831 08:18:28 env -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']'
00:05:15.831 08:18:28 env -- 
common/autotest_common.sh@1105 -- # xtrace_disable
00:05:15.831 08:18:28 env -- common/autotest_common.sh@10 -- # set +x
00:05:16.091 ************************************
00:05:16.091 START TEST env_dpdk_post_init
00:05:16.091 ************************************
00:05:16.091 08:18:28 env.env_dpdk_post_init -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000
00:05:16.091 EAL: Detected CPU lcores: 96
00:05:16.091 EAL: Detected NUMA nodes: 2
00:05:16.091 EAL: Detected shared linkage of DPDK
00:05:16.091 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket
00:05:16.091 EAL: Selected IOVA mode 'PA'
00:05:16.091 EAL: VFIO support initialized
00:05:16.091 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:01.0 (socket 1)
00:05:16.091 CRYPTODEV: Creating cryptodev 0000:b1:01.0_qat_asym
00:05:16.091 CRYPTODEV: Initialisation parameters - name: 0000:b1:01.0_qat_asym,socket id: 1, max queue pairs: 0
00:05:16.091 CRYPTODEV: Creating cryptodev 0000:b1:01.0_qat_sym
00:05:16.091 CRYPTODEV: Initialisation parameters - name: 0000:b1:01.0_qat_sym,socket id: 1, max queue pairs: 0
00:05:16.091 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:01.1 (socket 1)
00:05:16.091 CRYPTODEV: Creating cryptodev 0000:b1:01.1_qat_asym
00:05:16.091 CRYPTODEV: Initialisation parameters - name: 0000:b1:01.1_qat_asym,socket id: 1, max queue pairs: 0
00:05:16.091 CRYPTODEV: Creating cryptodev 0000:b1:01.1_qat_sym
00:05:16.091 CRYPTODEV: Initialisation parameters - name: 0000:b1:01.1_qat_sym,socket id: 1, max queue pairs: 0
00:05:16.091 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:01.2 (socket 1)
00:05:16.091 CRYPTODEV: Creating cryptodev 0000:b1:01.2_qat_asym
00:05:16.091 CRYPTODEV: Initialisation parameters - name: 0000:b1:01.2_qat_asym,socket id: 1, max queue pairs: 0
00:05:16.091 CRYPTODEV: Creating cryptodev 0000:b1:01.2_qat_sym
00:05:16.091 CRYPTODEV: 
Initialisation parameters - name: 0000:b1:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.091 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:01.3 (socket 1) 00:05:16.091 CRYPTODEV: Creating cryptodev 0000:b1:01.3_qat_asym 00:05:16.091 CRYPTODEV: Initialisation parameters - name: 0000:b1:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.091 CRYPTODEV: Creating cryptodev 0000:b1:01.3_qat_sym 00:05:16.091 CRYPTODEV: Initialisation parameters - name: 0000:b1:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.091 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:01.4 (socket 1) 00:05:16.091 CRYPTODEV: Creating cryptodev 0000:b1:01.4_qat_asym 00:05:16.091 CRYPTODEV: Initialisation parameters - name: 0000:b1:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.091 CRYPTODEV: Creating cryptodev 0000:b1:01.4_qat_sym 00:05:16.091 CRYPTODEV: Initialisation parameters - name: 0000:b1:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.091 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:01.5 (socket 1) 00:05:16.091 CRYPTODEV: Creating cryptodev 0000:b1:01.5_qat_asym 00:05:16.091 CRYPTODEV: Initialisation parameters - name: 0000:b1:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.091 CRYPTODEV: Creating cryptodev 0000:b1:01.5_qat_sym 00:05:16.091 CRYPTODEV: Initialisation parameters - name: 0000:b1:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.091 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:01.6 (socket 1) 00:05:16.091 CRYPTODEV: Creating cryptodev 0000:b1:01.6_qat_asym 00:05:16.091 CRYPTODEV: Initialisation parameters - name: 0000:b1:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.091 CRYPTODEV: Creating cryptodev 0000:b1:01.6_qat_sym 00:05:16.091 CRYPTODEV: Initialisation parameters - name: 0000:b1:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.091 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:01.7 (socket 1) 00:05:16.091 CRYPTODEV: Creating cryptodev 0000:b1:01.7_qat_asym 
00:05:16.091 CRYPTODEV: Initialisation parameters - name: 0000:b1:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.091 CRYPTODEV: Creating cryptodev 0000:b1:01.7_qat_sym 00:05:16.091 CRYPTODEV: Initialisation parameters - name: 0000:b1:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.091 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:02.0 (socket 1) 00:05:16.091 CRYPTODEV: Creating cryptodev 0000:b1:02.0_qat_asym 00:05:16.091 CRYPTODEV: Initialisation parameters - name: 0000:b1:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.091 CRYPTODEV: Creating cryptodev 0000:b1:02.0_qat_sym 00:05:16.091 CRYPTODEV: Initialisation parameters - name: 0000:b1:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.091 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:02.1 (socket 1) 00:05:16.091 CRYPTODEV: Creating cryptodev 0000:b1:02.1_qat_asym 00:05:16.091 CRYPTODEV: Initialisation parameters - name: 0000:b1:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.091 CRYPTODEV: Creating cryptodev 0000:b1:02.1_qat_sym 00:05:16.091 CRYPTODEV: Initialisation parameters - name: 0000:b1:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.091 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:02.2 (socket 1) 00:05:16.091 CRYPTODEV: Creating cryptodev 0000:b1:02.2_qat_asym 00:05:16.091 CRYPTODEV: Initialisation parameters - name: 0000:b1:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.091 CRYPTODEV: Creating cryptodev 0000:b1:02.2_qat_sym 00:05:16.091 CRYPTODEV: Initialisation parameters - name: 0000:b1:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.091 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:02.3 (socket 1) 00:05:16.091 CRYPTODEV: Creating cryptodev 0000:b1:02.3_qat_asym 00:05:16.091 CRYPTODEV: Initialisation parameters - name: 0000:b1:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.091 CRYPTODEV: Creating cryptodev 0000:b1:02.3_qat_sym 00:05:16.091 CRYPTODEV: Initialisation parameters - name: 
0000:b1:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.091 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:02.4 (socket 1) 00:05:16.091 CRYPTODEV: Creating cryptodev 0000:b1:02.4_qat_asym 00:05:16.091 CRYPTODEV: Initialisation parameters - name: 0000:b1:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.091 CRYPTODEV: Creating cryptodev 0000:b1:02.4_qat_sym 00:05:16.091 CRYPTODEV: Initialisation parameters - name: 0000:b1:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.091 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:02.5 (socket 1) 00:05:16.091 CRYPTODEV: Creating cryptodev 0000:b1:02.5_qat_asym 00:05:16.091 CRYPTODEV: Initialisation parameters - name: 0000:b1:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.091 CRYPTODEV: Creating cryptodev 0000:b1:02.5_qat_sym 00:05:16.091 CRYPTODEV: Initialisation parameters - name: 0000:b1:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.091 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:02.6 (socket 1) 00:05:16.091 CRYPTODEV: Creating cryptodev 0000:b1:02.6_qat_asym 00:05:16.091 CRYPTODEV: Initialisation parameters - name: 0000:b1:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.091 CRYPTODEV: Creating cryptodev 0000:b1:02.6_qat_sym 00:05:16.091 CRYPTODEV: Initialisation parameters - name: 0000:b1:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.091 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:02.7 (socket 1) 00:05:16.091 CRYPTODEV: Creating cryptodev 0000:b1:02.7_qat_asym 00:05:16.091 CRYPTODEV: Initialisation parameters - name: 0000:b1:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.091 CRYPTODEV: Creating cryptodev 0000:b1:02.7_qat_sym 00:05:16.091 CRYPTODEV: Initialisation parameters - name: 0000:b1:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.091 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:01.0 (socket 1) 00:05:16.091 CRYPTODEV: Creating cryptodev 0000:b3:01.0_qat_asym 00:05:16.091 CRYPTODEV: Initialisation 
parameters - name: 0000:b3:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.091 CRYPTODEV: Creating cryptodev 0000:b3:01.0_qat_sym 00:05:16.091 CRYPTODEV: Initialisation parameters - name: 0000:b3:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.092 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:01.1 (socket 1) 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b3:01.1_qat_asym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b3:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b3:01.1_qat_sym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b3:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.092 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:01.2 (socket 1) 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b3:01.2_qat_asym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b3:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b3:01.2_qat_sym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b3:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.092 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:01.3 (socket 1) 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b3:01.3_qat_asym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b3:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b3:01.3_qat_sym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b3:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.092 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:01.4 (socket 1) 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b3:01.4_qat_asym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b3:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b3:01.4_qat_sym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b3:01.4_qat_sym,socket id: 1, max queue pairs: 
0 00:05:16.092 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:01.5 (socket 1) 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b3:01.5_qat_asym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b3:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b3:01.5_qat_sym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b3:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.092 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:01.6 (socket 1) 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b3:01.6_qat_asym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b3:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b3:01.6_qat_sym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b3:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.092 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:01.7 (socket 1) 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b3:01.7_qat_asym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b3:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b3:01.7_qat_sym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b3:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.092 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:02.0 (socket 1) 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b3:02.0_qat_asym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b3:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b3:02.0_qat_sym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b3:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.092 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:02.1 (socket 1) 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b3:02.1_qat_asym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b3:02.1_qat_asym,socket id: 1, 
max queue pairs: 0 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b3:02.1_qat_sym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b3:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.092 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:02.2 (socket 1) 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b3:02.2_qat_asym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b3:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b3:02.2_qat_sym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b3:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.092 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:02.3 (socket 1) 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b3:02.3_qat_asym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b3:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b3:02.3_qat_sym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b3:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.092 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:02.4 (socket 1) 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b3:02.4_qat_asym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b3:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b3:02.4_qat_sym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b3:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.092 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:02.5 (socket 1) 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b3:02.5_qat_asym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b3:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b3:02.5_qat_sym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b3:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.092 EAL: Probe PCI driver: qat (8086:37c9) 
device: 0000:b3:02.6 (socket 1) 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b3:02.6_qat_asym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b3:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b3:02.6_qat_sym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b3:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.092 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:02.7 (socket 1) 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b3:02.7_qat_asym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b3:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b3:02.7_qat_sym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b3:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.092 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:01.0 (socket 1) 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b5:01.0_qat_asym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b5:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b5:01.0_qat_sym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b5:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.092 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:01.1 (socket 1) 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b5:01.1_qat_asym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b5:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b5:01.1_qat_sym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b5:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.092 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:01.2 (socket 1) 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b5:01.2_qat_asym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b5:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.092 CRYPTODEV: Creating 
cryptodev 0000:b5:01.2_qat_sym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b5:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.092 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:01.3 (socket 1) 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b5:01.3_qat_asym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b5:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b5:01.3_qat_sym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b5:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.092 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:01.4 (socket 1) 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b5:01.4_qat_asym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b5:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b5:01.4_qat_sym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b5:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.092 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:01.5 (socket 1) 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b5:01.5_qat_asym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b5:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b5:01.5_qat_sym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b5:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.092 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:01.6 (socket 1) 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b5:01.6_qat_asym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b5:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b5:01.6_qat_sym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b5:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.092 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:01.7 (socket 1) 00:05:16.092 
CRYPTODEV: Creating cryptodev 0000:b5:01.7_qat_asym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b5:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b5:01.7_qat_sym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b5:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.092 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:02.0 (socket 1) 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b5:02.0_qat_asym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b5:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b5:02.0_qat_sym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b5:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.092 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:02.1 (socket 1) 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b5:02.1_qat_asym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b5:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b5:02.1_qat_sym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b5:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.092 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:02.2 (socket 1) 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b5:02.2_qat_asym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b5:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.092 CRYPTODEV: Creating cryptodev 0000:b5:02.2_qat_sym 00:05:16.092 CRYPTODEV: Initialisation parameters - name: 0000:b5:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.092 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:02.3 (socket 1) 00:05:16.093 CRYPTODEV: Creating cryptodev 0000:b5:02.3_qat_asym 00:05:16.093 CRYPTODEV: Initialisation parameters - name: 0000:b5:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.093 CRYPTODEV: Creating cryptodev 0000:b5:02.3_qat_sym 00:05:16.093 
CRYPTODEV: Initialisation parameters - name: 0000:b5:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.093 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:02.4 (socket 1) 00:05:16.093 CRYPTODEV: Creating cryptodev 0000:b5:02.4_qat_asym 00:05:16.093 CRYPTODEV: Initialisation parameters - name: 0000:b5:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.093 CRYPTODEV: Creating cryptodev 0000:b5:02.4_qat_sym 00:05:16.093 CRYPTODEV: Initialisation parameters - name: 0000:b5:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.093 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:02.5 (socket 1) 00:05:16.093 CRYPTODEV: Creating cryptodev 0000:b5:02.5_qat_asym 00:05:16.093 CRYPTODEV: Initialisation parameters - name: 0000:b5:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.093 CRYPTODEV: Creating cryptodev 0000:b5:02.5_qat_sym 00:05:16.093 CRYPTODEV: Initialisation parameters - name: 0000:b5:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.093 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:02.6 (socket 1) 00:05:16.093 CRYPTODEV: Creating cryptodev 0000:b5:02.6_qat_asym 00:05:16.093 CRYPTODEV: Initialisation parameters - name: 0000:b5:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.093 CRYPTODEV: Creating cryptodev 0000:b5:02.6_qat_sym 00:05:16.093 CRYPTODEV: Initialisation parameters - name: 0000:b5:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.093 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:02.7 (socket 1) 00:05:16.093 CRYPTODEV: Creating cryptodev 0000:b5:02.7_qat_asym 00:05:16.093 CRYPTODEV: Initialisation parameters - name: 0000:b5:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:16.093 CRYPTODEV: Creating cryptodev 0000:b5:02.7_qat_sym 00:05:16.093 CRYPTODEV: Initialisation parameters - name: 0000:b5:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:16.093 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:16.093 EAL: Using IOMMU type 1 (Type 1) 00:05:16.093 EAL: Ignore 
mapping IO port bar(1) 00:05:16.093 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.0 (socket 0) 00:05:16.352 EAL: Ignore mapping IO port bar(1) 00:05:16.352 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.1 (socket 0) 00:05:16.352 EAL: Ignore mapping IO port bar(1) 00:05:16.352 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.2 (socket 0) 00:05:16.352 EAL: Ignore mapping IO port bar(1) 00:05:16.352 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.3 (socket 0) 00:05:16.352 EAL: Ignore mapping IO port bar(1) 00:05:16.352 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.4 (socket 0) 00:05:16.352 EAL: Ignore mapping IO port bar(1) 00:05:16.352 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.5 (socket 0) 00:05:16.352 EAL: Ignore mapping IO port bar(1) 00:05:16.352 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.6 (socket 0) 00:05:16.352 EAL: Ignore mapping IO port bar(1) 00:05:16.352 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.7 (socket 0) 00:05:16.917 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:60:00.0 (socket 0) 00:05:17.176 EAL: Ignore mapping IO port bar(1) 00:05:17.176 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.0 (socket 1) 00:05:17.176 EAL: Ignore mapping IO port bar(1) 00:05:17.176 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.1 (socket 1) 00:05:17.176 EAL: Ignore mapping IO port bar(1) 00:05:17.176 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.2 (socket 1) 00:05:17.176 EAL: Ignore mapping IO port bar(1) 00:05:17.176 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.3 (socket 1) 00:05:17.176 EAL: Ignore mapping IO port bar(1) 00:05:17.176 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.4 (socket 1) 00:05:17.176 EAL: Ignore mapping IO port bar(1) 00:05:17.176 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.5 (socket 1) 
00:05:17.176 EAL: Ignore mapping IO port bar(1) 00:05:17.176 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.6 (socket 1) 00:05:17.176 EAL: Ignore mapping IO port bar(1) 00:05:17.176 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.7 (socket 1) 00:05:22.438 EAL: Releasing PCI mapped resource for 0000:60:00.0 00:05:22.438 EAL: Calling pci_unmap_resource for 0000:60:00.0 at 0x202001080000 00:05:22.696 Starting DPDK initialization... 00:05:22.696 Starting SPDK post initialization... 00:05:22.696 SPDK NVMe probe 00:05:22.696 Attaching to 0000:60:00.0 00:05:22.696 Attached to 0000:60:00.0 00:05:22.696 Cleaning up... 00:05:22.696 00:05:22.696 real 0m6.848s 00:05:22.696 user 0m5.608s 00:05:22.696 sys 0m0.298s 00:05:22.696 08:18:35 env.env_dpdk_post_init -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:22.696 08:18:35 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:05:22.696 ************************************ 00:05:22.696 END TEST env_dpdk_post_init 00:05:22.696 ************************************ 00:05:22.955 08:18:35 env -- common/autotest_common.sh@1142 -- # return 0 00:05:22.955 08:18:35 env -- env/env.sh@26 -- # uname 00:05:22.955 08:18:35 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:05:22.955 08:18:35 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:22.955 08:18:35 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:22.955 08:18:35 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:22.955 08:18:35 env -- common/autotest_common.sh@10 -- # set +x 00:05:22.955 ************************************ 00:05:22.955 START TEST env_mem_callbacks 00:05:22.955 ************************************ 00:05:22.955 08:18:35 env.env_mem_callbacks -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:22.955 EAL: Detected 
CPU lcores: 96 00:05:22.955 EAL: Detected NUMA nodes: 2 00:05:22.955 EAL: Detected shared linkage of DPDK 00:05:22.955 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:22.955 EAL: Selected IOVA mode 'PA' 00:05:22.955 EAL: VFIO support initialized 00:05:22.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:01.0 (socket 1) 00:05:22.955 CRYPTODEV: Creating cryptodev 0000:b1:01.0_qat_asym 00:05:22.955 CRYPTODEV: Initialisation parameters - name: 0000:b1:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.955 CRYPTODEV: Creating cryptodev 0000:b1:01.0_qat_sym 00:05:22.955 CRYPTODEV: Initialisation parameters - name: 0000:b1:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:01.1 (socket 1) 00:05:22.955 CRYPTODEV: Creating cryptodev 0000:b1:01.1_qat_asym 00:05:22.955 CRYPTODEV: Initialisation parameters - name: 0000:b1:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.955 CRYPTODEV: Creating cryptodev 0000:b1:01.1_qat_sym 00:05:22.955 CRYPTODEV: Initialisation parameters - name: 0000:b1:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:01.2 (socket 1) 00:05:22.955 CRYPTODEV: Creating cryptodev 0000:b1:01.2_qat_asym 00:05:22.955 CRYPTODEV: Initialisation parameters - name: 0000:b1:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.955 CRYPTODEV: Creating cryptodev 0000:b1:01.2_qat_sym 00:05:22.955 CRYPTODEV: Initialisation parameters - name: 0000:b1:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:01.3 (socket 1) 00:05:22.955 CRYPTODEV: Creating cryptodev 0000:b1:01.3_qat_asym 00:05:22.955 CRYPTODEV: Initialisation parameters - name: 0000:b1:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.955 CRYPTODEV: Creating cryptodev 0000:b1:01.3_qat_sym 00:05:22.955 CRYPTODEV: Initialisation parameters - name: 0000:b1:01.3_qat_sym,socket id: 1, max 
queue pairs: 0 00:05:22.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:01.4 (socket 1) 00:05:22.955 CRYPTODEV: Creating cryptodev 0000:b1:01.4_qat_asym 00:05:22.955 CRYPTODEV: Initialisation parameters - name: 0000:b1:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.955 CRYPTODEV: Creating cryptodev 0000:b1:01.4_qat_sym 00:05:22.955 CRYPTODEV: Initialisation parameters - name: 0000:b1:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:01.5 (socket 1) 00:05:22.955 CRYPTODEV: Creating cryptodev 0000:b1:01.5_qat_asym 00:05:22.955 CRYPTODEV: Initialisation parameters - name: 0000:b1:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.955 CRYPTODEV: Creating cryptodev 0000:b1:01.5_qat_sym 00:05:22.955 CRYPTODEV: Initialisation parameters - name: 0000:b1:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:01.6 (socket 1) 00:05:22.955 CRYPTODEV: Creating cryptodev 0000:b1:01.6_qat_asym 00:05:22.955 CRYPTODEV: Initialisation parameters - name: 0000:b1:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.955 CRYPTODEV: Creating cryptodev 0000:b1:01.6_qat_sym 00:05:22.955 CRYPTODEV: Initialisation parameters - name: 0000:b1:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:01.7 (socket 1) 00:05:22.955 CRYPTODEV: Creating cryptodev 0000:b1:01.7_qat_asym 00:05:22.955 CRYPTODEV: Initialisation parameters - name: 0000:b1:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.955 CRYPTODEV: Creating cryptodev 0000:b1:01.7_qat_sym 00:05:22.955 CRYPTODEV: Initialisation parameters - name: 0000:b1:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:02.0 (socket 1) 00:05:22.955 CRYPTODEV: Creating cryptodev 0000:b1:02.0_qat_asym 00:05:22.955 CRYPTODEV: Initialisation parameters - name: 
0000:b1:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.955 CRYPTODEV: Creating cryptodev 0000:b1:02.0_qat_sym 00:05:22.955 CRYPTODEV: Initialisation parameters - name: 0000:b1:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:02.1 (socket 1) 00:05:22.955 CRYPTODEV: Creating cryptodev 0000:b1:02.1_qat_asym 00:05:22.955 CRYPTODEV: Initialisation parameters - name: 0000:b1:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.955 CRYPTODEV: Creating cryptodev 0000:b1:02.1_qat_sym 00:05:22.955 CRYPTODEV: Initialisation parameters - name: 0000:b1:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:02.2 (socket 1) 00:05:22.955 CRYPTODEV: Creating cryptodev 0000:b1:02.2_qat_asym 00:05:22.955 CRYPTODEV: Initialisation parameters - name: 0000:b1:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.955 CRYPTODEV: Creating cryptodev 0000:b1:02.2_qat_sym 00:05:22.955 CRYPTODEV: Initialisation parameters - name: 0000:b1:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:02.3 (socket 1) 00:05:22.955 CRYPTODEV: Creating cryptodev 0000:b1:02.3_qat_asym 00:05:22.955 CRYPTODEV: Initialisation parameters - name: 0000:b1:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.955 CRYPTODEV: Creating cryptodev 0000:b1:02.3_qat_sym 00:05:22.955 CRYPTODEV: Initialisation parameters - name: 0000:b1:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:02.4 (socket 1) 00:05:22.955 CRYPTODEV: Creating cryptodev 0000:b1:02.4_qat_asym 00:05:22.955 CRYPTODEV: Initialisation parameters - name: 0000:b1:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.955 CRYPTODEV: Creating cryptodev 0000:b1:02.4_qat_sym 00:05:22.955 CRYPTODEV: Initialisation parameters - name: 0000:b1:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.955 
EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:02.5 (socket 1) 00:05:22.955 CRYPTODEV: Creating cryptodev 0000:b1:02.5_qat_asym 00:05:22.955 CRYPTODEV: Initialisation parameters - name: 0000:b1:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.955 CRYPTODEV: Creating cryptodev 0000:b1:02.5_qat_sym 00:05:22.955 CRYPTODEV: Initialisation parameters - name: 0000:b1:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:02.6 (socket 1) 00:05:22.955 CRYPTODEV: Creating cryptodev 0000:b1:02.6_qat_asym 00:05:22.955 CRYPTODEV: Initialisation parameters - name: 0000:b1:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.955 CRYPTODEV: Creating cryptodev 0000:b1:02.6_qat_sym 00:05:22.955 CRYPTODEV: Initialisation parameters - name: 0000:b1:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b1:02.7 (socket 1) 00:05:22.955 CRYPTODEV: Creating cryptodev 0000:b1:02.7_qat_asym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b1:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b1:02.7_qat_sym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b1:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:01.0 (socket 1) 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b3:01.0_qat_asym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b3:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b3:01.0_qat_sym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b3:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:01.1 (socket 1) 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b3:01.1_qat_asym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b3:01.1_qat_asym,socket id: 1, max queue pairs: 
0 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b3:01.1_qat_sym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b3:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:01.2 (socket 1) 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b3:01.2_qat_asym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b3:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b3:01.2_qat_sym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b3:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:01.3 (socket 1) 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b3:01.3_qat_asym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b3:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b3:01.3_qat_sym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b3:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:01.4 (socket 1) 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b3:01.4_qat_asym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b3:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b3:01.4_qat_sym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b3:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:01.5 (socket 1) 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b3:01.5_qat_asym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b3:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b3:01.5_qat_sym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b3:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.956 EAL: Probe PCI driver: qat (8086:37c9) device: 
0000:b3:01.6 (socket 1) 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b3:01.6_qat_asym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b3:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b3:01.6_qat_sym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b3:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:01.7 (socket 1) 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b3:01.7_qat_asym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b3:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b3:01.7_qat_sym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b3:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:02.0 (socket 1) 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b3:02.0_qat_asym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b3:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b3:02.0_qat_sym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b3:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:02.1 (socket 1) 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b3:02.1_qat_asym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b3:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b3:02.1_qat_sym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b3:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:02.2 (socket 1) 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b3:02.2_qat_asym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b3:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.956 CRYPTODEV: Creating cryptodev 
0000:b3:02.2_qat_sym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b3:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:02.3 (socket 1) 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b3:02.3_qat_asym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b3:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b3:02.3_qat_sym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b3:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:02.4 (socket 1) 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b3:02.4_qat_asym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b3:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b3:02.4_qat_sym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b3:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:02.5 (socket 1) 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b3:02.5_qat_asym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b3:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b3:02.5_qat_sym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b3:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:02.6 (socket 1) 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b3:02.6_qat_asym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b3:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b3:02.6_qat_sym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b3:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b3:02.7 (socket 1) 00:05:22.956 CRYPTODEV: 
Creating cryptodev 0000:b3:02.7_qat_asym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b3:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b3:02.7_qat_sym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b3:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:01.0 (socket 1) 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b5:01.0_qat_asym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b5:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b5:01.0_qat_sym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b5:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:01.1 (socket 1) 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b5:01.1_qat_asym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b5:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b5:01.1_qat_sym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b5:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:01.2 (socket 1) 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b5:01.2_qat_asym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b5:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b5:01.2_qat_sym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b5:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:01.3 (socket 1) 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b5:01.3_qat_asym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b5:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b5:01.3_qat_sym 00:05:22.956 CRYPTODEV: 
Initialisation parameters - name: 0000:b5:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:01.4 (socket 1) 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b5:01.4_qat_asym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b5:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b5:01.4_qat_sym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b5:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:01.5 (socket 1) 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b5:01.5_qat_asym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b5:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b5:01.5_qat_sym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b5:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:01.6 (socket 1) 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b5:01.6_qat_asym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b5:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b5:01.6_qat_sym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b5:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:01.7 (socket 1) 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b5:01.7_qat_asym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b5:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b5:01.7_qat_sym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b5:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:02.0 (socket 1) 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b5:02.0_qat_asym 
00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b5:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b5:02.0_qat_sym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b5:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:02.1 (socket 1) 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b5:02.1_qat_asym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b5:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b5:02.1_qat_sym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b5:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:02.2 (socket 1) 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b5:02.2_qat_asym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b5:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b5:02.2_qat_sym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b5:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:02.3 (socket 1) 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b5:02.3_qat_asym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b5:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b5:02.3_qat_sym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b5:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:02.4 (socket 1) 00:05:22.956 CRYPTODEV: Creating cryptodev 0000:b5:02.4_qat_asym 00:05:22.956 CRYPTODEV: Initialisation parameters - name: 0000:b5:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.957 CRYPTODEV: Creating cryptodev 0000:b5:02.4_qat_sym 00:05:22.957 CRYPTODEV: Initialisation parameters - name: 
0000:b5:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.957 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:02.5 (socket 1) 00:05:22.957 CRYPTODEV: Creating cryptodev 0000:b5:02.5_qat_asym 00:05:22.957 CRYPTODEV: Initialisation parameters - name: 0000:b5:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.957 CRYPTODEV: Creating cryptodev 0000:b5:02.5_qat_sym 00:05:22.957 CRYPTODEV: Initialisation parameters - name: 0000:b5:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.957 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:02.6 (socket 1) 00:05:22.957 CRYPTODEV: Creating cryptodev 0000:b5:02.6_qat_asym 00:05:22.957 CRYPTODEV: Initialisation parameters - name: 0000:b5:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.957 CRYPTODEV: Creating cryptodev 0000:b5:02.6_qat_sym 00:05:22.957 CRYPTODEV: Initialisation parameters - name: 0000:b5:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.957 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:b5:02.7 (socket 1) 00:05:22.957 CRYPTODEV: Creating cryptodev 0000:b5:02.7_qat_asym 00:05:22.957 CRYPTODEV: Initialisation parameters - name: 0000:b5:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:22.957 CRYPTODEV: Creating cryptodev 0000:b5:02.7_qat_sym 00:05:22.957 CRYPTODEV: Initialisation parameters - name: 0000:b5:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:22.957 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:22.957 00:05:22.957 00:05:22.957 CUnit - A unit testing framework for C - Version 2.1-3 00:05:22.957 http://cunit.sourceforge.net/ 00:05:22.957 00:05:22.957 00:05:22.957 Suite: memory 00:05:22.957 Test: test ... 
00:05:22.957 register 0x200000200000 2097152 00:05:22.957 register 0x201000a00000 2097152 00:05:22.957 malloc 3145728 00:05:22.957 register 0x200000400000 4194304 00:05:22.957 buf 0x2000004fffc0 len 3145728 PASSED 00:05:22.957 malloc 64 00:05:22.957 buf 0x2000004ffec0 len 64 PASSED 00:05:22.957 malloc 4194304 00:05:22.957 register 0x200000800000 6291456 00:05:22.957 buf 0x2000009fffc0 len 4194304 PASSED 00:05:22.957 free 0x2000004fffc0 3145728 00:05:22.957 free 0x2000004ffec0 64 00:05:22.957 unregister 0x200000400000 4194304 PASSED 00:05:22.957 free 0x2000009fffc0 4194304 00:05:22.957 unregister 0x200000800000 6291456 PASSED 00:05:22.957 malloc 8388608 00:05:22.957 register 0x200000400000 10485760 00:05:22.957 buf 0x2000005fffc0 len 8388608 PASSED 00:05:22.957 free 0x2000005fffc0 8388608 00:05:22.957 unregister 0x200000400000 10485760 PASSED 00:05:22.957 passed 00:05:22.957 00:05:22.957 Run Summary: Type Total Ran Passed Failed Inactive 00:05:22.957 suites 1 1 n/a 0 0 00:05:22.957 tests 1 1 1 0 0 00:05:22.957 asserts 16 16 16 0 n/a 00:05:22.957 00:05:22.957 Elapsed time = 0.066 seconds 00:05:23.214 00:05:23.214 real 0m0.194s 00:05:23.214 user 0m0.099s 00:05:23.214 sys 0m0.094s 00:05:23.214 08:18:35 env.env_mem_callbacks -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:23.214 08:18:35 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:05:23.214 ************************************ 00:05:23.214 END TEST env_mem_callbacks 00:05:23.214 ************************************ 00:05:23.214 08:18:35 env -- common/autotest_common.sh@1142 -- # return 0 00:05:23.214 00:05:23.214 real 0m16.471s 00:05:23.214 user 0m13.912s 00:05:23.214 sys 0m1.569s 00:05:23.214 08:18:35 env -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:23.214 08:18:35 env -- common/autotest_common.sh@10 -- # set +x 00:05:23.214 ************************************ 00:05:23.214 END TEST env 00:05:23.214 ************************************ 00:05:23.214 08:18:35 -- 
common/autotest_common.sh@1142 -- # return 0 00:05:23.214 08:18:35 -- spdk/autotest.sh@169 -- # run_test rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:05:23.214 08:18:35 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:23.214 08:18:35 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:23.214 08:18:35 -- common/autotest_common.sh@10 -- # set +x 00:05:23.214 ************************************ 00:05:23.214 START TEST rpc 00:05:23.214 ************************************ 00:05:23.214 08:18:35 rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:05:23.214 * Looking for test storage... 00:05:23.214 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:23.215 08:18:35 rpc -- rpc/rpc.sh@65 -- # spdk_pid=1324369 00:05:23.215 08:18:35 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:05:23.215 08:18:35 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:23.215 08:18:35 rpc -- rpc/rpc.sh@67 -- # waitforlisten 1324369 00:05:23.215 08:18:35 rpc -- common/autotest_common.sh@829 -- # '[' -z 1324369 ']' 00:05:23.215 08:18:35 rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:23.215 08:18:35 rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:23.215 08:18:35 rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:23.215 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:23.215 08:18:35 rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:23.215 08:18:35 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:23.473 [2024-07-23 08:18:35.760038] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:05:23.473 [2024-07-23 08:18:35.760130] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1324369 ] 00:05:23.473 [2024-07-23 08:18:35.881330] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:23.731 [2024-07-23 08:18:36.084656] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:05:23.731 [2024-07-23 08:18:36.084702] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 1324369' to capture a snapshot of events at runtime. 00:05:23.731 [2024-07-23 08:18:36.084711] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:05:23.731 [2024-07-23 08:18:36.084721] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:05:23.731 [2024-07-23 08:18:36.084729] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid1324369 for offline analysis/debug. 
00:05:23.731 [2024-07-23 08:18:36.084764] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:05:24.664 08:18:37 rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:24.664 08:18:37 rpc -- common/autotest_common.sh@862 -- # return 0 00:05:24.664 08:18:37 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:24.664 08:18:37 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:24.664 08:18:37 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:05:24.664 08:18:37 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:05:24.664 08:18:37 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:24.664 08:18:37 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:24.664 08:18:37 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:24.664 ************************************ 00:05:24.664 START TEST rpc_integrity 00:05:24.664 ************************************ 00:05:24.664 08:18:37 rpc.rpc_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 00:05:24.664 08:18:37 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:24.664 08:18:37 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:24.664 08:18:37 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:24.664 08:18:37 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:24.664 08:18:37 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # 
bdevs='[]' 00:05:24.664 08:18:37 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:05:24.664 08:18:37 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:24.664 08:18:37 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:24.664 08:18:37 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:24.664 08:18:37 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:24.664 08:18:37 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:24.664 08:18:37 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:05:24.664 08:18:37 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:24.664 08:18:37 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:24.664 08:18:37 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:24.664 08:18:37 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:24.664 08:18:37 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:24.664 { 00:05:24.664 "name": "Malloc0", 00:05:24.664 "aliases": [ 00:05:24.664 "4d4c2fa6-dbb9-4b29-8315-23a196cdab78" 00:05:24.664 ], 00:05:24.664 "product_name": "Malloc disk", 00:05:24.664 "block_size": 512, 00:05:24.664 "num_blocks": 16384, 00:05:24.664 "uuid": "4d4c2fa6-dbb9-4b29-8315-23a196cdab78", 00:05:24.664 "assigned_rate_limits": { 00:05:24.664 "rw_ios_per_sec": 0, 00:05:24.664 "rw_mbytes_per_sec": 0, 00:05:24.664 "r_mbytes_per_sec": 0, 00:05:24.664 "w_mbytes_per_sec": 0 00:05:24.664 }, 00:05:24.664 "claimed": false, 00:05:24.664 "zoned": false, 00:05:24.664 "supported_io_types": { 00:05:24.664 "read": true, 00:05:24.664 "write": true, 00:05:24.664 "unmap": true, 00:05:24.664 "flush": true, 00:05:24.664 "reset": true, 00:05:24.664 "nvme_admin": false, 00:05:24.664 "nvme_io": false, 00:05:24.664 "nvme_io_md": false, 00:05:24.664 "write_zeroes": true, 00:05:24.664 "zcopy": true, 00:05:24.664 "get_zone_info": false, 00:05:24.664 "zone_management": 
false, 00:05:24.664 "zone_append": false, 00:05:24.664 "compare": false, 00:05:24.664 "compare_and_write": false, 00:05:24.664 "abort": true, 00:05:24.664 "seek_hole": false, 00:05:24.664 "seek_data": false, 00:05:24.664 "copy": true, 00:05:24.664 "nvme_iov_md": false 00:05:24.664 }, 00:05:24.664 "memory_domains": [ 00:05:24.664 { 00:05:24.664 "dma_device_id": "system", 00:05:24.664 "dma_device_type": 1 00:05:24.664 }, 00:05:24.664 { 00:05:24.664 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:24.664 "dma_device_type": 2 00:05:24.664 } 00:05:24.664 ], 00:05:24.664 "driver_specific": {} 00:05:24.664 } 00:05:24.664 ]' 00:05:24.664 08:18:37 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:05:24.664 08:18:37 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:24.664 08:18:37 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:05:24.664 08:18:37 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:24.664 08:18:37 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:24.664 [2024-07-23 08:18:37.172105] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:05:24.664 [2024-07-23 08:18:37.172159] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:24.664 [2024-07-23 08:18:37.172181] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034e80 00:05:24.664 [2024-07-23 08:18:37.172192] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:24.664 [2024-07-23 08:18:37.174059] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:24.664 [2024-07-23 08:18:37.174091] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:24.664 Passthru0 00:05:24.664 08:18:37 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:24.664 08:18:37 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 
00:05:24.664 08:18:37 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:24.664 08:18:37 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:24.922 08:18:37 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:24.922 08:18:37 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:24.922 { 00:05:24.922 "name": "Malloc0", 00:05:24.922 "aliases": [ 00:05:24.922 "4d4c2fa6-dbb9-4b29-8315-23a196cdab78" 00:05:24.922 ], 00:05:24.922 "product_name": "Malloc disk", 00:05:24.922 "block_size": 512, 00:05:24.922 "num_blocks": 16384, 00:05:24.922 "uuid": "4d4c2fa6-dbb9-4b29-8315-23a196cdab78", 00:05:24.922 "assigned_rate_limits": { 00:05:24.922 "rw_ios_per_sec": 0, 00:05:24.922 "rw_mbytes_per_sec": 0, 00:05:24.922 "r_mbytes_per_sec": 0, 00:05:24.922 "w_mbytes_per_sec": 0 00:05:24.922 }, 00:05:24.922 "claimed": true, 00:05:24.922 "claim_type": "exclusive_write", 00:05:24.922 "zoned": false, 00:05:24.922 "supported_io_types": { 00:05:24.922 "read": true, 00:05:24.922 "write": true, 00:05:24.922 "unmap": true, 00:05:24.922 "flush": true, 00:05:24.922 "reset": true, 00:05:24.922 "nvme_admin": false, 00:05:24.922 "nvme_io": false, 00:05:24.922 "nvme_io_md": false, 00:05:24.922 "write_zeroes": true, 00:05:24.922 "zcopy": true, 00:05:24.922 "get_zone_info": false, 00:05:24.922 "zone_management": false, 00:05:24.922 "zone_append": false, 00:05:24.922 "compare": false, 00:05:24.922 "compare_and_write": false, 00:05:24.922 "abort": true, 00:05:24.922 "seek_hole": false, 00:05:24.922 "seek_data": false, 00:05:24.922 "copy": true, 00:05:24.922 "nvme_iov_md": false 00:05:24.922 }, 00:05:24.922 "memory_domains": [ 00:05:24.922 { 00:05:24.922 "dma_device_id": "system", 00:05:24.922 "dma_device_type": 1 00:05:24.922 }, 00:05:24.922 { 00:05:24.922 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:24.922 "dma_device_type": 2 00:05:24.922 } 00:05:24.922 ], 00:05:24.922 "driver_specific": {} 00:05:24.922 }, 00:05:24.922 { 00:05:24.922 
"name": "Passthru0", 00:05:24.922 "aliases": [ 00:05:24.922 "ed7c4d81-e332-58c3-9567-d3b478a83eca" 00:05:24.922 ], 00:05:24.922 "product_name": "passthru", 00:05:24.922 "block_size": 512, 00:05:24.922 "num_blocks": 16384, 00:05:24.922 "uuid": "ed7c4d81-e332-58c3-9567-d3b478a83eca", 00:05:24.922 "assigned_rate_limits": { 00:05:24.922 "rw_ios_per_sec": 0, 00:05:24.922 "rw_mbytes_per_sec": 0, 00:05:24.922 "r_mbytes_per_sec": 0, 00:05:24.922 "w_mbytes_per_sec": 0 00:05:24.922 }, 00:05:24.922 "claimed": false, 00:05:24.922 "zoned": false, 00:05:24.922 "supported_io_types": { 00:05:24.922 "read": true, 00:05:24.922 "write": true, 00:05:24.922 "unmap": true, 00:05:24.922 "flush": true, 00:05:24.922 "reset": true, 00:05:24.922 "nvme_admin": false, 00:05:24.922 "nvme_io": false, 00:05:24.922 "nvme_io_md": false, 00:05:24.922 "write_zeroes": true, 00:05:24.922 "zcopy": true, 00:05:24.922 "get_zone_info": false, 00:05:24.922 "zone_management": false, 00:05:24.922 "zone_append": false, 00:05:24.922 "compare": false, 00:05:24.922 "compare_and_write": false, 00:05:24.922 "abort": true, 00:05:24.922 "seek_hole": false, 00:05:24.922 "seek_data": false, 00:05:24.922 "copy": true, 00:05:24.922 "nvme_iov_md": false 00:05:24.922 }, 00:05:24.922 "memory_domains": [ 00:05:24.922 { 00:05:24.922 "dma_device_id": "system", 00:05:24.922 "dma_device_type": 1 00:05:24.922 }, 00:05:24.922 { 00:05:24.922 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:24.922 "dma_device_type": 2 00:05:24.922 } 00:05:24.922 ], 00:05:24.922 "driver_specific": { 00:05:24.922 "passthru": { 00:05:24.922 "name": "Passthru0", 00:05:24.922 "base_bdev_name": "Malloc0" 00:05:24.922 } 00:05:24.922 } 00:05:24.922 } 00:05:24.922 ]' 00:05:24.922 08:18:37 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:05:24.922 08:18:37 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:24.922 08:18:37 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:24.922 08:18:37 rpc.rpc_integrity -- 
common/autotest_common.sh@559 -- # xtrace_disable 00:05:24.922 08:18:37 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:24.922 08:18:37 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:24.922 08:18:37 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:05:24.922 08:18:37 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:24.922 08:18:37 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:24.922 08:18:37 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:24.922 08:18:37 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:24.922 08:18:37 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:24.922 08:18:37 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:24.922 08:18:37 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:24.922 08:18:37 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:24.922 08:18:37 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:05:24.922 08:18:37 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:24.922 00:05:24.922 real 0m0.273s 00:05:24.922 user 0m0.169s 00:05:24.922 sys 0m0.036s 00:05:24.922 08:18:37 rpc.rpc_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:24.922 08:18:37 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:24.922 ************************************ 00:05:24.922 END TEST rpc_integrity 00:05:24.922 ************************************ 00:05:24.922 08:18:37 rpc -- common/autotest_common.sh@1142 -- # return 0 00:05:24.922 08:18:37 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:05:24.922 08:18:37 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:24.922 08:18:37 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:24.922 08:18:37 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:24.922 
************************************ 00:05:24.922 START TEST rpc_plugins 00:05:24.922 ************************************ 00:05:24.922 08:18:37 rpc.rpc_plugins -- common/autotest_common.sh@1123 -- # rpc_plugins 00:05:24.922 08:18:37 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:05:24.922 08:18:37 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:24.922 08:18:37 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:24.922 08:18:37 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:24.922 08:18:37 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:05:24.922 08:18:37 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:05:24.922 08:18:37 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:24.922 08:18:37 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:24.922 08:18:37 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:24.922 08:18:37 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:05:24.922 { 00:05:24.922 "name": "Malloc1", 00:05:24.922 "aliases": [ 00:05:24.922 "c61666aa-754e-487a-a5fb-c06e3abccbaa" 00:05:24.922 ], 00:05:24.922 "product_name": "Malloc disk", 00:05:24.922 "block_size": 4096, 00:05:24.922 "num_blocks": 256, 00:05:24.922 "uuid": "c61666aa-754e-487a-a5fb-c06e3abccbaa", 00:05:24.922 "assigned_rate_limits": { 00:05:24.923 "rw_ios_per_sec": 0, 00:05:24.923 "rw_mbytes_per_sec": 0, 00:05:24.923 "r_mbytes_per_sec": 0, 00:05:24.923 "w_mbytes_per_sec": 0 00:05:24.923 }, 00:05:24.923 "claimed": false, 00:05:24.923 "zoned": false, 00:05:24.923 "supported_io_types": { 00:05:24.923 "read": true, 00:05:24.923 "write": true, 00:05:24.923 "unmap": true, 00:05:24.923 "flush": true, 00:05:24.923 "reset": true, 00:05:24.923 "nvme_admin": false, 00:05:24.923 "nvme_io": false, 00:05:24.923 "nvme_io_md": false, 00:05:24.923 "write_zeroes": true, 00:05:24.923 "zcopy": true, 00:05:24.923 
"get_zone_info": false, 00:05:24.923 "zone_management": false, 00:05:24.923 "zone_append": false, 00:05:24.923 "compare": false, 00:05:24.923 "compare_and_write": false, 00:05:24.923 "abort": true, 00:05:24.923 "seek_hole": false, 00:05:24.923 "seek_data": false, 00:05:24.923 "copy": true, 00:05:24.923 "nvme_iov_md": false 00:05:24.923 }, 00:05:24.923 "memory_domains": [ 00:05:24.923 { 00:05:24.923 "dma_device_id": "system", 00:05:24.923 "dma_device_type": 1 00:05:24.923 }, 00:05:24.923 { 00:05:24.923 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:24.923 "dma_device_type": 2 00:05:24.923 } 00:05:24.923 ], 00:05:24.923 "driver_specific": {} 00:05:24.923 } 00:05:24.923 ]' 00:05:24.923 08:18:37 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:05:24.923 08:18:37 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:05:24.923 08:18:37 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:05:24.923 08:18:37 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:24.923 08:18:37 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:24.923 08:18:37 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:24.923 08:18:37 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:05:24.923 08:18:37 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:24.923 08:18:37 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:25.181 08:18:37 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:25.181 08:18:37 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:05:25.181 08:18:37 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:05:25.181 08:18:37 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:05:25.181 00:05:25.181 real 0m0.118s 00:05:25.181 user 0m0.078s 00:05:25.181 sys 0m0.014s 00:05:25.181 08:18:37 rpc.rpc_plugins -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:25.181 08:18:37 rpc.rpc_plugins -- 
common/autotest_common.sh@10 -- # set +x 00:05:25.181 ************************************ 00:05:25.181 END TEST rpc_plugins 00:05:25.181 ************************************ 00:05:25.181 08:18:37 rpc -- common/autotest_common.sh@1142 -- # return 0 00:05:25.181 08:18:37 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:05:25.181 08:18:37 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:25.181 08:18:37 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:25.181 08:18:37 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:25.181 ************************************ 00:05:25.181 START TEST rpc_trace_cmd_test 00:05:25.181 ************************************ 00:05:25.181 08:18:37 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1123 -- # rpc_trace_cmd_test 00:05:25.181 08:18:37 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:05:25.181 08:18:37 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:05:25.181 08:18:37 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:25.181 08:18:37 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:05:25.181 08:18:37 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:25.181 08:18:37 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:05:25.181 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid1324369", 00:05:25.181 "tpoint_group_mask": "0x8", 00:05:25.181 "iscsi_conn": { 00:05:25.181 "mask": "0x2", 00:05:25.181 "tpoint_mask": "0x0" 00:05:25.181 }, 00:05:25.181 "scsi": { 00:05:25.181 "mask": "0x4", 00:05:25.181 "tpoint_mask": "0x0" 00:05:25.181 }, 00:05:25.181 "bdev": { 00:05:25.181 "mask": "0x8", 00:05:25.181 "tpoint_mask": "0xffffffffffffffff" 00:05:25.181 }, 00:05:25.181 "nvmf_rdma": { 00:05:25.181 "mask": "0x10", 00:05:25.181 "tpoint_mask": "0x0" 00:05:25.181 }, 00:05:25.181 "nvmf_tcp": { 00:05:25.181 "mask": "0x20", 00:05:25.181 "tpoint_mask": "0x0" 00:05:25.181 }, 
00:05:25.181 "ftl": { 00:05:25.181 "mask": "0x40", 00:05:25.181 "tpoint_mask": "0x0" 00:05:25.181 }, 00:05:25.181 "blobfs": { 00:05:25.181 "mask": "0x80", 00:05:25.181 "tpoint_mask": "0x0" 00:05:25.181 }, 00:05:25.181 "dsa": { 00:05:25.181 "mask": "0x200", 00:05:25.181 "tpoint_mask": "0x0" 00:05:25.181 }, 00:05:25.181 "thread": { 00:05:25.181 "mask": "0x400", 00:05:25.181 "tpoint_mask": "0x0" 00:05:25.181 }, 00:05:25.181 "nvme_pcie": { 00:05:25.181 "mask": "0x800", 00:05:25.181 "tpoint_mask": "0x0" 00:05:25.181 }, 00:05:25.181 "iaa": { 00:05:25.181 "mask": "0x1000", 00:05:25.181 "tpoint_mask": "0x0" 00:05:25.181 }, 00:05:25.181 "nvme_tcp": { 00:05:25.181 "mask": "0x2000", 00:05:25.181 "tpoint_mask": "0x0" 00:05:25.181 }, 00:05:25.181 "bdev_nvme": { 00:05:25.181 "mask": "0x4000", 00:05:25.181 "tpoint_mask": "0x0" 00:05:25.181 }, 00:05:25.181 "sock": { 00:05:25.181 "mask": "0x8000", 00:05:25.181 "tpoint_mask": "0x0" 00:05:25.181 } 00:05:25.181 }' 00:05:25.181 08:18:37 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:05:25.181 08:18:37 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 16 -gt 2 ']' 00:05:25.181 08:18:37 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:05:25.181 08:18:37 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:05:25.181 08:18:37 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:05:25.181 08:18:37 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:05:25.181 08:18:37 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:05:25.440 08:18:37 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:05:25.440 08:18:37 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:05:25.440 08:18:37 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:05:25.440 00:05:25.440 real 0m0.196s 00:05:25.440 user 0m0.163s 00:05:25.440 sys 0m0.028s 00:05:25.440 08:18:37 rpc.rpc_trace_cmd_test -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:05:25.440 08:18:37 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:05:25.440 ************************************ 00:05:25.440 END TEST rpc_trace_cmd_test 00:05:25.440 ************************************ 00:05:25.440 08:18:37 rpc -- common/autotest_common.sh@1142 -- # return 0 00:05:25.440 08:18:37 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:05:25.440 08:18:37 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:05:25.440 08:18:37 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:05:25.440 08:18:37 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:25.440 08:18:37 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:25.440 08:18:37 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:25.440 ************************************ 00:05:25.440 START TEST rpc_daemon_integrity 00:05:25.440 ************************************ 00:05:25.440 08:18:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 00:05:25.440 08:18:37 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:25.440 08:18:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:25.440 08:18:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:25.440 08:18:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:25.440 08:18:37 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:25.440 08:18:37 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:05:25.440 08:18:37 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:25.440 08:18:37 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:25.440 08:18:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:25.440 08:18:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:25.440 08:18:37 
rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:25.440 08:18:37 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:05:25.440 08:18:37 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:25.440 08:18:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:25.440 08:18:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:25.440 08:18:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:25.440 08:18:37 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:25.440 { 00:05:25.440 "name": "Malloc2", 00:05:25.440 "aliases": [ 00:05:25.440 "20fe413e-9c35-48a0-8362-fed711fba27d" 00:05:25.440 ], 00:05:25.440 "product_name": "Malloc disk", 00:05:25.440 "block_size": 512, 00:05:25.440 "num_blocks": 16384, 00:05:25.440 "uuid": "20fe413e-9c35-48a0-8362-fed711fba27d", 00:05:25.440 "assigned_rate_limits": { 00:05:25.440 "rw_ios_per_sec": 0, 00:05:25.440 "rw_mbytes_per_sec": 0, 00:05:25.440 "r_mbytes_per_sec": 0, 00:05:25.440 "w_mbytes_per_sec": 0 00:05:25.440 }, 00:05:25.440 "claimed": false, 00:05:25.440 "zoned": false, 00:05:25.440 "supported_io_types": { 00:05:25.440 "read": true, 00:05:25.440 "write": true, 00:05:25.440 "unmap": true, 00:05:25.440 "flush": true, 00:05:25.440 "reset": true, 00:05:25.440 "nvme_admin": false, 00:05:25.440 "nvme_io": false, 00:05:25.440 "nvme_io_md": false, 00:05:25.440 "write_zeroes": true, 00:05:25.440 "zcopy": true, 00:05:25.440 "get_zone_info": false, 00:05:25.440 "zone_management": false, 00:05:25.440 "zone_append": false, 00:05:25.440 "compare": false, 00:05:25.440 "compare_and_write": false, 00:05:25.440 "abort": true, 00:05:25.440 "seek_hole": false, 00:05:25.440 "seek_data": false, 00:05:25.440 "copy": true, 00:05:25.440 "nvme_iov_md": false 00:05:25.440 }, 00:05:25.440 "memory_domains": [ 00:05:25.440 { 00:05:25.440 "dma_device_id": "system", 00:05:25.440 "dma_device_type": 
1 00:05:25.440 }, 00:05:25.440 { 00:05:25.440 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:25.440 "dma_device_type": 2 00:05:25.440 } 00:05:25.440 ], 00:05:25.440 "driver_specific": {} 00:05:25.440 } 00:05:25.440 ]' 00:05:25.440 08:18:37 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:05:25.440 08:18:37 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:25.440 08:18:37 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:05:25.440 08:18:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:25.440 08:18:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:25.440 [2024-07-23 08:18:37.927250] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:05:25.440 [2024-07-23 08:18:37.927298] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:25.440 [2024-07-23 08:18:37.927315] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036080 00:05:25.440 [2024-07-23 08:18:37.927325] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:25.440 [2024-07-23 08:18:37.929078] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:25.440 [2024-07-23 08:18:37.929105] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:25.440 Passthru0 00:05:25.440 08:18:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:25.440 08:18:37 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:25.440 08:18:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:25.440 08:18:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:25.440 08:18:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:25.440 08:18:37 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # 
bdevs='[ 00:05:25.440 { 00:05:25.440 "name": "Malloc2", 00:05:25.440 "aliases": [ 00:05:25.440 "20fe413e-9c35-48a0-8362-fed711fba27d" 00:05:25.440 ], 00:05:25.440 "product_name": "Malloc disk", 00:05:25.440 "block_size": 512, 00:05:25.440 "num_blocks": 16384, 00:05:25.440 "uuid": "20fe413e-9c35-48a0-8362-fed711fba27d", 00:05:25.440 "assigned_rate_limits": { 00:05:25.440 "rw_ios_per_sec": 0, 00:05:25.440 "rw_mbytes_per_sec": 0, 00:05:25.440 "r_mbytes_per_sec": 0, 00:05:25.440 "w_mbytes_per_sec": 0 00:05:25.440 }, 00:05:25.440 "claimed": true, 00:05:25.440 "claim_type": "exclusive_write", 00:05:25.440 "zoned": false, 00:05:25.440 "supported_io_types": { 00:05:25.440 "read": true, 00:05:25.440 "write": true, 00:05:25.440 "unmap": true, 00:05:25.440 "flush": true, 00:05:25.440 "reset": true, 00:05:25.440 "nvme_admin": false, 00:05:25.440 "nvme_io": false, 00:05:25.440 "nvme_io_md": false, 00:05:25.440 "write_zeroes": true, 00:05:25.440 "zcopy": true, 00:05:25.440 "get_zone_info": false, 00:05:25.441 "zone_management": false, 00:05:25.441 "zone_append": false, 00:05:25.441 "compare": false, 00:05:25.441 "compare_and_write": false, 00:05:25.441 "abort": true, 00:05:25.441 "seek_hole": false, 00:05:25.441 "seek_data": false, 00:05:25.441 "copy": true, 00:05:25.441 "nvme_iov_md": false 00:05:25.441 }, 00:05:25.441 "memory_domains": [ 00:05:25.441 { 00:05:25.441 "dma_device_id": "system", 00:05:25.441 "dma_device_type": 1 00:05:25.441 }, 00:05:25.441 { 00:05:25.441 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:25.441 "dma_device_type": 2 00:05:25.441 } 00:05:25.441 ], 00:05:25.441 "driver_specific": {} 00:05:25.441 }, 00:05:25.441 { 00:05:25.441 "name": "Passthru0", 00:05:25.441 "aliases": [ 00:05:25.441 "5af8612c-3f19-55fb-844d-6ba2c42a7109" 00:05:25.441 ], 00:05:25.441 "product_name": "passthru", 00:05:25.441 "block_size": 512, 00:05:25.441 "num_blocks": 16384, 00:05:25.441 "uuid": "5af8612c-3f19-55fb-844d-6ba2c42a7109", 00:05:25.441 "assigned_rate_limits": { 
00:05:25.441 "rw_ios_per_sec": 0, 00:05:25.441 "rw_mbytes_per_sec": 0, 00:05:25.441 "r_mbytes_per_sec": 0, 00:05:25.441 "w_mbytes_per_sec": 0 00:05:25.441 }, 00:05:25.441 "claimed": false, 00:05:25.441 "zoned": false, 00:05:25.441 "supported_io_types": { 00:05:25.441 "read": true, 00:05:25.441 "write": true, 00:05:25.441 "unmap": true, 00:05:25.441 "flush": true, 00:05:25.441 "reset": true, 00:05:25.441 "nvme_admin": false, 00:05:25.441 "nvme_io": false, 00:05:25.441 "nvme_io_md": false, 00:05:25.441 "write_zeroes": true, 00:05:25.441 "zcopy": true, 00:05:25.441 "get_zone_info": false, 00:05:25.441 "zone_management": false, 00:05:25.441 "zone_append": false, 00:05:25.441 "compare": false, 00:05:25.441 "compare_and_write": false, 00:05:25.441 "abort": true, 00:05:25.441 "seek_hole": false, 00:05:25.441 "seek_data": false, 00:05:25.441 "copy": true, 00:05:25.441 "nvme_iov_md": false 00:05:25.441 }, 00:05:25.441 "memory_domains": [ 00:05:25.441 { 00:05:25.441 "dma_device_id": "system", 00:05:25.441 "dma_device_type": 1 00:05:25.441 }, 00:05:25.441 { 00:05:25.441 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:25.441 "dma_device_type": 2 00:05:25.441 } 00:05:25.441 ], 00:05:25.441 "driver_specific": { 00:05:25.441 "passthru": { 00:05:25.441 "name": "Passthru0", 00:05:25.441 "base_bdev_name": "Malloc2" 00:05:25.441 } 00:05:25.441 } 00:05:25.441 } 00:05:25.441 ]' 00:05:25.441 08:18:37 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:05:25.699 08:18:37 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:25.699 08:18:37 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:25.699 08:18:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:25.699 08:18:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:25.699 08:18:38 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:25.699 08:18:38 rpc.rpc_daemon_integrity -- 
rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:05:25.699 08:18:38 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:25.699 08:18:38 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:25.699 08:18:38 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:25.699 08:18:38 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:25.699 08:18:38 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:25.699 08:18:38 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:25.699 08:18:38 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:25.699 08:18:38 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:25.699 08:18:38 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:05:25.699 08:18:38 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:25.699 00:05:25.699 real 0m0.259s 00:05:25.699 user 0m0.165s 00:05:25.699 sys 0m0.033s 00:05:25.699 08:18:38 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:25.699 08:18:38 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:25.699 ************************************ 00:05:25.699 END TEST rpc_daemon_integrity 00:05:25.699 ************************************ 00:05:25.699 08:18:38 rpc -- common/autotest_common.sh@1142 -- # return 0 00:05:25.699 08:18:38 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:05:25.699 08:18:38 rpc -- rpc/rpc.sh@84 -- # killprocess 1324369 00:05:25.699 08:18:38 rpc -- common/autotest_common.sh@948 -- # '[' -z 1324369 ']' 00:05:25.699 08:18:38 rpc -- common/autotest_common.sh@952 -- # kill -0 1324369 00:05:25.699 08:18:38 rpc -- common/autotest_common.sh@953 -- # uname 00:05:25.699 08:18:38 rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:05:25.699 08:18:38 rpc -- common/autotest_common.sh@954 -- # ps 
--no-headers -o comm= 1324369 00:05:25.699 08:18:38 rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:05:25.699 08:18:38 rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:05:25.699 08:18:38 rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1324369' 00:05:25.699 killing process with pid 1324369 00:05:25.699 08:18:38 rpc -- common/autotest_common.sh@967 -- # kill 1324369 00:05:25.699 08:18:38 rpc -- common/autotest_common.sh@972 -- # wait 1324369 00:05:28.260 00:05:28.260 real 0m5.023s 00:05:28.260 user 0m5.512s 00:05:28.260 sys 0m0.796s 00:05:28.260 08:18:40 rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:28.260 08:18:40 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:28.260 ************************************ 00:05:28.260 END TEST rpc 00:05:28.260 ************************************ 00:05:28.260 08:18:40 -- common/autotest_common.sh@1142 -- # return 0 00:05:28.260 08:18:40 -- spdk/autotest.sh@170 -- # run_test skip_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:05:28.260 08:18:40 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:28.260 08:18:40 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:28.260 08:18:40 -- common/autotest_common.sh@10 -- # set +x 00:05:28.260 ************************************ 00:05:28.260 START TEST skip_rpc 00:05:28.260 ************************************ 00:05:28.260 08:18:40 skip_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:05:28.260 * Looking for test storage... 
00:05:28.260 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:28.260 08:18:40 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:05:28.260 08:18:40 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:05:28.260 08:18:40 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:05:28.260 08:18:40 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:28.260 08:18:40 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:28.260 08:18:40 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:28.520 ************************************ 00:05:28.520 START TEST skip_rpc 00:05:28.520 ************************************ 00:05:28.520 08:18:40 skip_rpc.skip_rpc -- common/autotest_common.sh@1123 -- # test_skip_rpc 00:05:28.520 08:18:40 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=1325399 00:05:28.520 08:18:40 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:28.520 08:18:40 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:05:28.520 08:18:40 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:05:28.520 [2024-07-23 08:18:40.862925] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:05:28.520 [2024-07-23 08:18:40.863011] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1325399 ] 00:05:28.520 [2024-07-23 08:18:40.985999] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:28.778 [2024-07-23 08:18:41.202835] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:05:34.043 08:18:45 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:05:34.043 08:18:45 skip_rpc.skip_rpc -- common/autotest_common.sh@648 -- # local es=0 00:05:34.043 08:18:45 skip_rpc.skip_rpc -- common/autotest_common.sh@650 -- # valid_exec_arg rpc_cmd spdk_get_version 00:05:34.043 08:18:45 skip_rpc.skip_rpc -- common/autotest_common.sh@636 -- # local arg=rpc_cmd 00:05:34.043 08:18:45 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:34.043 08:18:45 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # type -t rpc_cmd 00:05:34.043 08:18:45 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:34.043 08:18:45 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # rpc_cmd spdk_get_version 00:05:34.043 08:18:45 skip_rpc.skip_rpc -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:34.043 08:18:45 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:34.043 08:18:45 skip_rpc.skip_rpc -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:05:34.043 08:18:45 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # es=1 00:05:34.043 08:18:45 skip_rpc.skip_rpc -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:05:34.043 08:18:45 skip_rpc.skip_rpc -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:05:34.043 08:18:45 skip_rpc.skip_rpc -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:05:34.043 08:18:45 skip_rpc.skip_rpc -- 
rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:05:34.043 08:18:45 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 1325399 00:05:34.043 08:18:45 skip_rpc.skip_rpc -- common/autotest_common.sh@948 -- # '[' -z 1325399 ']' 00:05:34.043 08:18:45 skip_rpc.skip_rpc -- common/autotest_common.sh@952 -- # kill -0 1325399 00:05:34.043 08:18:45 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # uname 00:05:34.043 08:18:45 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:05:34.043 08:18:45 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1325399 00:05:34.043 08:18:45 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:05:34.043 08:18:45 skip_rpc.skip_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:05:34.043 08:18:45 skip_rpc.skip_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1325399' 00:05:34.043 killing process with pid 1325399 00:05:34.043 08:18:45 skip_rpc.skip_rpc -- common/autotest_common.sh@967 -- # kill 1325399 00:05:34.043 08:18:45 skip_rpc.skip_rpc -- common/autotest_common.sh@972 -- # wait 1325399 00:05:35.943 00:05:35.943 real 0m7.499s 00:05:35.943 user 0m7.061s 00:05:35.943 sys 0m0.390s 00:05:35.943 08:18:48 skip_rpc.skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:35.943 08:18:48 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:35.943 ************************************ 00:05:35.943 END TEST skip_rpc 00:05:35.943 ************************************ 00:05:35.943 08:18:48 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:05:35.943 08:18:48 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:05:35.943 08:18:48 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:35.943 08:18:48 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:35.943 08:18:48 skip_rpc -- common/autotest_common.sh@10 
-- # set +x 00:05:35.943 ************************************ 00:05:35.943 START TEST skip_rpc_with_json 00:05:35.943 ************************************ 00:05:35.943 08:18:48 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_json 00:05:35.943 08:18:48 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:05:35.943 08:18:48 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=1326901 00:05:35.943 08:18:48 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:35.943 08:18:48 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:35.943 08:18:48 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 1326901 00:05:35.943 08:18:48 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@829 -- # '[' -z 1326901 ']' 00:05:35.943 08:18:48 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:35.943 08:18:48 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:35.943 08:18:48 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:35.943 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:35.943 08:18:48 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:35.943 08:18:48 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:35.943 [2024-07-23 08:18:48.429073] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:05:35.943 [2024-07-23 08:18:48.429167] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1326901 ] 00:05:36.201 [2024-07-23 08:18:48.552304] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:36.459 [2024-07-23 08:18:48.771735] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:05:37.392 08:18:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:37.392 08:18:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@862 -- # return 0 00:05:37.393 08:18:49 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:05:37.393 08:18:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:37.393 08:18:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:37.393 [2024-07-23 08:18:49.711451] nvmf_rpc.c:2569:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:05:37.393 request: 00:05:37.393 { 00:05:37.393 "trtype": "tcp", 00:05:37.393 "method": "nvmf_get_transports", 00:05:37.393 "req_id": 1 00:05:37.393 } 00:05:37.393 Got JSON-RPC error response 00:05:37.393 response: 00:05:37.393 { 00:05:37.393 "code": -19, 00:05:37.393 "message": "No such device" 00:05:37.393 } 00:05:37.393 08:18:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:05:37.393 08:18:49 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:05:37.393 08:18:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:37.393 08:18:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:37.393 [2024-07-23 08:18:49.723556] tcp.c: 677:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:05:37.393 08:18:49 
skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:37.393 08:18:49 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:05:37.393 08:18:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:37.393 08:18:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:37.393 08:18:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:37.393 08:18:49 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:05:37.393 { 00:05:37.393 "subsystems": [ 00:05:37.393 { 00:05:37.393 "subsystem": "keyring", 00:05:37.393 "config": [] 00:05:37.393 }, 00:05:37.393 { 00:05:37.393 "subsystem": "iobuf", 00:05:37.393 "config": [ 00:05:37.393 { 00:05:37.393 "method": "iobuf_set_options", 00:05:37.393 "params": { 00:05:37.393 "small_pool_count": 8192, 00:05:37.393 "large_pool_count": 1024, 00:05:37.393 "small_bufsize": 8192, 00:05:37.393 "large_bufsize": 135168 00:05:37.393 } 00:05:37.393 } 00:05:37.393 ] 00:05:37.393 }, 00:05:37.393 { 00:05:37.393 "subsystem": "sock", 00:05:37.393 "config": [ 00:05:37.393 { 00:05:37.393 "method": "sock_set_default_impl", 00:05:37.393 "params": { 00:05:37.393 "impl_name": "posix" 00:05:37.393 } 00:05:37.393 }, 00:05:37.393 { 00:05:37.393 "method": "sock_impl_set_options", 00:05:37.393 "params": { 00:05:37.393 "impl_name": "ssl", 00:05:37.393 "recv_buf_size": 4096, 00:05:37.393 "send_buf_size": 4096, 00:05:37.393 "enable_recv_pipe": true, 00:05:37.393 "enable_quickack": false, 00:05:37.393 "enable_placement_id": 0, 00:05:37.393 "enable_zerocopy_send_server": true, 00:05:37.393 "enable_zerocopy_send_client": false, 00:05:37.393 "zerocopy_threshold": 0, 00:05:37.393 "tls_version": 0, 00:05:37.393 "enable_ktls": false 00:05:37.393 } 00:05:37.393 }, 00:05:37.393 { 00:05:37.393 "method": "sock_impl_set_options", 00:05:37.393 "params": { 
00:05:37.393 "impl_name": "posix", 00:05:37.393 "recv_buf_size": 2097152, 00:05:37.393 "send_buf_size": 2097152, 00:05:37.393 "enable_recv_pipe": true, 00:05:37.393 "enable_quickack": false, 00:05:37.393 "enable_placement_id": 0, 00:05:37.393 "enable_zerocopy_send_server": true, 00:05:37.393 "enable_zerocopy_send_client": false, 00:05:37.393 "zerocopy_threshold": 0, 00:05:37.393 "tls_version": 0, 00:05:37.393 "enable_ktls": false 00:05:37.393 } 00:05:37.393 } 00:05:37.393 ] 00:05:37.393 }, 00:05:37.393 { 00:05:37.393 "subsystem": "vmd", 00:05:37.393 "config": [] 00:05:37.393 }, 00:05:37.393 { 00:05:37.393 "subsystem": "accel", 00:05:37.393 "config": [ 00:05:37.393 { 00:05:37.393 "method": "accel_set_options", 00:05:37.393 "params": { 00:05:37.393 "small_cache_size": 128, 00:05:37.393 "large_cache_size": 16, 00:05:37.393 "task_count": 2048, 00:05:37.393 "sequence_count": 2048, 00:05:37.393 "buf_count": 2048 00:05:37.393 } 00:05:37.393 } 00:05:37.393 ] 00:05:37.393 }, 00:05:37.393 { 00:05:37.393 "subsystem": "bdev", 00:05:37.393 "config": [ 00:05:37.393 { 00:05:37.393 "method": "bdev_set_options", 00:05:37.393 "params": { 00:05:37.393 "bdev_io_pool_size": 65535, 00:05:37.393 "bdev_io_cache_size": 256, 00:05:37.393 "bdev_auto_examine": true, 00:05:37.393 "iobuf_small_cache_size": 128, 00:05:37.393 "iobuf_large_cache_size": 16 00:05:37.393 } 00:05:37.393 }, 00:05:37.393 { 00:05:37.393 "method": "bdev_raid_set_options", 00:05:37.393 "params": { 00:05:37.393 "process_window_size_kb": 1024, 00:05:37.393 "process_max_bandwidth_mb_sec": 0 00:05:37.393 } 00:05:37.393 }, 00:05:37.393 { 00:05:37.393 "method": "bdev_iscsi_set_options", 00:05:37.393 "params": { 00:05:37.393 "timeout_sec": 30 00:05:37.393 } 00:05:37.393 }, 00:05:37.393 { 00:05:37.393 "method": "bdev_nvme_set_options", 00:05:37.393 "params": { 00:05:37.393 "action_on_timeout": "none", 00:05:37.393 "timeout_us": 0, 00:05:37.393 "timeout_admin_us": 0, 00:05:37.393 "keep_alive_timeout_ms": 10000, 00:05:37.393 
"arbitration_burst": 0, 00:05:37.393 "low_priority_weight": 0, 00:05:37.393 "medium_priority_weight": 0, 00:05:37.393 "high_priority_weight": 0, 00:05:37.393 "nvme_adminq_poll_period_us": 10000, 00:05:37.393 "nvme_ioq_poll_period_us": 0, 00:05:37.393 "io_queue_requests": 0, 00:05:37.393 "delay_cmd_submit": true, 00:05:37.393 "transport_retry_count": 4, 00:05:37.393 "bdev_retry_count": 3, 00:05:37.393 "transport_ack_timeout": 0, 00:05:37.393 "ctrlr_loss_timeout_sec": 0, 00:05:37.393 "reconnect_delay_sec": 0, 00:05:37.393 "fast_io_fail_timeout_sec": 0, 00:05:37.393 "disable_auto_failback": false, 00:05:37.393 "generate_uuids": false, 00:05:37.393 "transport_tos": 0, 00:05:37.393 "nvme_error_stat": false, 00:05:37.393 "rdma_srq_size": 0, 00:05:37.393 "io_path_stat": false, 00:05:37.393 "allow_accel_sequence": false, 00:05:37.393 "rdma_max_cq_size": 0, 00:05:37.393 "rdma_cm_event_timeout_ms": 0, 00:05:37.393 "dhchap_digests": [ 00:05:37.393 "sha256", 00:05:37.393 "sha384", 00:05:37.393 "sha512" 00:05:37.393 ], 00:05:37.393 "dhchap_dhgroups": [ 00:05:37.393 "null", 00:05:37.393 "ffdhe2048", 00:05:37.393 "ffdhe3072", 00:05:37.393 "ffdhe4096", 00:05:37.393 "ffdhe6144", 00:05:37.393 "ffdhe8192" 00:05:37.393 ] 00:05:37.393 } 00:05:37.393 }, 00:05:37.393 { 00:05:37.393 "method": "bdev_nvme_set_hotplug", 00:05:37.393 "params": { 00:05:37.393 "period_us": 100000, 00:05:37.393 "enable": false 00:05:37.393 } 00:05:37.394 }, 00:05:37.394 { 00:05:37.394 "method": "bdev_wait_for_examine" 00:05:37.394 } 00:05:37.394 ] 00:05:37.394 }, 00:05:37.394 { 00:05:37.394 "subsystem": "scsi", 00:05:37.394 "config": null 00:05:37.394 }, 00:05:37.394 { 00:05:37.394 "subsystem": "scheduler", 00:05:37.394 "config": [ 00:05:37.394 { 00:05:37.394 "method": "framework_set_scheduler", 00:05:37.394 "params": { 00:05:37.394 "name": "static" 00:05:37.394 } 00:05:37.394 } 00:05:37.394 ] 00:05:37.394 }, 00:05:37.394 { 00:05:37.394 "subsystem": "vhost_scsi", 00:05:37.394 "config": [] 00:05:37.394 }, 
00:05:37.394 { 00:05:37.394 "subsystem": "vhost_blk", 00:05:37.394 "config": [] 00:05:37.394 }, 00:05:37.394 { 00:05:37.394 "subsystem": "ublk", 00:05:37.394 "config": [] 00:05:37.394 }, 00:05:37.394 { 00:05:37.394 "subsystem": "nbd", 00:05:37.394 "config": [] 00:05:37.394 }, 00:05:37.394 { 00:05:37.394 "subsystem": "nvmf", 00:05:37.394 "config": [ 00:05:37.394 { 00:05:37.394 "method": "nvmf_set_config", 00:05:37.394 "params": { 00:05:37.394 "discovery_filter": "match_any", 00:05:37.394 "admin_cmd_passthru": { 00:05:37.394 "identify_ctrlr": false 00:05:37.394 } 00:05:37.394 } 00:05:37.394 }, 00:05:37.394 { 00:05:37.394 "method": "nvmf_set_max_subsystems", 00:05:37.394 "params": { 00:05:37.394 "max_subsystems": 1024 00:05:37.394 } 00:05:37.394 }, 00:05:37.394 { 00:05:37.394 "method": "nvmf_set_crdt", 00:05:37.394 "params": { 00:05:37.394 "crdt1": 0, 00:05:37.394 "crdt2": 0, 00:05:37.394 "crdt3": 0 00:05:37.394 } 00:05:37.394 }, 00:05:37.394 { 00:05:37.394 "method": "nvmf_create_transport", 00:05:37.394 "params": { 00:05:37.394 "trtype": "TCP", 00:05:37.394 "max_queue_depth": 128, 00:05:37.394 "max_io_qpairs_per_ctrlr": 127, 00:05:37.394 "in_capsule_data_size": 4096, 00:05:37.394 "max_io_size": 131072, 00:05:37.394 "io_unit_size": 131072, 00:05:37.394 "max_aq_depth": 128, 00:05:37.394 "num_shared_buffers": 511, 00:05:37.394 "buf_cache_size": 4294967295, 00:05:37.394 "dif_insert_or_strip": false, 00:05:37.394 "zcopy": false, 00:05:37.394 "c2h_success": true, 00:05:37.394 "sock_priority": 0, 00:05:37.394 "abort_timeout_sec": 1, 00:05:37.394 "ack_timeout": 0, 00:05:37.394 "data_wr_pool_size": 0 00:05:37.394 } 00:05:37.394 } 00:05:37.394 ] 00:05:37.394 }, 00:05:37.394 { 00:05:37.394 "subsystem": "iscsi", 00:05:37.394 "config": [ 00:05:37.394 { 00:05:37.394 "method": "iscsi_set_options", 00:05:37.394 "params": { 00:05:37.394 "node_base": "iqn.2016-06.io.spdk", 00:05:37.394 "max_sessions": 128, 00:05:37.394 "max_connections_per_session": 2, 00:05:37.394 "max_queue_depth": 
64, 00:05:37.394 "default_time2wait": 2, 00:05:37.394 "default_time2retain": 20, 00:05:37.394 "first_burst_length": 8192, 00:05:37.394 "immediate_data": true, 00:05:37.394 "allow_duplicated_isid": false, 00:05:37.394 "error_recovery_level": 0, 00:05:37.394 "nop_timeout": 60, 00:05:37.394 "nop_in_interval": 30, 00:05:37.394 "disable_chap": false, 00:05:37.394 "require_chap": false, 00:05:37.394 "mutual_chap": false, 00:05:37.394 "chap_group": 0, 00:05:37.394 "max_large_datain_per_connection": 64, 00:05:37.394 "max_r2t_per_connection": 4, 00:05:37.394 "pdu_pool_size": 36864, 00:05:37.394 "immediate_data_pool_size": 16384, 00:05:37.394 "data_out_pool_size": 2048 00:05:37.394 } 00:05:37.394 } 00:05:37.394 ] 00:05:37.394 } 00:05:37.394 ] 00:05:37.394 } 00:05:37.394 08:18:49 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:05:37.394 08:18:49 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 1326901 00:05:37.394 08:18:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 1326901 ']' 00:05:37.394 08:18:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 1326901 00:05:37.394 08:18:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:05:37.394 08:18:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:05:37.394 08:18:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1326901 00:05:37.652 08:18:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:05:37.652 08:18:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:05:37.652 08:18:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1326901' 00:05:37.652 killing process with pid 1326901 00:05:37.652 08:18:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 1326901 
00:05:37.652 08:18:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 1326901 00:05:40.180 08:18:52 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=1327668 00:05:40.180 08:18:52 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:05:40.180 08:18:52 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:05:45.446 08:18:57 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 1327668 00:05:45.446 08:18:57 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 1327668 ']' 00:05:45.446 08:18:57 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 1327668 00:05:45.446 08:18:57 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:05:45.446 08:18:57 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:05:45.446 08:18:57 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1327668 00:05:45.446 08:18:57 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:05:45.446 08:18:57 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:05:45.446 08:18:57 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1327668' 00:05:45.446 killing process with pid 1327668 00:05:45.446 08:18:57 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 1327668 00:05:45.446 08:18:57 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 1327668 00:05:47.362 08:18:59 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:05:47.362 08:18:59 skip_rpc.skip_rpc_with_json -- 
rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:05:47.362 00:05:47.362 real 0m11.471s 00:05:47.362 user 0m10.955s 00:05:47.362 sys 0m0.857s 00:05:47.362 08:18:59 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:47.362 08:18:59 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:47.362 ************************************ 00:05:47.362 END TEST skip_rpc_with_json 00:05:47.362 ************************************ 00:05:47.363 08:18:59 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:05:47.363 08:18:59 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:05:47.363 08:18:59 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:47.363 08:18:59 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:47.363 08:18:59 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:47.363 ************************************ 00:05:47.363 START TEST skip_rpc_with_delay 00:05:47.363 ************************************ 00:05:47.363 08:18:59 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_delay 00:05:47.363 08:18:59 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:05:47.363 08:18:59 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@648 -- # local es=0 00:05:47.363 08:18:59 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:05:47.363 08:18:59 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:47.363 08:18:59 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" 
in 00:05:47.363 08:18:59 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:47.363 08:18:59 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:47.363 08:18:59 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:47.363 08:18:59 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:47.363 08:18:59 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:47.363 08:18:59 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:05:47.363 08:18:59 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:05:47.620 [2024-07-23 08:18:59.962660] app.c: 832:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 
00:05:47.620 [2024-07-23 08:18:59.962742] app.c: 711:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:05:47.620 08:19:00 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # es=1 00:05:47.620 08:19:00 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:05:47.620 08:19:00 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:05:47.620 08:19:00 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:05:47.620 00:05:47.620 real 0m0.148s 00:05:47.620 user 0m0.082s 00:05:47.620 sys 0m0.065s 00:05:47.620 08:19:00 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:47.620 08:19:00 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:05:47.620 ************************************ 00:05:47.620 END TEST skip_rpc_with_delay 00:05:47.620 ************************************ 00:05:47.620 08:19:00 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:05:47.621 08:19:00 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:05:47.621 08:19:00 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:05:47.621 08:19:00 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:05:47.621 08:19:00 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:47.621 08:19:00 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:47.621 08:19:00 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:47.621 ************************************ 00:05:47.621 START TEST exit_on_failed_rpc_init 00:05:47.621 ************************************ 00:05:47.621 08:19:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1123 -- # test_exit_on_failed_rpc_init 00:05:47.621 08:19:00 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=1329216 00:05:47.621 08:19:00 skip_rpc.exit_on_failed_rpc_init -- 
rpc/skip_rpc.sh@63 -- # waitforlisten 1329216 00:05:47.621 08:19:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@829 -- # '[' -z 1329216 ']' 00:05:47.621 08:19:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:47.621 08:19:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:47.621 08:19:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:47.621 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:47.621 08:19:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:47.621 08:19:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:05:47.621 08:19:00 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:47.878 [2024-07-23 08:19:00.164857] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:05:47.878 [2024-07-23 08:19:00.164932] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1329216 ] 00:05:47.878 [2024-07-23 08:19:00.285390] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:48.138 [2024-07-23 08:19:00.490550] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:05:49.076 08:19:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:49.076 08:19:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@862 -- # return 0 00:05:49.076 08:19:01 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:49.076 08:19:01 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:05:49.076 08:19:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@648 -- # local es=0 00:05:49.076 08:19:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:05:49.076 08:19:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:49.076 08:19:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:49.076 08:19:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:49.076 08:19:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:49.076 08:19:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # type -P 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:49.076 08:19:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:49.076 08:19:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:49.076 08:19:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:05:49.076 08:19:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:05:49.076 [2024-07-23 08:19:01.526513] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:05:49.076 [2024-07-23 08:19:01.526606] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1329467 ] 00:05:49.335 [2024-07-23 08:19:01.649262] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:49.593 [2024-07-23 08:19:01.867386] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:05:49.593 [2024-07-23 08:19:01.867468] rpc.c: 180:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
00:05:49.593 [2024-07-23 08:19:01.867483] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:05:49.593 [2024-07-23 08:19:01.867494] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:05:49.852 08:19:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # es=234 00:05:49.852 08:19:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:05:49.852 08:19:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@660 -- # es=106 00:05:49.852 08:19:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # case "$es" in 00:05:49.852 08:19:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@668 -- # es=1 00:05:49.852 08:19:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:05:49.852 08:19:02 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:05:49.852 08:19:02 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 1329216 00:05:49.852 08:19:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@948 -- # '[' -z 1329216 ']' 00:05:49.852 08:19:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@952 -- # kill -0 1329216 00:05:49.852 08:19:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # uname 00:05:49.852 08:19:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:05:49.852 08:19:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1329216 00:05:49.852 08:19:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:05:49.852 08:19:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:05:49.852 08:19:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1329216' 
00:05:49.852 killing process with pid 1329216 00:05:49.852 08:19:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@967 -- # kill 1329216 00:05:49.852 08:19:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@972 -- # wait 1329216 00:05:52.385 00:05:52.385 real 0m4.726s 00:05:52.385 user 0m5.293s 00:05:52.385 sys 0m0.627s 00:05:52.385 08:19:04 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:52.385 08:19:04 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:05:52.385 ************************************ 00:05:52.385 END TEST exit_on_failed_rpc_init 00:05:52.385 ************************************ 00:05:52.385 08:19:04 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:05:52.385 08:19:04 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:05:52.385 00:05:52.385 real 0m24.167s 00:05:52.385 user 0m23.496s 00:05:52.385 sys 0m2.180s 00:05:52.385 08:19:04 skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:52.385 08:19:04 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:52.385 ************************************ 00:05:52.385 END TEST skip_rpc 00:05:52.385 ************************************ 00:05:52.385 08:19:04 -- common/autotest_common.sh@1142 -- # return 0 00:05:52.385 08:19:04 -- spdk/autotest.sh@171 -- # run_test rpc_client /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:52.385 08:19:04 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:52.385 08:19:04 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:52.385 08:19:04 -- common/autotest_common.sh@10 -- # set +x 00:05:52.385 ************************************ 00:05:52.385 START TEST rpc_client 00:05:52.385 ************************************ 00:05:52.385 08:19:04 rpc_client -- common/autotest_common.sh@1123 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:52.644 * Looking for test storage... 00:05:52.644 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client 00:05:52.644 08:19:04 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:05:52.644 OK 00:05:52.644 08:19:05 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:05:52.644 00:05:52.644 real 0m0.127s 00:05:52.644 user 0m0.046s 00:05:52.644 sys 0m0.089s 00:05:52.644 08:19:05 rpc_client -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:52.644 08:19:05 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:05:52.644 ************************************ 00:05:52.644 END TEST rpc_client 00:05:52.644 ************************************ 00:05:52.644 08:19:05 -- common/autotest_common.sh@1142 -- # return 0 00:05:52.644 08:19:05 -- spdk/autotest.sh@172 -- # run_test json_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:05:52.644 08:19:05 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:52.644 08:19:05 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:52.644 08:19:05 -- common/autotest_common.sh@10 -- # set +x 00:05:52.644 ************************************ 00:05:52.644 START TEST json_config 00:05:52.644 ************************************ 00:05:52.645 08:19:05 json_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:05:52.645 08:19:05 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:05:52.645 08:19:05 json_config -- nvmf/common.sh@7 -- # uname -s 00:05:52.645 08:19:05 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:52.645 08:19:05 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:52.645 08:19:05 
json_config -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:52.645 08:19:05 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:52.645 08:19:05 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:52.645 08:19:05 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:52.645 08:19:05 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:52.645 08:19:05 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:52.645 08:19:05 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:52.645 08:19:05 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:52.645 08:19:05 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:800e967b-538f-e911-906e-001635649f5c 00:05:52.645 08:19:05 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=800e967b-538f-e911-906e-001635649f5c 00:05:52.645 08:19:05 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:52.645 08:19:05 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:52.645 08:19:05 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:52.645 08:19:05 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:05:52.645 08:19:05 json_config -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:05:52.645 08:19:05 json_config -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:52.645 08:19:05 json_config -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:52.645 08:19:05 json_config -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:52.645 08:19:05 json_config -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:52.645 08:19:05 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:52.645 08:19:05 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:52.645 08:19:05 json_config -- paths/export.sh@5 -- # export PATH 00:05:52.645 08:19:05 json_config -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:52.645 08:19:05 json_config -- nvmf/common.sh@47 -- # : 0 00:05:52.645 08:19:05 json_config -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:05:52.645 
08:19:05 json_config -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:05:52.645 08:19:05 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:05:52.645 08:19:05 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:52.645 08:19:05 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:52.645 08:19:05 json_config -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:05:52.645 08:19:05 json_config -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:05:52.645 08:19:05 json_config -- nvmf/common.sh@51 -- # have_pci_nics=0 00:05:52.645 08:19:05 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:05:52.645 08:19:05 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:05:52.904 08:19:05 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:05:52.904 08:19:05 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:05:52.904 08:19:05 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:05:52.904 08:19:05 json_config -- json_config/json_config.sh@31 -- # app_pid=(['target']='' ['initiator']='') 00:05:52.904 08:19:05 json_config -- json_config/json_config.sh@31 -- # declare -A app_pid 00:05:52.904 08:19:05 json_config -- json_config/json_config.sh@32 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock' ['initiator']='/var/tmp/spdk_initiator.sock') 00:05:52.904 08:19:05 json_config -- json_config/json_config.sh@32 -- # declare -A app_socket 00:05:52.904 08:19:05 json_config -- json_config/json_config.sh@33 -- # app_params=(['target']='-m 0x1 -s 1024' ['initiator']='-m 0x2 -g -u -s 1024') 00:05:52.904 08:19:05 json_config -- json_config/json_config.sh@33 -- # declare -A app_params 00:05:52.904 08:19:05 json_config -- json_config/json_config.sh@34 -- # 
configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json' ['initiator']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json') 00:05:52.904 08:19:05 json_config -- json_config/json_config.sh@34 -- # declare -A configs_path 00:05:52.904 08:19:05 json_config -- json_config/json_config.sh@40 -- # last_event_id=0 00:05:52.904 08:19:05 json_config -- json_config/json_config.sh@359 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:05:52.904 08:19:05 json_config -- json_config/json_config.sh@360 -- # echo 'INFO: JSON configuration test init' 00:05:52.904 INFO: JSON configuration test init 00:05:52.904 08:19:05 json_config -- json_config/json_config.sh@361 -- # json_config_test_init 00:05:52.904 08:19:05 json_config -- json_config/json_config.sh@266 -- # timing_enter json_config_test_init 00:05:52.904 08:19:05 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:52.904 08:19:05 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:52.904 08:19:05 json_config -- json_config/json_config.sh@267 -- # timing_enter json_config_setup_target 00:05:52.904 08:19:05 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:52.904 08:19:05 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:52.904 08:19:05 json_config -- json_config/json_config.sh@269 -- # json_config_test_start_app target --wait-for-rpc 00:05:52.904 08:19:05 json_config -- json_config/common.sh@9 -- # local app=target 00:05:52.904 08:19:05 json_config -- json_config/common.sh@10 -- # shift 00:05:52.904 08:19:05 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:05:52.904 08:19:05 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:05:52.904 08:19:05 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:05:52.904 08:19:05 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:05:52.904 08:19:05 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 
00:05:52.904 08:19:05 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=1330302 00:05:52.904 08:19:05 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:05:52.904 Waiting for target to run... 00:05:52.904 08:19:05 json_config -- json_config/common.sh@25 -- # waitforlisten 1330302 /var/tmp/spdk_tgt.sock 00:05:52.904 08:19:05 json_config -- common/autotest_common.sh@829 -- # '[' -z 1330302 ']' 00:05:52.904 08:19:05 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc 00:05:52.904 08:19:05 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:05:52.904 08:19:05 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:52.904 08:19:05 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:05:52.904 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:05:52.904 08:19:05 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:52.904 08:19:05 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:52.904 [2024-07-23 08:19:05.269497] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:05:52.904 [2024-07-23 08:19:05.269592] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1330302 ] 00:05:53.163 [2024-07-23 08:19:05.662895] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:53.421 [2024-07-23 08:19:05.853575] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:05:53.680 08:19:06 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:53.680 08:19:06 json_config -- common/autotest_common.sh@862 -- # return 0 00:05:53.680 08:19:06 json_config -- json_config/common.sh@26 -- # echo '' 00:05:53.680 00:05:53.680 08:19:06 json_config -- json_config/json_config.sh@273 -- # create_accel_config 00:05:53.680 08:19:06 json_config -- json_config/json_config.sh@97 -- # timing_enter create_accel_config 00:05:53.680 08:19:06 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:53.680 08:19:06 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:53.680 08:19:06 json_config -- json_config/json_config.sh@99 -- # [[ 1 -eq 1 ]] 00:05:53.680 08:19:06 json_config -- json_config/json_config.sh@100 -- # tgt_rpc dpdk_cryptodev_scan_accel_module 00:05:53.680 08:19:06 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock dpdk_cryptodev_scan_accel_module 00:05:53.680 08:19:06 json_config -- json_config/json_config.sh@101 -- # tgt_rpc accel_assign_opc -o encrypt -m dpdk_cryptodev 00:05:53.680 08:19:06 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o encrypt -m dpdk_cryptodev 00:05:53.939 [2024-07-23 08:19:06.323294] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module 
dpdk_cryptodev 00:05:53.939 08:19:06 json_config -- json_config/json_config.sh@102 -- # tgt_rpc accel_assign_opc -o decrypt -m dpdk_cryptodev 00:05:53.939 08:19:06 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o decrypt -m dpdk_cryptodev 00:05:54.197 [2024-07-23 08:19:06.487746] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:05:54.197 08:19:06 json_config -- json_config/json_config.sh@105 -- # timing_exit create_accel_config 00:05:54.197 08:19:06 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:54.197 08:19:06 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:54.197 08:19:06 json_config -- json_config/json_config.sh@277 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --json-with-subsystems 00:05:54.197 08:19:06 json_config -- json_config/json_config.sh@278 -- # tgt_rpc load_config 00:05:54.197 08:19:06 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock load_config 00:05:54.455 [2024-07-23 08:19:06.902929] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:06:01.024 08:19:12 json_config -- json_config/json_config.sh@280 -- # tgt_check_notification_types 00:06:01.024 08:19:12 json_config -- json_config/json_config.sh@43 -- # timing_enter tgt_check_notification_types 00:06:01.024 08:19:12 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:01.024 08:19:12 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:01.024 08:19:12 json_config -- json_config/json_config.sh@45 -- # local ret=0 00:06:01.024 08:19:12 json_config -- json_config/json_config.sh@46 -- # enabled_types=('bdev_register' 'bdev_unregister') 00:06:01.024 08:19:12 json_config -- json_config/json_config.sh@46 -- # local 
enabled_types 00:06:01.024 08:19:12 json_config -- json_config/json_config.sh@48 -- # tgt_rpc notify_get_types 00:06:01.024 08:19:12 json_config -- json_config/json_config.sh@48 -- # jq -r '.[]' 00:06:01.024 08:19:12 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_types 00:06:01.024 08:19:13 json_config -- json_config/json_config.sh@48 -- # get_types=('bdev_register' 'bdev_unregister') 00:06:01.024 08:19:13 json_config -- json_config/json_config.sh@48 -- # local get_types 00:06:01.024 08:19:13 json_config -- json_config/json_config.sh@50 -- # local type_diff 00:06:01.024 08:19:13 json_config -- json_config/json_config.sh@51 -- # tr ' ' '\n' 00:06:01.024 08:19:13 json_config -- json_config/json_config.sh@51 -- # echo bdev_register bdev_unregister bdev_register bdev_unregister 00:06:01.024 08:19:13 json_config -- json_config/json_config.sh@51 -- # sort 00:06:01.024 08:19:13 json_config -- json_config/json_config.sh@51 -- # uniq -u 00:06:01.024 08:19:13 json_config -- json_config/json_config.sh@51 -- # type_diff= 00:06:01.024 08:19:13 json_config -- json_config/json_config.sh@53 -- # [[ -n '' ]] 00:06:01.024 08:19:13 json_config -- json_config/json_config.sh@58 -- # timing_exit tgt_check_notification_types 00:06:01.024 08:19:13 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:01.024 08:19:13 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:01.024 08:19:13 json_config -- json_config/json_config.sh@59 -- # return 0 00:06:01.024 08:19:13 json_config -- json_config/json_config.sh@282 -- # [[ 1 -eq 1 ]] 00:06:01.024 08:19:13 json_config -- json_config/json_config.sh@283 -- # create_bdev_subsystem_config 00:06:01.024 08:19:13 json_config -- json_config/json_config.sh@109 -- # timing_enter create_bdev_subsystem_config 00:06:01.024 08:19:13 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:01.024 08:19:13 json_config -- 
common/autotest_common.sh@10 -- # set +x 00:06:01.024 08:19:13 json_config -- json_config/json_config.sh@111 -- # expected_notifications=() 00:06:01.024 08:19:13 json_config -- json_config/json_config.sh@111 -- # local expected_notifications 00:06:01.024 08:19:13 json_config -- json_config/json_config.sh@115 -- # expected_notifications+=($(get_notifications)) 00:06:01.024 08:19:13 json_config -- json_config/json_config.sh@115 -- # get_notifications 00:06:01.024 08:19:13 json_config -- json_config/json_config.sh@63 -- # local ev_type ev_ctx event_id 00:06:01.024 08:19:13 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:06:01.024 08:19:13 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:06:01.024 08:19:13 json_config -- json_config/json_config.sh@62 -- # tgt_rpc notify_get_notifications -i 0 00:06:01.024 08:19:13 json_config -- json_config/json_config.sh@62 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:06:01.024 08:19:13 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:06:01.024 08:19:13 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Nvme0n1 00:06:01.024 08:19:13 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:06:01.024 08:19:13 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:06:01.024 08:19:13 json_config -- json_config/json_config.sh@117 -- # [[ 1 -eq 1 ]] 00:06:01.024 08:19:13 json_config -- json_config/json_config.sh@118 -- # local lvol_store_base_bdev=Nvme0n1 00:06:01.024 08:19:13 json_config -- json_config/json_config.sh@120 -- # tgt_rpc bdev_split_create Nvme0n1 2 00:06:01.024 08:19:13 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Nvme0n1 2 00:06:01.024 Nvme0n1p0 Nvme0n1p1 00:06:01.024 08:19:13 json_config -- 
json_config/json_config.sh@121 -- # tgt_rpc bdev_split_create Malloc0 3 00:06:01.024 08:19:13 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Malloc0 3 00:06:01.282 [2024-07-23 08:19:13.620374] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:01.282 [2024-07-23 08:19:13.620428] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:01.282 00:06:01.282 08:19:13 json_config -- json_config/json_config.sh@122 -- # tgt_rpc bdev_malloc_create 8 4096 --name Malloc3 00:06:01.282 08:19:13 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 4096 --name Malloc3 00:06:01.282 Malloc3 00:06:01.541 08:19:13 json_config -- json_config/json_config.sh@123 -- # tgt_rpc bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:06:01.541 08:19:13 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:06:01.541 [2024-07-23 08:19:13.956122] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:01.541 [2024-07-23 08:19:13.956175] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:01.541 [2024-07-23 08:19:13.956197] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036380 00:06:01.541 [2024-07-23 08:19:13.956206] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:01.541 [2024-07-23 08:19:13.958259] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:01.541 [2024-07-23 08:19:13.958290] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:06:01.541 PTBdevFromMalloc3 00:06:01.541 08:19:13 
json_config -- json_config/json_config.sh@125 -- # tgt_rpc bdev_null_create Null0 32 512 00:06:01.541 08:19:13 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_null_create Null0 32 512 00:06:01.800 Null0 00:06:01.800 08:19:14 json_config -- json_config/json_config.sh@127 -- # tgt_rpc bdev_malloc_create 32 512 --name Malloc0 00:06:01.800 08:19:14 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 32 512 --name Malloc0 00:06:01.800 Malloc0 00:06:02.059 08:19:14 json_config -- json_config/json_config.sh@128 -- # tgt_rpc bdev_malloc_create 16 4096 --name Malloc1 00:06:02.059 08:19:14 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 16 4096 --name Malloc1 00:06:02.059 Malloc1 00:06:02.059 08:19:14 json_config -- json_config/json_config.sh@141 -- # expected_notifications+=(bdev_register:${lvol_store_base_bdev}p1 bdev_register:${lvol_store_base_bdev}p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1) 00:06:02.059 08:19:14 json_config -- json_config/json_config.sh@144 -- # dd if=/dev/zero of=/sample_aio bs=1024 count=102400 00:06:02.317 102400+0 records in 00:06:02.317 102400+0 records out 00:06:02.317 104857600 bytes (105 MB, 100 MiB) copied, 0.121775 s, 861 MB/s 00:06:02.317 08:19:14 json_config -- json_config/json_config.sh@145 -- # tgt_rpc bdev_aio_create /sample_aio aio_disk 1024 00:06:02.317 08:19:14 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_aio_create /sample_aio aio_disk 1024 00:06:02.317 aio_disk 00:06:02.317 08:19:14 json_config -- 
json_config/json_config.sh@146 -- # expected_notifications+=(bdev_register:aio_disk) 00:06:02.317 08:19:14 json_config -- json_config/json_config.sh@151 -- # tgt_rpc bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:06:02.317 08:19:14 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:06:10.471 60519796-cc09-49ab-b49a-6f74ae1faeab 00:06:10.471 08:19:22 json_config -- json_config/json_config.sh@158 -- # expected_notifications+=("bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test lvol0 32)" "bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32)" "bdev_register:$(tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0)" "bdev_register:$(tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0)") 00:06:10.471 08:19:22 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_lvol_create -l lvs_test lvol0 32 00:06:10.472 08:19:22 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test lvol0 32 00:06:10.472 08:19:22 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32 00:06:10.472 08:19:22 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test -t lvol1 32 00:06:10.472 08:19:22 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:06:10.472 08:19:22 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:06:10.472 08:19:22 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0 00:06:10.472 08:19:22 json_config -- json_config/common.sh@57 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_clone lvs_test/snapshot0 clone0 00:06:10.731 08:19:22 json_config -- json_config/json_config.sh@161 -- # [[ 1 -eq 1 ]] 00:06:10.731 08:19:22 json_config -- json_config/json_config.sh@162 -- # tgt_rpc bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:06:10.731 08:19:22 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:06:10.731 MallocForCryptoBdev 00:06:10.731 08:19:23 json_config -- json_config/json_config.sh@163 -- # lspci -d:37c8 00:06:10.731 08:19:23 json_config -- json_config/json_config.sh@163 -- # wc -l 00:06:10.731 08:19:23 json_config -- json_config/json_config.sh@163 -- # [[ 3 -eq 0 ]] 00:06:10.731 08:19:23 json_config -- json_config/json_config.sh@166 -- # local crypto_driver=crypto_qat 00:06:10.731 08:19:23 json_config -- json_config/json_config.sh@169 -- # tgt_rpc bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:06:10.731 08:19:23 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:06:10.989 [2024-07-23 08:19:23.349247] vbdev_crypto_rpc.c: 136:rpc_bdev_crypto_create: *WARNING*: "crypto_pmd" parameters is obsolete and ignored 00:06:10.989 CryptoMallocBdev 00:06:10.989 08:19:23 json_config -- json_config/json_config.sh@173 -- # expected_notifications+=(bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev) 00:06:10.989 08:19:23 json_config -- json_config/json_config.sh@176 -- # [[ 0 -eq 1 ]] 00:06:10.989 08:19:23 json_config -- json_config/json_config.sh@182 -- # tgt_check_notifications bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 
bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:62506f13-98a4-4e26-abd0-8113749ad177 bdev_register:401f4047-b3a8-40b2-9ded-8df430e017dd bdev_register:c8114bfc-4031-4fee-a775-7173bfdf9595 bdev_register:65ca221b-1187-4393-8706-d6b4d24226f5 bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:06:10.989 08:19:23 json_config -- json_config/json_config.sh@71 -- # local events_to_check 00:06:10.989 08:19:23 json_config -- json_config/json_config.sh@72 -- # local recorded_events 00:06:10.989 08:19:23 json_config -- json_config/json_config.sh@75 -- # events_to_check=($(printf '%s\n' "$@" | sort)) 00:06:10.989 08:19:23 json_config -- json_config/json_config.sh@75 -- # printf '%s\n' bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:62506f13-98a4-4e26-abd0-8113749ad177 bdev_register:401f4047-b3a8-40b2-9ded-8df430e017dd bdev_register:c8114bfc-4031-4fee-a775-7173bfdf9595 bdev_register:65ca221b-1187-4393-8706-d6b4d24226f5 bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:06:10.989 08:19:23 json_config -- json_config/json_config.sh@75 -- # sort 00:06:10.989 08:19:23 json_config -- json_config/json_config.sh@76 -- # recorded_events=($(get_notifications | sort)) 00:06:10.989 08:19:23 json_config -- json_config/json_config.sh@76 -- # get_notifications 00:06:10.989 08:19:23 json_config -- json_config/json_config.sh@76 -- # sort 00:06:10.989 08:19:23 json_config -- json_config/json_config.sh@63 -- # local ev_type ev_ctx event_id 00:06:10.989 08:19:23 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:06:10.989 08:19:23 
json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:06:10.989 08:19:23 json_config -- json_config/json_config.sh@62 -- # tgt_rpc notify_get_notifications -i 0 00:06:10.989 08:19:23 json_config -- json_config/json_config.sh@62 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:06:10.989 08:19:23 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Nvme0n1 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Nvme0n1p1 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Nvme0n1p0 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc3 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:PTBdevFromMalloc3 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@66 -- # echo 
bdev_register:Null0 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc0 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc0p2 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc0p1 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc0p0 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc1 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:aio_disk 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@66 -- # echo 
bdev_register:401f4047-b3a8-40b2-9ded-8df430e017dd 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:62506f13-98a4-4e26-abd0-8113749ad177 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:c8114bfc-4031-4fee-a775-7173bfdf9595 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:65ca221b-1187-4393-8706-d6b4d24226f5 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:MallocForCryptoBdev 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:CryptoMallocBdev 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:06:11.248 08:19:23 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:06:11.249 08:19:23 json_config -- json_config/json_config.sh@78 -- # [[ bdev_register:401f4047-b3a8-40b2-9ded-8df430e017dd bdev_register:62506f13-98a4-4e26-abd0-8113749ad177 bdev_register:65ca221b-1187-4393-8706-d6b4d24226f5 bdev_register:aio_disk 
bdev_register:c8114bfc-4031-4fee-a775-7173bfdf9595 bdev_register:CryptoMallocBdev bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 != \b\d\e\v\_\r\e\g\i\s\t\e\r\:\4\0\1\f\4\0\4\7\-\b\3\a\8\-\4\0\b\2\-\9\d\e\d\-\8\d\f\4\3\0\e\0\1\7\d\d\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\6\2\5\0\6\f\1\3\-\9\8\a\4\-\4\e\2\6\-\a\b\d\0\-\8\1\1\3\7\4\9\a\d\1\7\7\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\6\5\c\a\2\2\1\b\-\1\1\8\7\-\4\3\9\3\-\8\7\0\6\-\d\6\b\4\d\2\4\2\2\6\f\5\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\i\o\_\d\i\s\k\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\c\8\1\1\4\b\f\c\-\4\0\3\1\-\4\f\e\e\-\a\7\7\5\-\7\1\7\3\b\f\d\f\9\5\9\5\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\C\r\y\p\t\o\M\a\l\l\o\c\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\F\o\r\C\r\y\p\t\o\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\u\l\l\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\P\T\B\d\e\v\F\r\o\m\M\a\l\l\o\c\3 ]] 00:06:11.249 08:19:23 json_config -- json_config/json_config.sh@90 -- # cat 00:06:11.249 08:19:23 json_config -- json_config/json_config.sh@90 -- # printf ' %s\n' bdev_register:401f4047-b3a8-40b2-9ded-8df430e017dd bdev_register:62506f13-98a4-4e26-abd0-8113749ad177 bdev_register:65ca221b-1187-4393-8706-d6b4d24226f5 bdev_register:aio_disk bdev_register:c8114bfc-4031-4fee-a775-7173bfdf9595 bdev_register:CryptoMallocBdev bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 
bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 00:06:11.249 Expected events matched: 00:06:11.249 bdev_register:401f4047-b3a8-40b2-9ded-8df430e017dd 00:06:11.249 bdev_register:62506f13-98a4-4e26-abd0-8113749ad177 00:06:11.249 bdev_register:65ca221b-1187-4393-8706-d6b4d24226f5 00:06:11.249 bdev_register:aio_disk 00:06:11.249 bdev_register:c8114bfc-4031-4fee-a775-7173bfdf9595 00:06:11.249 bdev_register:CryptoMallocBdev 00:06:11.249 bdev_register:Malloc0 00:06:11.249 bdev_register:Malloc0p0 00:06:11.249 bdev_register:Malloc0p1 00:06:11.249 bdev_register:Malloc0p2 00:06:11.249 bdev_register:Malloc1 00:06:11.249 bdev_register:Malloc3 00:06:11.249 bdev_register:MallocForCryptoBdev 00:06:11.249 bdev_register:Null0 00:06:11.249 bdev_register:Nvme0n1 00:06:11.249 bdev_register:Nvme0n1p0 00:06:11.249 bdev_register:Nvme0n1p1 00:06:11.249 bdev_register:PTBdevFromMalloc3 00:06:11.249 08:19:23 json_config -- json_config/json_config.sh@184 -- # timing_exit create_bdev_subsystem_config 00:06:11.249 08:19:23 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:11.249 08:19:23 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:11.249 08:19:23 json_config -- json_config/json_config.sh@286 -- # [[ 0 -eq 1 ]] 00:06:11.249 08:19:23 json_config -- json_config/json_config.sh@290 -- # [[ 0 -eq 1 ]] 00:06:11.249 08:19:23 json_config -- json_config/json_config.sh@294 -- # [[ 0 -eq 1 ]] 00:06:11.249 08:19:23 json_config -- json_config/json_config.sh@297 -- # timing_exit json_config_setup_target 00:06:11.249 08:19:23 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:11.249 08:19:23 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:11.249 08:19:23 json_config -- json_config/json_config.sh@299 -- # [[ 0 -eq 1 ]] 00:06:11.249 08:19:23 json_config -- json_config/json_config.sh@304 -- # 
tgt_rpc bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:06:11.249 08:19:23 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:06:11.508 MallocBdevForConfigChangeCheck 00:06:11.508 08:19:23 json_config -- json_config/json_config.sh@306 -- # timing_exit json_config_test_init 00:06:11.508 08:19:23 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:11.508 08:19:23 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:11.508 08:19:23 json_config -- json_config/json_config.sh@363 -- # tgt_rpc save_config 00:06:11.508 08:19:23 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:11.766 08:19:24 json_config -- json_config/json_config.sh@365 -- # echo 'INFO: shutting down applications...' 00:06:11.766 INFO: shutting down applications... 
00:06:11.766 08:19:24 json_config -- json_config/json_config.sh@366 -- # [[ 0 -eq 1 ]] 00:06:11.766 08:19:24 json_config -- json_config/json_config.sh@372 -- # json_config_clear target 00:06:11.766 08:19:24 json_config -- json_config/json_config.sh@336 -- # [[ -n 22 ]] 00:06:11.766 08:19:24 json_config -- json_config/json_config.sh@337 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py -s /var/tmp/spdk_tgt.sock clear_config 00:06:12.025 [2024-07-23 08:19:24.313361] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev Nvme0n1p0 being removed: closing lvstore lvs_test 00:06:16.215 Calling clear_iscsi_subsystem 00:06:16.215 Calling clear_nvmf_subsystem 00:06:16.215 Calling clear_nbd_subsystem 00:06:16.215 Calling clear_ublk_subsystem 00:06:16.215 Calling clear_vhost_blk_subsystem 00:06:16.215 Calling clear_vhost_scsi_subsystem 00:06:16.215 Calling clear_bdev_subsystem 00:06:16.215 08:19:28 json_config -- json_config/json_config.sh@341 -- # local config_filter=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py 00:06:16.215 08:19:28 json_config -- json_config/json_config.sh@347 -- # count=100 00:06:16.215 08:19:28 json_config -- json_config/json_config.sh@348 -- # '[' 100 -gt 0 ']' 00:06:16.215 08:19:28 json_config -- json_config/json_config.sh@349 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method delete_global_parameters 00:06:16.215 08:19:28 json_config -- json_config/json_config.sh@349 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:16.215 08:19:28 json_config -- json_config/json_config.sh@349 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method check_empty 00:06:16.474 08:19:28 json_config -- json_config/json_config.sh@349 -- # break 00:06:16.474 08:19:28 json_config -- json_config/json_config.sh@354 -- # '[' 100 -eq 0 ']' 00:06:16.474 08:19:28 
json_config -- json_config/json_config.sh@373 -- # json_config_test_shutdown_app target 00:06:16.474 08:19:28 json_config -- json_config/common.sh@31 -- # local app=target 00:06:16.474 08:19:28 json_config -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:06:16.474 08:19:28 json_config -- json_config/common.sh@35 -- # [[ -n 1330302 ]] 00:06:16.474 08:19:28 json_config -- json_config/common.sh@38 -- # kill -SIGINT 1330302 00:06:16.474 08:19:28 json_config -- json_config/common.sh@40 -- # (( i = 0 )) 00:06:16.474 08:19:28 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:16.474 08:19:28 json_config -- json_config/common.sh@41 -- # kill -0 1330302 00:06:16.474 08:19:28 json_config -- json_config/common.sh@45 -- # sleep 0.5 00:06:17.043 08:19:29 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:06:17.043 08:19:29 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:17.043 08:19:29 json_config -- json_config/common.sh@41 -- # kill -0 1330302 00:06:17.043 08:19:29 json_config -- json_config/common.sh@45 -- # sleep 0.5 00:06:17.302 08:19:29 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:06:17.302 08:19:29 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:17.302 08:19:29 json_config -- json_config/common.sh@41 -- # kill -0 1330302 00:06:17.302 08:19:29 json_config -- json_config/common.sh@45 -- # sleep 0.5 00:06:17.871 08:19:30 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:06:17.871 08:19:30 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:17.871 08:19:30 json_config -- json_config/common.sh@41 -- # kill -0 1330302 00:06:17.871 08:19:30 json_config -- json_config/common.sh@45 -- # sleep 0.5 00:06:18.439 08:19:30 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:06:18.439 08:19:30 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:18.440 08:19:30 json_config -- json_config/common.sh@41 -- # kill -0 1330302 00:06:18.440 08:19:30 json_config -- json_config/common.sh@45 -- # 
sleep 0.5 00:06:19.008 08:19:31 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:06:19.008 08:19:31 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:19.008 08:19:31 json_config -- json_config/common.sh@41 -- # kill -0 1330302 00:06:19.008 08:19:31 json_config -- json_config/common.sh@42 -- # app_pid["$app"]= 00:06:19.008 08:19:31 json_config -- json_config/common.sh@43 -- # break 00:06:19.008 08:19:31 json_config -- json_config/common.sh@48 -- # [[ -n '' ]] 00:06:19.008 08:19:31 json_config -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:06:19.008 SPDK target shutdown done 00:06:19.008 08:19:31 json_config -- json_config/json_config.sh@375 -- # echo 'INFO: relaunching applications...' 00:06:19.008 INFO: relaunching applications... 00:06:19.008 08:19:31 json_config -- json_config/json_config.sh@376 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:19.008 08:19:31 json_config -- json_config/common.sh@9 -- # local app=target 00:06:19.008 08:19:31 json_config -- json_config/common.sh@10 -- # shift 00:06:19.008 08:19:31 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:19.008 08:19:31 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:19.008 08:19:31 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:06:19.008 08:19:31 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:19.008 08:19:31 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:19.008 08:19:31 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=1335271 00:06:19.008 08:19:31 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:19.008 Waiting for target to run... 
00:06:19.008 08:19:31 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:19.008 08:19:31 json_config -- json_config/common.sh@25 -- # waitforlisten 1335271 /var/tmp/spdk_tgt.sock 00:06:19.008 08:19:31 json_config -- common/autotest_common.sh@829 -- # '[' -z 1335271 ']' 00:06:19.008 08:19:31 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:19.008 08:19:31 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:19.008 08:19:31 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:19.008 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:19.008 08:19:31 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:19.008 08:19:31 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:19.008 [2024-07-23 08:19:31.388828] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:06:19.008 [2024-07-23 08:19:31.388924] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1335271 ] 00:06:19.268 [2024-07-23 08:19:31.777955] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:19.525 [2024-07-23 08:19:31.973963] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:19.525 [2024-07-23 08:19:32.027759] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:06:19.525 [2024-07-23 08:19:32.035797] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:06:19.525 [2024-07-23 08:19:32.043813] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:06:20.093 [2024-07-23 08:19:32.305596] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:06:23.381 [2024-07-23 08:19:35.346920] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:23.381 [2024-07-23 08:19:35.346976] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:06:23.381 [2024-07-23 08:19:35.346995] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:06:23.381 [2024-07-23 08:19:35.354937] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:06:23.381 [2024-07-23 08:19:35.354974] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:06:23.381 [2024-07-23 08:19:35.362951] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:23.381 [2024-07-23 08:19:35.362982] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 
00:06:23.381 [2024-07-23 08:19:35.370988] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "CryptoMallocBdev_AES_CBC" 00:06:23.381 [2024-07-23 08:19:35.371023] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: MallocForCryptoBdev 00:06:23.381 [2024-07-23 08:19:35.371035] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:06:25.915 [2024-07-23 08:19:38.278861] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:25.915 [2024-07-23 08:19:38.278917] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:25.915 [2024-07-23 08:19:38.278934] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000037b80 00:06:25.915 [2024-07-23 08:19:38.278943] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:25.915 [2024-07-23 08:19:38.279314] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:25.915 [2024-07-23 08:19:38.279331] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:06:26.483 08:19:38 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:26.483 08:19:38 json_config -- common/autotest_common.sh@862 -- # return 0 00:06:26.483 08:19:38 json_config -- json_config/common.sh@26 -- # echo '' 00:06:26.483 00:06:26.483 08:19:38 json_config -- json_config/json_config.sh@377 -- # [[ 0 -eq 1 ]] 00:06:26.483 08:19:38 json_config -- json_config/json_config.sh@381 -- # echo 'INFO: Checking if target configuration is the same...' 00:06:26.483 INFO: Checking if target configuration is the same... 
00:06:26.483 08:19:38 json_config -- json_config/json_config.sh@382 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:26.483 08:19:38 json_config -- json_config/json_config.sh@382 -- # tgt_rpc save_config 00:06:26.483 08:19:38 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:26.483 + '[' 2 -ne 2 ']' 00:06:26.483 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:06:26.483 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 00:06:26.483 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:06:26.483 +++ basename /dev/fd/62 00:06:26.483 ++ mktemp /tmp/62.XXX 00:06:26.483 + tmp_file_1=/tmp/62.Tmh 00:06:26.483 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:26.483 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:06:26.483 + tmp_file_2=/tmp/spdk_tgt_config.json.hdn 00:06:26.483 + ret=0 00:06:26.483 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:26.777 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:26.777 + diff -u /tmp/62.Tmh /tmp/spdk_tgt_config.json.hdn 00:06:26.777 + echo 'INFO: JSON config files are the same' 00:06:26.777 INFO: JSON config files are the same 00:06:26.777 + rm /tmp/62.Tmh /tmp/spdk_tgt_config.json.hdn 00:06:26.777 + exit 0 00:06:26.777 08:19:39 json_config -- json_config/json_config.sh@383 -- # [[ 0 -eq 1 ]] 00:06:26.777 08:19:39 json_config -- json_config/json_config.sh@388 -- # echo 'INFO: changing configuration and checking if this can be detected...' 00:06:26.777 INFO: changing configuration and checking if this can be detected... 
00:06:26.777 08:19:39 json_config -- json_config/json_config.sh@390 -- # tgt_rpc bdev_malloc_delete MallocBdevForConfigChangeCheck 00:06:26.777 08:19:39 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_delete MallocBdevForConfigChangeCheck 00:06:27.040 08:19:39 json_config -- json_config/json_config.sh@391 -- # tgt_rpc save_config 00:06:27.040 08:19:39 json_config -- json_config/json_config.sh@391 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:27.040 08:19:39 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:27.040 + '[' 2 -ne 2 ']' 00:06:27.040 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:06:27.040 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 
00:06:27.040 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:06:27.040 +++ basename /dev/fd/62 00:06:27.040 ++ mktemp /tmp/62.XXX 00:06:27.040 + tmp_file_1=/tmp/62.fwu 00:06:27.040 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:27.040 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:06:27.040 + tmp_file_2=/tmp/spdk_tgt_config.json.BBi 00:06:27.040 + ret=0 00:06:27.040 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:27.299 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:27.299 + diff -u /tmp/62.fwu /tmp/spdk_tgt_config.json.BBi 00:06:27.299 + ret=1 00:06:27.299 + echo '=== Start of file: /tmp/62.fwu ===' 00:06:27.299 + cat /tmp/62.fwu 00:06:27.299 + echo '=== End of file: /tmp/62.fwu ===' 00:06:27.299 + echo '' 00:06:27.299 + echo '=== Start of file: /tmp/spdk_tgt_config.json.BBi ===' 00:06:27.299 + cat /tmp/spdk_tgt_config.json.BBi 00:06:27.299 + echo '=== End of file: /tmp/spdk_tgt_config.json.BBi ===' 00:06:27.299 + echo '' 00:06:27.299 + rm /tmp/62.fwu /tmp/spdk_tgt_config.json.BBi 00:06:27.299 + exit 1 00:06:27.299 08:19:39 json_config -- json_config/json_config.sh@395 -- # echo 'INFO: configuration change detected.' 00:06:27.299 INFO: configuration change detected. 
00:06:27.299 08:19:39 json_config -- json_config/json_config.sh@398 -- # json_config_test_fini 00:06:27.299 08:19:39 json_config -- json_config/json_config.sh@310 -- # timing_enter json_config_test_fini 00:06:27.299 08:19:39 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:27.299 08:19:39 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:27.299 08:19:39 json_config -- json_config/json_config.sh@311 -- # local ret=0 00:06:27.299 08:19:39 json_config -- json_config/json_config.sh@313 -- # [[ -n '' ]] 00:06:27.299 08:19:39 json_config -- json_config/json_config.sh@321 -- # [[ -n 1335271 ]] 00:06:27.299 08:19:39 json_config -- json_config/json_config.sh@324 -- # cleanup_bdev_subsystem_config 00:06:27.299 08:19:39 json_config -- json_config/json_config.sh@188 -- # timing_enter cleanup_bdev_subsystem_config 00:06:27.299 08:19:39 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:27.299 08:19:39 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:27.299 08:19:39 json_config -- json_config/json_config.sh@190 -- # [[ 1 -eq 1 ]] 00:06:27.299 08:19:39 json_config -- json_config/json_config.sh@191 -- # tgt_rpc bdev_lvol_delete lvs_test/clone0 00:06:27.299 08:19:39 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/clone0 00:06:27.558 08:19:39 json_config -- json_config/json_config.sh@192 -- # tgt_rpc bdev_lvol_delete lvs_test/lvol0 00:06:27.558 08:19:39 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/lvol0 00:06:27.817 08:19:40 json_config -- json_config/json_config.sh@193 -- # tgt_rpc bdev_lvol_delete lvs_test/snapshot0 00:06:27.817 08:19:40 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete 
lvs_test/snapshot0 00:06:27.817 08:19:40 json_config -- json_config/json_config.sh@194 -- # tgt_rpc bdev_lvol_delete_lvstore -l lvs_test 00:06:27.817 08:19:40 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete_lvstore -l lvs_test 00:06:28.076 08:19:40 json_config -- json_config/json_config.sh@197 -- # uname -s 00:06:28.076 08:19:40 json_config -- json_config/json_config.sh@197 -- # [[ Linux = Linux ]] 00:06:28.076 08:19:40 json_config -- json_config/json_config.sh@198 -- # rm -f /sample_aio 00:06:28.076 08:19:40 json_config -- json_config/json_config.sh@201 -- # [[ 0 -eq 1 ]] 00:06:28.076 08:19:40 json_config -- json_config/json_config.sh@205 -- # timing_exit cleanup_bdev_subsystem_config 00:06:28.076 08:19:40 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:28.076 08:19:40 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:28.076 08:19:40 json_config -- json_config/json_config.sh@327 -- # killprocess 1335271 00:06:28.076 08:19:40 json_config -- common/autotest_common.sh@948 -- # '[' -z 1335271 ']' 00:06:28.076 08:19:40 json_config -- common/autotest_common.sh@952 -- # kill -0 1335271 00:06:28.076 08:19:40 json_config -- common/autotest_common.sh@953 -- # uname 00:06:28.076 08:19:40 json_config -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:28.076 08:19:40 json_config -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1335271 00:06:28.076 08:19:40 json_config -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:28.076 08:19:40 json_config -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:28.076 08:19:40 json_config -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1335271' 00:06:28.076 killing process with pid 1335271 00:06:28.076 08:19:40 json_config -- common/autotest_common.sh@967 -- # kill 1335271 00:06:28.076 08:19:40 json_config -- 
common/autotest_common.sh@972 -- # wait 1335271 00:06:34.645 08:19:46 json_config -- json_config/json_config.sh@330 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:34.645 08:19:46 json_config -- json_config/json_config.sh@331 -- # timing_exit json_config_test_fini 00:06:34.645 08:19:46 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:34.645 08:19:46 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:34.645 08:19:46 json_config -- json_config/json_config.sh@332 -- # return 0 00:06:34.645 08:19:46 json_config -- json_config/json_config.sh@400 -- # echo 'INFO: Success' 00:06:34.645 INFO: Success 00:06:34.645 00:06:34.645 real 0m41.321s 00:06:34.645 user 0m43.834s 00:06:34.645 sys 0m3.045s 00:06:34.645 08:19:46 json_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:34.645 08:19:46 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:34.645 ************************************ 00:06:34.645 END TEST json_config 00:06:34.645 ************************************ 00:06:34.645 08:19:46 -- common/autotest_common.sh@1142 -- # return 0 00:06:34.645 08:19:46 -- spdk/autotest.sh@173 -- # run_test json_config_extra_key /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:34.645 08:19:46 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:34.645 08:19:46 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:34.645 08:19:46 -- common/autotest_common.sh@10 -- # set +x 00:06:34.645 ************************************ 00:06:34.645 START TEST json_config_extra_key 00:06:34.645 ************************************ 00:06:34.645 08:19:46 json_config_extra_key -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:34.645 08:19:46 json_config_extra_key -- 
json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:06:34.645 08:19:46 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:06:34.645 08:19:46 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:34.645 08:19:46 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:34.645 08:19:46 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:34.645 08:19:46 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:34.645 08:19:46 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:34.645 08:19:46 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:34.645 08:19:46 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:34.645 08:19:46 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:34.645 08:19:46 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:34.645 08:19:46 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:34.645 08:19:46 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:800e967b-538f-e911-906e-001635649f5c 00:06:34.645 08:19:46 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=800e967b-538f-e911-906e-001635649f5c 00:06:34.645 08:19:46 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:34.645 08:19:46 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:34.645 08:19:46 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:34.645 08:19:46 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:34.645 08:19:46 json_config_extra_key -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:06:34.645 08:19:46 
json_config_extra_key -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:34.645 08:19:46 json_config_extra_key -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:34.645 08:19:46 json_config_extra_key -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:34.645 08:19:46 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:34.645 08:19:46 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:34.645 08:19:46 json_config_extra_key -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:34.645 08:19:46 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:06:34.645 08:19:46 json_config_extra_key -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:34.645 08:19:46 json_config_extra_key -- nvmf/common.sh@47 -- # : 0 00:06:34.645 08:19:46 json_config_extra_key -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:06:34.645 08:19:46 json_config_extra_key -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:06:34.645 08:19:46 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:34.645 08:19:46 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:34.645 08:19:46 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:34.645 08:19:46 json_config_extra_key -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:06:34.645 08:19:46 json_config_extra_key -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:06:34.645 08:19:46 json_config_extra_key -- nvmf/common.sh@51 -- # have_pci_nics=0 00:06:34.645 08:19:46 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:06:34.645 08:19:46 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:06:34.645 08:19:46 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:06:34.645 08:19:46 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:06:34.645 08:19:46 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:06:34.645 08:19:46 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 
1024') 00:06:34.645 08:19:46 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:06:34.645 08:19:46 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json') 00:06:34.645 08:19:46 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:06:34.645 08:19:46 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:06:34.646 08:19:46 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:06:34.646 INFO: launching applications... 00:06:34.646 08:19:46 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:06:34.646 08:19:46 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:06:34.646 08:19:46 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:06:34.646 08:19:46 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:34.646 08:19:46 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:34.646 08:19:46 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:06:34.646 08:19:46 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:34.646 08:19:46 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:34.646 08:19:46 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=1338373 00:06:34.646 08:19:46 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:34.646 Waiting for target to run... 
00:06:34.646 08:19:46 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 1338373 /var/tmp/spdk_tgt.sock 00:06:34.646 08:19:46 json_config_extra_key -- common/autotest_common.sh@829 -- # '[' -z 1338373 ']' 00:06:34.646 08:19:46 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:06:34.646 08:19:46 json_config_extra_key -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:34.646 08:19:46 json_config_extra_key -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:34.646 08:19:46 json_config_extra_key -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:34.646 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:34.646 08:19:46 json_config_extra_key -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:34.646 08:19:46 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:06:34.646 [2024-07-23 08:19:46.638472] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:06:34.646 [2024-07-23 08:19:46.638569] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1338373 ] 00:06:34.646 [2024-07-23 08:19:47.022662] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:34.905 [2024-07-23 08:19:47.217626] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:35.471 08:19:47 json_config_extra_key -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:35.472 08:19:47 json_config_extra_key -- common/autotest_common.sh@862 -- # return 0 00:06:35.472 08:19:47 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:06:35.472 00:06:35.472 08:19:47 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:06:35.472 INFO: shutting down applications... 00:06:35.472 08:19:47 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:06:35.472 08:19:47 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:06:35.731 08:19:47 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:06:35.731 08:19:47 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 1338373 ]] 00:06:35.731 08:19:47 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 1338373 00:06:35.731 08:19:47 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:06:35.731 08:19:47 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:35.731 08:19:47 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 1338373 00:06:35.731 08:19:47 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:06:35.990 08:19:48 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:06:35.990 08:19:48 json_config_extra_key -- 
json_config/common.sh@40 -- # (( i < 30 )) 00:06:35.990 08:19:48 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 1338373 00:06:35.990 08:19:48 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:06:36.559 08:19:48 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:06:36.559 08:19:48 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:36.559 08:19:48 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 1338373 00:06:36.559 08:19:49 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:06:36.559 08:19:49 json_config_extra_key -- json_config/common.sh@43 -- # break 00:06:36.559 08:19:49 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:06:36.559 08:19:49 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:06:36.559 SPDK target shutdown done 00:06:36.559 08:19:49 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:06:36.559 Success 00:06:36.559 00:06:36.559 real 0m2.546s 00:06:36.559 user 0m2.229s 00:06:36.559 sys 0m0.534s 00:06:36.559 08:19:49 json_config_extra_key -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:36.559 08:19:49 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:06:36.559 ************************************ 00:06:36.559 END TEST json_config_extra_key 00:06:36.559 ************************************ 00:06:36.559 08:19:49 -- common/autotest_common.sh@1142 -- # return 0 00:06:36.559 08:19:49 -- spdk/autotest.sh@174 -- # run_test alias_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:36.559 08:19:49 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:36.559 08:19:49 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:36.559 08:19:49 -- common/autotest_common.sh@10 -- # set +x 00:06:36.559 ************************************ 00:06:36.559 START TEST alias_rpc 
00:06:36.559 ************************************ 00:06:36.559 08:19:49 alias_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:36.818 * Looking for test storage... 00:06:36.818 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc 00:06:36.818 08:19:49 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:36.818 08:19:49 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=1338926 00:06:36.818 08:19:49 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 1338926 00:06:36.818 08:19:49 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:36.818 08:19:49 alias_rpc -- common/autotest_common.sh@829 -- # '[' -z 1338926 ']' 00:06:36.818 08:19:49 alias_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:36.818 08:19:49 alias_rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:36.818 08:19:49 alias_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:36.818 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:36.818 08:19:49 alias_rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:36.818 08:19:49 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:36.818 [2024-07-23 08:19:49.229741] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:06:36.818 [2024-07-23 08:19:49.229834] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1338926 ] 00:06:37.077 [2024-07-23 08:19:49.354274] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:37.077 [2024-07-23 08:19:49.563413] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:38.456 08:19:50 alias_rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:38.456 08:19:50 alias_rpc -- common/autotest_common.sh@862 -- # return 0 00:06:38.456 08:19:50 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_config -i 00:06:38.456 08:19:50 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 1338926 00:06:38.456 08:19:50 alias_rpc -- common/autotest_common.sh@948 -- # '[' -z 1338926 ']' 00:06:38.456 08:19:50 alias_rpc -- common/autotest_common.sh@952 -- # kill -0 1338926 00:06:38.456 08:19:50 alias_rpc -- common/autotest_common.sh@953 -- # uname 00:06:38.456 08:19:50 alias_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:38.456 08:19:50 alias_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1338926 00:06:38.456 08:19:50 alias_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:38.456 08:19:50 alias_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:38.456 08:19:50 alias_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1338926' 00:06:38.456 killing process with pid 1338926 00:06:38.456 08:19:50 alias_rpc -- common/autotest_common.sh@967 -- # kill 1338926 00:06:38.456 08:19:50 alias_rpc -- common/autotest_common.sh@972 -- # wait 1338926 00:06:40.990 00:06:40.990 real 0m4.194s 00:06:40.990 user 0m4.146s 00:06:40.990 sys 0m0.529s 00:06:40.990 08:19:53 alias_rpc -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:06:40.990 08:19:53 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:40.990 ************************************ 00:06:40.990 END TEST alias_rpc 00:06:40.990 ************************************ 00:06:40.990 08:19:53 -- common/autotest_common.sh@1142 -- # return 0 00:06:40.990 08:19:53 -- spdk/autotest.sh@176 -- # [[ 0 -eq 0 ]] 00:06:40.990 08:19:53 -- spdk/autotest.sh@177 -- # run_test spdkcli_tcp /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:40.990 08:19:53 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:40.990 08:19:53 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:40.990 08:19:53 -- common/autotest_common.sh@10 -- # set +x 00:06:40.990 ************************************ 00:06:40.990 START TEST spdkcli_tcp 00:06:40.990 ************************************ 00:06:40.990 08:19:53 spdkcli_tcp -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:40.990 * Looking for test storage... 
00:06:40.990 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli 00:06:40.990 08:19:53 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/common.sh 00:06:40.990 08:19:53 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:06:40.990 08:19:53 spdkcli_tcp -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py 00:06:40.990 08:19:53 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:06:40.990 08:19:53 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:06:40.990 08:19:53 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:06:40.990 08:19:53 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:06:40.990 08:19:53 spdkcli_tcp -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:40.990 08:19:53 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:40.990 08:19:53 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=1339735 00:06:40.990 08:19:53 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 1339735 00:06:40.990 08:19:53 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:06:40.990 08:19:53 spdkcli_tcp -- common/autotest_common.sh@829 -- # '[' -z 1339735 ']' 00:06:40.990 08:19:53 spdkcli_tcp -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:40.990 08:19:53 spdkcli_tcp -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:40.990 08:19:53 spdkcli_tcp -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:40.990 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:06:40.990 08:19:53 spdkcli_tcp -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:40.990 08:19:53 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:41.248 [2024-07-23 08:19:53.519239] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:06:41.249 [2024-07-23 08:19:53.519358] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1339735 ] 00:06:41.249 [2024-07-23 08:19:53.644043] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:41.507 [2024-07-23 08:19:53.869244] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:41.507 [2024-07-23 08:19:53.869254] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:42.449 08:19:54 spdkcli_tcp -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:42.449 08:19:54 spdkcli_tcp -- common/autotest_common.sh@862 -- # return 0 00:06:42.449 08:19:54 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=1339991 00:06:42.449 08:19:54 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:06:42.449 08:19:54 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:06:42.709 [ 00:06:42.709 "bdev_malloc_delete", 00:06:42.709 "bdev_malloc_create", 00:06:42.709 "bdev_null_resize", 00:06:42.709 "bdev_null_delete", 00:06:42.709 "bdev_null_create", 00:06:42.709 "bdev_nvme_cuse_unregister", 00:06:42.709 "bdev_nvme_cuse_register", 00:06:42.709 "bdev_opal_new_user", 00:06:42.709 "bdev_opal_set_lock_state", 00:06:42.709 "bdev_opal_delete", 00:06:42.709 "bdev_opal_get_info", 00:06:42.709 "bdev_opal_create", 00:06:42.709 "bdev_nvme_opal_revert", 00:06:42.709 "bdev_nvme_opal_init", 00:06:42.709 "bdev_nvme_send_cmd", 00:06:42.709 
"bdev_nvme_get_path_iostat", 00:06:42.709 "bdev_nvme_get_mdns_discovery_info", 00:06:42.709 "bdev_nvme_stop_mdns_discovery", 00:06:42.709 "bdev_nvme_start_mdns_discovery", 00:06:42.709 "bdev_nvme_set_multipath_policy", 00:06:42.709 "bdev_nvme_set_preferred_path", 00:06:42.709 "bdev_nvme_get_io_paths", 00:06:42.709 "bdev_nvme_remove_error_injection", 00:06:42.709 "bdev_nvme_add_error_injection", 00:06:42.709 "bdev_nvme_get_discovery_info", 00:06:42.709 "bdev_nvme_stop_discovery", 00:06:42.709 "bdev_nvme_start_discovery", 00:06:42.709 "bdev_nvme_get_controller_health_info", 00:06:42.709 "bdev_nvme_disable_controller", 00:06:42.709 "bdev_nvme_enable_controller", 00:06:42.709 "bdev_nvme_reset_controller", 00:06:42.709 "bdev_nvme_get_transport_statistics", 00:06:42.709 "bdev_nvme_apply_firmware", 00:06:42.709 "bdev_nvme_detach_controller", 00:06:42.709 "bdev_nvme_get_controllers", 00:06:42.709 "bdev_nvme_attach_controller", 00:06:42.709 "bdev_nvme_set_hotplug", 00:06:42.709 "bdev_nvme_set_options", 00:06:42.709 "bdev_passthru_delete", 00:06:42.709 "bdev_passthru_create", 00:06:42.709 "bdev_lvol_set_parent_bdev", 00:06:42.709 "bdev_lvol_set_parent", 00:06:42.709 "bdev_lvol_check_shallow_copy", 00:06:42.709 "bdev_lvol_start_shallow_copy", 00:06:42.709 "bdev_lvol_grow_lvstore", 00:06:42.709 "bdev_lvol_get_lvols", 00:06:42.709 "bdev_lvol_get_lvstores", 00:06:42.709 "bdev_lvol_delete", 00:06:42.709 "bdev_lvol_set_read_only", 00:06:42.709 "bdev_lvol_resize", 00:06:42.709 "bdev_lvol_decouple_parent", 00:06:42.709 "bdev_lvol_inflate", 00:06:42.709 "bdev_lvol_rename", 00:06:42.709 "bdev_lvol_clone_bdev", 00:06:42.709 "bdev_lvol_clone", 00:06:42.709 "bdev_lvol_snapshot", 00:06:42.709 "bdev_lvol_create", 00:06:42.709 "bdev_lvol_delete_lvstore", 00:06:42.709 "bdev_lvol_rename_lvstore", 00:06:42.709 "bdev_lvol_create_lvstore", 00:06:42.709 "bdev_raid_set_options", 00:06:42.709 "bdev_raid_remove_base_bdev", 00:06:42.709 "bdev_raid_add_base_bdev", 00:06:42.709 "bdev_raid_delete", 
00:06:42.709 "bdev_raid_create", 00:06:42.709 "bdev_raid_get_bdevs", 00:06:42.709 "bdev_error_inject_error", 00:06:42.709 "bdev_error_delete", 00:06:42.709 "bdev_error_create", 00:06:42.709 "bdev_split_delete", 00:06:42.709 "bdev_split_create", 00:06:42.709 "bdev_delay_delete", 00:06:42.709 "bdev_delay_create", 00:06:42.709 "bdev_delay_update_latency", 00:06:42.709 "bdev_zone_block_delete", 00:06:42.709 "bdev_zone_block_create", 00:06:42.709 "blobfs_create", 00:06:42.709 "blobfs_detect", 00:06:42.709 "blobfs_set_cache_size", 00:06:42.709 "bdev_crypto_delete", 00:06:42.709 "bdev_crypto_create", 00:06:42.709 "bdev_compress_delete", 00:06:42.709 "bdev_compress_create", 00:06:42.709 "bdev_compress_get_orphans", 00:06:42.709 "bdev_aio_delete", 00:06:42.709 "bdev_aio_rescan", 00:06:42.709 "bdev_aio_create", 00:06:42.709 "bdev_ftl_set_property", 00:06:42.709 "bdev_ftl_get_properties", 00:06:42.709 "bdev_ftl_get_stats", 00:06:42.709 "bdev_ftl_unmap", 00:06:42.709 "bdev_ftl_unload", 00:06:42.709 "bdev_ftl_delete", 00:06:42.709 "bdev_ftl_load", 00:06:42.709 "bdev_ftl_create", 00:06:42.709 "bdev_virtio_attach_controller", 00:06:42.709 "bdev_virtio_scsi_get_devices", 00:06:42.709 "bdev_virtio_detach_controller", 00:06:42.709 "bdev_virtio_blk_set_hotplug", 00:06:42.709 "bdev_iscsi_delete", 00:06:42.709 "bdev_iscsi_create", 00:06:42.709 "bdev_iscsi_set_options", 00:06:42.709 "accel_error_inject_error", 00:06:42.709 "ioat_scan_accel_module", 00:06:42.709 "dsa_scan_accel_module", 00:06:42.709 "iaa_scan_accel_module", 00:06:42.709 "dpdk_cryptodev_get_driver", 00:06:42.709 "dpdk_cryptodev_set_driver", 00:06:42.709 "dpdk_cryptodev_scan_accel_module", 00:06:42.709 "compressdev_scan_accel_module", 00:06:42.709 "keyring_file_remove_key", 00:06:42.709 "keyring_file_add_key", 00:06:42.709 "keyring_linux_set_options", 00:06:42.709 "iscsi_get_histogram", 00:06:42.709 "iscsi_enable_histogram", 00:06:42.709 "iscsi_set_options", 00:06:42.709 "iscsi_get_auth_groups", 00:06:42.709 
"iscsi_auth_group_remove_secret", 00:06:42.709 "iscsi_auth_group_add_secret", 00:06:42.709 "iscsi_delete_auth_group", 00:06:42.709 "iscsi_create_auth_group", 00:06:42.709 "iscsi_set_discovery_auth", 00:06:42.709 "iscsi_get_options", 00:06:42.709 "iscsi_target_node_request_logout", 00:06:42.709 "iscsi_target_node_set_redirect", 00:06:42.709 "iscsi_target_node_set_auth", 00:06:42.709 "iscsi_target_node_add_lun", 00:06:42.709 "iscsi_get_stats", 00:06:42.709 "iscsi_get_connections", 00:06:42.709 "iscsi_portal_group_set_auth", 00:06:42.709 "iscsi_start_portal_group", 00:06:42.709 "iscsi_delete_portal_group", 00:06:42.709 "iscsi_create_portal_group", 00:06:42.709 "iscsi_get_portal_groups", 00:06:42.709 "iscsi_delete_target_node", 00:06:42.709 "iscsi_target_node_remove_pg_ig_maps", 00:06:42.709 "iscsi_target_node_add_pg_ig_maps", 00:06:42.709 "iscsi_create_target_node", 00:06:42.709 "iscsi_get_target_nodes", 00:06:42.709 "iscsi_delete_initiator_group", 00:06:42.709 "iscsi_initiator_group_remove_initiators", 00:06:42.709 "iscsi_initiator_group_add_initiators", 00:06:42.709 "iscsi_create_initiator_group", 00:06:42.709 "iscsi_get_initiator_groups", 00:06:42.709 "nvmf_set_crdt", 00:06:42.709 "nvmf_set_config", 00:06:42.709 "nvmf_set_max_subsystems", 00:06:42.709 "nvmf_stop_mdns_prr", 00:06:42.709 "nvmf_publish_mdns_prr", 00:06:42.709 "nvmf_subsystem_get_listeners", 00:06:42.709 "nvmf_subsystem_get_qpairs", 00:06:42.709 "nvmf_subsystem_get_controllers", 00:06:42.709 "nvmf_get_stats", 00:06:42.709 "nvmf_get_transports", 00:06:42.709 "nvmf_create_transport", 00:06:42.709 "nvmf_get_targets", 00:06:42.709 "nvmf_delete_target", 00:06:42.709 "nvmf_create_target", 00:06:42.709 "nvmf_subsystem_allow_any_host", 00:06:42.709 "nvmf_subsystem_remove_host", 00:06:42.709 "nvmf_subsystem_add_host", 00:06:42.709 "nvmf_ns_remove_host", 00:06:42.709 "nvmf_ns_add_host", 00:06:42.709 "nvmf_subsystem_remove_ns", 00:06:42.709 "nvmf_subsystem_add_ns", 00:06:42.709 
"nvmf_subsystem_listener_set_ana_state", 00:06:42.709 "nvmf_discovery_get_referrals", 00:06:42.709 "nvmf_discovery_remove_referral", 00:06:42.709 "nvmf_discovery_add_referral", 00:06:42.709 "nvmf_subsystem_remove_listener", 00:06:42.709 "nvmf_subsystem_add_listener", 00:06:42.709 "nvmf_delete_subsystem", 00:06:42.709 "nvmf_create_subsystem", 00:06:42.709 "nvmf_get_subsystems", 00:06:42.709 "env_dpdk_get_mem_stats", 00:06:42.710 "nbd_get_disks", 00:06:42.710 "nbd_stop_disk", 00:06:42.710 "nbd_start_disk", 00:06:42.710 "ublk_recover_disk", 00:06:42.710 "ublk_get_disks", 00:06:42.710 "ublk_stop_disk", 00:06:42.710 "ublk_start_disk", 00:06:42.710 "ublk_destroy_target", 00:06:42.710 "ublk_create_target", 00:06:42.710 "virtio_blk_create_transport", 00:06:42.710 "virtio_blk_get_transports", 00:06:42.710 "vhost_controller_set_coalescing", 00:06:42.710 "vhost_get_controllers", 00:06:42.710 "vhost_delete_controller", 00:06:42.710 "vhost_create_blk_controller", 00:06:42.710 "vhost_scsi_controller_remove_target", 00:06:42.710 "vhost_scsi_controller_add_target", 00:06:42.710 "vhost_start_scsi_controller", 00:06:42.710 "vhost_create_scsi_controller", 00:06:42.710 "thread_set_cpumask", 00:06:42.710 "framework_get_governor", 00:06:42.710 "framework_get_scheduler", 00:06:42.710 "framework_set_scheduler", 00:06:42.710 "framework_get_reactors", 00:06:42.710 "thread_get_io_channels", 00:06:42.710 "thread_get_pollers", 00:06:42.710 "thread_get_stats", 00:06:42.710 "framework_monitor_context_switch", 00:06:42.710 "spdk_kill_instance", 00:06:42.710 "log_enable_timestamps", 00:06:42.710 "log_get_flags", 00:06:42.710 "log_clear_flag", 00:06:42.710 "log_set_flag", 00:06:42.710 "log_get_level", 00:06:42.710 "log_set_level", 00:06:42.710 "log_get_print_level", 00:06:42.710 "log_set_print_level", 00:06:42.710 "framework_enable_cpumask_locks", 00:06:42.710 "framework_disable_cpumask_locks", 00:06:42.710 "framework_wait_init", 00:06:42.710 "framework_start_init", 00:06:42.710 "scsi_get_devices", 
00:06:42.710 "bdev_get_histogram",
00:06:42.710 "bdev_enable_histogram",
00:06:42.710 "bdev_set_qos_limit",
00:06:42.710 "bdev_set_qd_sampling_period",
00:06:42.710 "bdev_get_bdevs",
00:06:42.710 "bdev_reset_iostat",
00:06:42.710 "bdev_get_iostat",
00:06:42.710 "bdev_examine",
00:06:42.710 "bdev_wait_for_examine",
00:06:42.710 "bdev_set_options",
00:06:42.710 "notify_get_notifications",
00:06:42.710 "notify_get_types",
00:06:42.710 "accel_get_stats",
00:06:42.710 "accel_set_options",
00:06:42.710 "accel_set_driver",
00:06:42.710 "accel_crypto_key_destroy",
00:06:42.710 "accel_crypto_keys_get",
00:06:42.710 "accel_crypto_key_create",
00:06:42.710 "accel_assign_opc",
00:06:42.710 "accel_get_module_info",
00:06:42.710 "accel_get_opc_assignments",
00:06:42.710 "vmd_rescan",
00:06:42.710 "vmd_remove_device",
00:06:42.710 "vmd_enable",
00:06:42.710 "sock_get_default_impl",
00:06:42.710 "sock_set_default_impl",
00:06:42.710 "sock_impl_set_options",
00:06:42.710 "sock_impl_get_options",
00:06:42.710 "iobuf_get_stats",
00:06:42.710 "iobuf_set_options",
00:06:42.710 "framework_get_pci_devices",
00:06:42.710 "framework_get_config",
00:06:42.710 "framework_get_subsystems",
00:06:42.710 "trace_get_info",
00:06:42.710 "trace_get_tpoint_group_mask",
00:06:42.710 "trace_disable_tpoint_group",
00:06:42.710 "trace_enable_tpoint_group",
00:06:42.710 "trace_clear_tpoint_mask",
00:06:42.710 "trace_set_tpoint_mask",
00:06:42.710 "keyring_get_keys",
00:06:42.710 "spdk_get_version",
00:06:42.710 "rpc_get_methods"
00:06:42.710 ]
00:06:42.710 08:19:55 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp
00:06:42.710 08:19:55 spdkcli_tcp -- common/autotest_common.sh@728 -- # xtrace_disable
00:06:42.710 08:19:55 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x
00:06:42.710 08:19:55 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT
00:06:42.710 08:19:55 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 1339735
00:06:42.710 08:19:55 spdkcli_tcp -- common/autotest_common.sh@948 -- # '[' -z 1339735 ']'
00:06:42.710 08:19:55 spdkcli_tcp -- common/autotest_common.sh@952 -- # kill -0 1339735
00:06:42.710 08:19:55 spdkcli_tcp -- common/autotest_common.sh@953 -- # uname
00:06:42.710 08:19:55 spdkcli_tcp -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:06:42.710 08:19:55 spdkcli_tcp -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1339735
00:06:42.710 08:19:55 spdkcli_tcp -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:06:42.710 08:19:55 spdkcli_tcp -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:06:42.710 08:19:55 spdkcli_tcp -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1339735'
00:06:42.710 killing process with pid 1339735
00:06:42.710 08:19:55 spdkcli_tcp -- common/autotest_common.sh@967 -- # kill 1339735
00:06:42.710 08:19:55 spdkcli_tcp -- common/autotest_common.sh@972 -- # wait 1339735
00:06:45.247 
00:06:45.247 real	0m4.252s
00:06:45.247 user	0m7.485s
00:06:45.247 sys	0m0.588s
00:06:45.247 08:19:57 spdkcli_tcp -- common/autotest_common.sh@1124 -- # xtrace_disable
00:06:45.247 08:19:57 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x
00:06:45.247 ************************************
00:06:45.247 END TEST spdkcli_tcp
00:06:45.247 ************************************
00:06:45.247 08:19:57 -- common/autotest_common.sh@1142 -- # return 0
00:06:45.247 08:19:57 -- spdk/autotest.sh@180 -- # run_test dpdk_mem_utility /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh
00:06:45.247 08:19:57 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:06:45.247 08:19:57 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:45.247 08:19:57 -- common/autotest_common.sh@10 -- # set +x
00:06:45.247 ************************************
00:06:45.247 START TEST dpdk_mem_utility
00:06:45.247 ************************************
00:06:45.247 08:19:57 dpdk_mem_utility -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh
00:06:45.247 * Looking for test storage...
00:06:45.247 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility
00:06:45.247 08:19:57 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py
00:06:45.247 08:19:57 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=1340567
00:06:45.247 08:19:57 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 1340567
00:06:45.247 08:19:57 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt
00:06:45.247 08:19:57 dpdk_mem_utility -- common/autotest_common.sh@829 -- # '[' -z 1340567 ']'
00:06:45.247 08:19:57 dpdk_mem_utility -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:06:45.247 08:19:57 dpdk_mem_utility -- common/autotest_common.sh@834 -- # local max_retries=100
00:06:45.247 08:19:57 dpdk_mem_utility -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:06:45.247 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:06:45.247 08:19:57 dpdk_mem_utility -- common/autotest_common.sh@838 -- # xtrace_disable
00:06:45.247 08:19:57 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x
00:06:45.507 [2024-07-23 08:19:57.813753] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization...
00:06:45.507 [2024-07-23 08:19:57.813846] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1340567 ] 00:06:45.507 [2024-07-23 08:19:57.936924] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:45.766 [2024-07-23 08:19:58.144598] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:46.702 08:19:59 dpdk_mem_utility -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:46.702 08:19:59 dpdk_mem_utility -- common/autotest_common.sh@862 -- # return 0 00:06:46.702 08:19:59 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:06:46.702 08:19:59 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:06:46.702 08:19:59 dpdk_mem_utility -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:46.702 08:19:59 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:46.702 { 00:06:46.702 "filename": "/tmp/spdk_mem_dump.txt" 00:06:46.702 } 00:06:46.702 08:19:59 dpdk_mem_utility -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:46.702 08:19:59 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:46.702 DPDK memory size 822.000000 MiB in 2 heap(s) 00:06:46.702 2 heaps totaling size 822.000000 MiB 00:06:46.702 size: 820.000000 MiB heap id: 0 00:06:46.703 size: 2.000000 MiB heap id: 1 00:06:46.703 end heaps---------- 00:06:46.703 8 mempools totaling size 598.116089 MiB 00:06:46.703 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:06:46.703 size: 158.602051 MiB name: PDU_data_out_Pool 00:06:46.703 size: 84.521057 MiB name: bdev_io_1340567 00:06:46.703 size: 51.011292 MiB name: evtpool_1340567 00:06:46.703 size: 
50.003479 MiB name: msgpool_1340567 00:06:46.703 size: 21.763794 MiB name: PDU_Pool 00:06:46.703 size: 19.513306 MiB name: SCSI_TASK_Pool 00:06:46.703 size: 0.026123 MiB name: Session_Pool 00:06:46.703 end mempools------- 00:06:46.703 201 memzones totaling size 4.176453 MiB 00:06:46.703 size: 1.000366 MiB name: RG_ring_0_1340567 00:06:46.703 size: 1.000366 MiB name: RG_ring_1_1340567 00:06:46.703 size: 1.000366 MiB name: RG_ring_4_1340567 00:06:46.703 size: 1.000366 MiB name: RG_ring_5_1340567 00:06:46.703 size: 0.125366 MiB name: RG_ring_2_1340567 00:06:46.703 size: 0.015991 MiB name: RG_ring_3_1340567 00:06:46.703 size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:06:46.703 size: 0.000305 MiB name: 0000:b1:01.0_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b1:01.1_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b1:01.2_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b1:01.3_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b1:01.4_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b1:01.5_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b1:01.6_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b1:01.7_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b1:02.0_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b1:02.1_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b1:02.2_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b1:02.3_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b1:02.4_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b1:02.5_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b1:02.6_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b1:02.7_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b3:01.0_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b3:01.1_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b3:01.2_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b3:01.3_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b3:01.4_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b3:01.5_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b3:01.6_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b3:01.7_qat 
00:06:46.703 size: 0.000305 MiB name: 0000:b3:02.0_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b3:02.1_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b3:02.2_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b3:02.3_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b3:02.4_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b3:02.5_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b3:02.6_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b3:02.7_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b5:01.0_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b5:01.1_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b5:01.2_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b5:01.3_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b5:01.4_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b5:01.5_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b5:01.6_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b5:01.7_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b5:02.0_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b5:02.1_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b5:02.2_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b5:02.3_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b5:02.4_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b5:02.5_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b5:02.6_qat 00:06:46.703 size: 0.000305 MiB name: 0000:b5:02.7_qat 00:06:46.703 size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_0 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_1 00:06:46.703 size: 0.000122 MiB name: rte_compressdev_data_0 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_2 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_3 00:06:46.703 size: 0.000122 MiB name: rte_compressdev_data_1 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_4 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_5 00:06:46.703 size: 0.000122 MiB name: rte_compressdev_data_2 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_6 00:06:46.703 size: 
0.000122 MiB name: rte_cryptodev_data_7 00:06:46.703 size: 0.000122 MiB name: rte_compressdev_data_3 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_8 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_9 00:06:46.703 size: 0.000122 MiB name: rte_compressdev_data_4 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_10 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_11 00:06:46.703 size: 0.000122 MiB name: rte_compressdev_data_5 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_12 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_13 00:06:46.703 size: 0.000122 MiB name: rte_compressdev_data_6 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_14 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_15 00:06:46.703 size: 0.000122 MiB name: rte_compressdev_data_7 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_16 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_17 00:06:46.703 size: 0.000122 MiB name: rte_compressdev_data_8 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_18 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_19 00:06:46.703 size: 0.000122 MiB name: rte_compressdev_data_9 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_20 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_21 00:06:46.703 size: 0.000122 MiB name: rte_compressdev_data_10 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_22 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_23 00:06:46.703 size: 0.000122 MiB name: rte_compressdev_data_11 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_24 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_25 00:06:46.703 size: 0.000122 MiB name: rte_compressdev_data_12 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_26 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_27 00:06:46.703 size: 0.000122 MiB name: rte_compressdev_data_13 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_28 00:06:46.703 size: 
0.000122 MiB name: rte_cryptodev_data_29 00:06:46.703 size: 0.000122 MiB name: rte_compressdev_data_14 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_30 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_31 00:06:46.703 size: 0.000122 MiB name: rte_compressdev_data_15 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_32 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_33 00:06:46.703 size: 0.000122 MiB name: rte_compressdev_data_16 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_34 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_35 00:06:46.703 size: 0.000122 MiB name: rte_compressdev_data_17 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_36 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_37 00:06:46.703 size: 0.000122 MiB name: rte_compressdev_data_18 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_38 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_39 00:06:46.703 size: 0.000122 MiB name: rte_compressdev_data_19 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_40 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_41 00:06:46.703 size: 0.000122 MiB name: rte_compressdev_data_20 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_42 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_43 00:06:46.703 size: 0.000122 MiB name: rte_compressdev_data_21 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_44 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_45 00:06:46.703 size: 0.000122 MiB name: rte_compressdev_data_22 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_46 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_47 00:06:46.703 size: 0.000122 MiB name: rte_compressdev_data_23 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_48 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_49 00:06:46.703 size: 0.000122 MiB name: rte_compressdev_data_24 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_50 00:06:46.703 
size: 0.000122 MiB name: rte_cryptodev_data_51 00:06:46.703 size: 0.000122 MiB name: rte_compressdev_data_25 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_52 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_53 00:06:46.703 size: 0.000122 MiB name: rte_compressdev_data_26 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_54 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_55 00:06:46.703 size: 0.000122 MiB name: rte_compressdev_data_27 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_56 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_57 00:06:46.703 size: 0.000122 MiB name: rte_compressdev_data_28 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_58 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_59 00:06:46.703 size: 0.000122 MiB name: rte_compressdev_data_29 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_60 00:06:46.703 size: 0.000122 MiB name: rte_cryptodev_data_61 00:06:46.704 size: 0.000122 MiB name: rte_compressdev_data_30 00:06:46.704 size: 0.000122 MiB name: rte_cryptodev_data_62 00:06:46.704 size: 0.000122 MiB name: rte_cryptodev_data_63 00:06:46.704 size: 0.000122 MiB name: rte_compressdev_data_31 00:06:46.704 size: 0.000122 MiB name: rte_cryptodev_data_64 00:06:46.704 size: 0.000122 MiB name: rte_cryptodev_data_65 00:06:46.704 size: 0.000122 MiB name: rte_compressdev_data_32 00:06:46.704 size: 0.000122 MiB name: rte_cryptodev_data_66 00:06:46.704 size: 0.000122 MiB name: rte_cryptodev_data_67 00:06:46.704 size: 0.000122 MiB name: rte_compressdev_data_33 00:06:46.704 size: 0.000122 MiB name: rte_cryptodev_data_68 00:06:46.704 size: 0.000122 MiB name: rte_cryptodev_data_69 00:06:46.704 size: 0.000122 MiB name: rte_compressdev_data_34 00:06:46.704 size: 0.000122 MiB name: rte_cryptodev_data_70 00:06:46.704 size: 0.000122 MiB name: rte_cryptodev_data_71 00:06:46.704 size: 0.000122 MiB name: rte_compressdev_data_35 00:06:46.704 size: 0.000122 MiB name: rte_cryptodev_data_72 
00:06:46.704 size: 0.000122 MiB name: rte_cryptodev_data_73 00:06:46.704 size: 0.000122 MiB name: rte_compressdev_data_36 00:06:46.704 size: 0.000122 MiB name: rte_cryptodev_data_74 00:06:46.704 size: 0.000122 MiB name: rte_cryptodev_data_75 00:06:46.704 size: 0.000122 MiB name: rte_compressdev_data_37 00:06:46.704 size: 0.000122 MiB name: rte_cryptodev_data_76 00:06:46.704 size: 0.000122 MiB name: rte_cryptodev_data_77 00:06:46.704 size: 0.000122 MiB name: rte_compressdev_data_38 00:06:46.704 size: 0.000122 MiB name: rte_cryptodev_data_78 00:06:46.704 size: 0.000122 MiB name: rte_cryptodev_data_79 00:06:46.704 size: 0.000122 MiB name: rte_compressdev_data_39 00:06:46.704 size: 0.000122 MiB name: rte_cryptodev_data_80 00:06:46.704 size: 0.000122 MiB name: rte_cryptodev_data_81 00:06:46.704 size: 0.000122 MiB name: rte_compressdev_data_40 00:06:46.704 size: 0.000122 MiB name: rte_cryptodev_data_82 00:06:46.704 size: 0.000122 MiB name: rte_cryptodev_data_83 00:06:46.704 size: 0.000122 MiB name: rte_compressdev_data_41 00:06:46.704 size: 0.000122 MiB name: rte_cryptodev_data_84 00:06:46.704 size: 0.000122 MiB name: rte_cryptodev_data_85 00:06:46.704 size: 0.000122 MiB name: rte_compressdev_data_42 00:06:46.704 size: 0.000122 MiB name: rte_cryptodev_data_86 00:06:46.704 size: 0.000122 MiB name: rte_cryptodev_data_87 00:06:46.704 size: 0.000122 MiB name: rte_compressdev_data_43 00:06:46.704 size: 0.000122 MiB name: rte_cryptodev_data_88 00:06:46.704 size: 0.000122 MiB name: rte_cryptodev_data_89 00:06:46.704 size: 0.000122 MiB name: rte_compressdev_data_44 00:06:46.704 size: 0.000122 MiB name: rte_cryptodev_data_90 00:06:46.704 size: 0.000122 MiB name: rte_cryptodev_data_91 00:06:46.704 size: 0.000122 MiB name: rte_compressdev_data_45 00:06:46.704 size: 0.000122 MiB name: rte_cryptodev_data_92 00:06:46.704 size: 0.000122 MiB name: rte_cryptodev_data_93 00:06:46.704 size: 0.000122 MiB name: rte_compressdev_data_46 00:06:46.704 size: 0.000122 MiB name: 
rte_cryptodev_data_94 00:06:46.704 size: 0.000122 MiB name: rte_cryptodev_data_95 00:06:46.704 size: 0.000122 MiB name: rte_compressdev_data_47 00:06:46.704 size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:06:46.704 end memzones------- 00:06:46.704 08:19:59 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:06:46.966 heap id: 0 total size: 820.000000 MiB number of busy elements: 487 number of free elements: 17 00:06:46.966 list of free elements. size: 17.809082 MiB 00:06:46.966 element at address: 0x200000400000 with size: 1.999451 MiB 00:06:46.966 element at address: 0x200000800000 with size: 1.996887 MiB 00:06:46.966 element at address: 0x200007000000 with size: 1.995972 MiB 00:06:46.966 element at address: 0x20000b200000 with size: 1.995972 MiB 00:06:46.966 element at address: 0x200019100040 with size: 0.999939 MiB 00:06:46.966 element at address: 0x200019500040 with size: 0.999939 MiB 00:06:46.966 element at address: 0x200019600000 with size: 0.999329 MiB 00:06:46.966 element at address: 0x200003e00000 with size: 0.996338 MiB 00:06:46.966 element at address: 0x200032200000 with size: 0.994324 MiB 00:06:46.966 element at address: 0x200018e00000 with size: 0.959900 MiB 00:06:46.966 element at address: 0x200019900040 with size: 0.937256 MiB 00:06:46.966 element at address: 0x20001b000000 with size: 0.583191 MiB 00:06:46.966 element at address: 0x200003a00000 with size: 0.498352 MiB 00:06:46.966 element at address: 0x200019200000 with size: 0.491150 MiB 00:06:46.966 element at address: 0x200019a00000 with size: 0.485657 MiB 00:06:46.966 element at address: 0x200013800000 with size: 0.467651 MiB 00:06:46.966 element at address: 0x200028400000 with size: 0.407776 MiB 00:06:46.966 list of standard malloc elements. 
size: 199.901123 MiB 00:06:46.966 element at address: 0x20000b3fef80 with size: 132.000183 MiB 00:06:46.966 element at address: 0x2000071fef80 with size: 64.000183 MiB 00:06:46.966 element at address: 0x200018ffff80 with size: 1.000183 MiB 00:06:46.966 element at address: 0x2000193fff80 with size: 1.000183 MiB 00:06:46.966 element at address: 0x2000197fff80 with size: 1.000183 MiB 00:06:46.966 element at address: 0x2000003d9e80 with size: 0.140808 MiB 00:06:46.966 element at address: 0x2000199eff40 with size: 0.062683 MiB 00:06:46.966 element at address: 0x2000003fdf40 with size: 0.007996 MiB 00:06:46.966 element at address: 0x200000337700 with size: 0.004456 MiB 00:06:46.966 element at address: 0x20000033adc0 with size: 0.004456 MiB 00:06:46.966 element at address: 0x20000033e480 with size: 0.004456 MiB 00:06:46.966 element at address: 0x200000341b40 with size: 0.004456 MiB 00:06:46.966 element at address: 0x200000345200 with size: 0.004456 MiB 00:06:46.966 element at address: 0x2000003488c0 with size: 0.004456 MiB 00:06:46.966 element at address: 0x20000034bf80 with size: 0.004456 MiB 00:06:46.966 element at address: 0x20000034f640 with size: 0.004456 MiB 00:06:46.966 element at address: 0x200000352d00 with size: 0.004456 MiB 00:06:46.966 element at address: 0x2000003563c0 with size: 0.004456 MiB 00:06:46.966 element at address: 0x200000359a80 with size: 0.004456 MiB 00:06:46.966 element at address: 0x20000035d140 with size: 0.004456 MiB 00:06:46.966 element at address: 0x200000360800 with size: 0.004456 MiB 00:06:46.966 element at address: 0x200000363ec0 with size: 0.004456 MiB 00:06:46.966 element at address: 0x200000367580 with size: 0.004456 MiB 00:06:46.966 element at address: 0x20000036ac40 with size: 0.004456 MiB 00:06:46.966 element at address: 0x20000036e300 with size: 0.004456 MiB 00:06:46.966 element at address: 0x2000003719c0 with size: 0.004456 MiB 00:06:46.966 element at address: 0x200000375080 with size: 0.004456 MiB 00:06:46.966 element at 
address: 0x200000378740 with size: 0.004456 MiB 00:06:46.966 element at address: 0x20000037be00 with size: 0.004456 MiB 00:06:46.966 element at address: 0x20000037f4c0 with size: 0.004456 MiB 00:06:46.966 element at address: 0x200000382b80 with size: 0.004456 MiB 00:06:46.966 element at address: 0x200000386240 with size: 0.004456 MiB 00:06:46.966 element at address: 0x200000389900 with size: 0.004456 MiB 00:06:46.966 element at address: 0x20000038cfc0 with size: 0.004456 MiB 00:06:46.966 element at address: 0x200000390680 with size: 0.004456 MiB 00:06:46.966 element at address: 0x200000393d40 with size: 0.004456 MiB 00:06:46.966 element at address: 0x200000397400 with size: 0.004456 MiB 00:06:46.966 element at address: 0x20000039aac0 with size: 0.004456 MiB 00:06:46.966 element at address: 0x20000039e180 with size: 0.004456 MiB 00:06:46.966 element at address: 0x2000003a1840 with size: 0.004456 MiB 00:06:46.966 element at address: 0x2000003a4f00 with size: 0.004456 MiB 00:06:46.966 element at address: 0x2000003a85c0 with size: 0.004456 MiB 00:06:46.966 element at address: 0x2000003abc80 with size: 0.004456 MiB 00:06:46.966 element at address: 0x2000003af340 with size: 0.004456 MiB 00:06:46.966 element at address: 0x2000003b2a00 with size: 0.004456 MiB 00:06:46.966 element at address: 0x2000003b60c0 with size: 0.004456 MiB 00:06:46.966 element at address: 0x2000003b9780 with size: 0.004456 MiB 00:06:46.966 element at address: 0x2000003bce40 with size: 0.004456 MiB 00:06:46.966 element at address: 0x2000003c0500 with size: 0.004456 MiB 00:06:46.966 element at address: 0x2000003c3bc0 with size: 0.004456 MiB 00:06:46.966 element at address: 0x2000003c7280 with size: 0.004456 MiB 00:06:46.966 element at address: 0x2000003ca940 with size: 0.004456 MiB 00:06:46.966 element at address: 0x2000003ce000 with size: 0.004456 MiB 00:06:46.966 element at address: 0x2000003d16c0 with size: 0.004456 MiB 00:06:46.966 element at address: 0x2000003d4d80 with size: 0.004456 MiB 
00:06:46.966 element at address: 0x2000003d8c40 with size: 0.004456 MiB 00:06:46.966 element at address: 0x200000335580 with size: 0.004089 MiB 00:06:46.966 element at address: 0x200000336640 with size: 0.004089 MiB 00:06:46.966 element at address: 0x200000338c40 with size: 0.004089 MiB 00:06:46.966 element at address: 0x200000339d00 with size: 0.004089 MiB 00:06:46.966 element at address: 0x20000033c300 with size: 0.004089 MiB 00:06:46.966 element at address: 0x20000033d3c0 with size: 0.004089 MiB 00:06:46.966 element at address: 0x20000033f9c0 with size: 0.004089 MiB 00:06:46.966 element at address: 0x200000340a80 with size: 0.004089 MiB 00:06:46.966 element at address: 0x200000343080 with size: 0.004089 MiB 00:06:46.966 element at address: 0x200000344140 with size: 0.004089 MiB 00:06:46.966 element at address: 0x200000346740 with size: 0.004089 MiB 00:06:46.966 element at address: 0x200000347800 with size: 0.004089 MiB 00:06:46.966 element at address: 0x200000349e00 with size: 0.004089 MiB 00:06:46.966 element at address: 0x20000034aec0 with size: 0.004089 MiB 00:06:46.966 element at address: 0x20000034d4c0 with size: 0.004089 MiB 00:06:46.966 element at address: 0x20000034e580 with size: 0.004089 MiB 00:06:46.966 element at address: 0x200000350b80 with size: 0.004089 MiB 00:06:46.966 element at address: 0x200000351c40 with size: 0.004089 MiB 00:06:46.966 element at address: 0x200000354240 with size: 0.004089 MiB 00:06:46.966 element at address: 0x200000355300 with size: 0.004089 MiB 00:06:46.966 element at address: 0x200000357900 with size: 0.004089 MiB 00:06:46.966 element at address: 0x2000003589c0 with size: 0.004089 MiB 00:06:46.966 element at address: 0x20000035afc0 with size: 0.004089 MiB 00:06:46.966 element at address: 0x20000035c080 with size: 0.004089 MiB 00:06:46.966 element at address: 0x20000035e680 with size: 0.004089 MiB 00:06:46.966 element at address: 0x20000035f740 with size: 0.004089 MiB 00:06:46.966 element at address: 0x200000361d40 with 
size: 0.004089 MiB 00:06:46.966 element at address: 0x200000362e00 with size: 0.004089 MiB 00:06:46.966 element at address: 0x200000365400 with size: 0.004089 MiB 00:06:46.966 element at address: 0x2000003664c0 with size: 0.004089 MiB 00:06:46.966 element at address: 0x200000368ac0 with size: 0.004089 MiB 00:06:46.966 element at address: 0x200000369b80 with size: 0.004089 MiB 00:06:46.966 element at address: 0x20000036c180 with size: 0.004089 MiB 00:06:46.966 element at address: 0x20000036d240 with size: 0.004089 MiB 00:06:46.966 element at address: 0x20000036f840 with size: 0.004089 MiB 00:06:46.966 element at address: 0x200000370900 with size: 0.004089 MiB 00:06:46.966 element at address: 0x200000372f00 with size: 0.004089 MiB 00:06:46.966 element at address: 0x200000373fc0 with size: 0.004089 MiB 00:06:46.966 element at address: 0x2000003765c0 with size: 0.004089 MiB 00:06:46.966 element at address: 0x200000377680 with size: 0.004089 MiB 00:06:46.966 element at address: 0x200000379c80 with size: 0.004089 MiB 00:06:46.966 element at address: 0x20000037ad40 with size: 0.004089 MiB 00:06:46.966 element at address: 0x20000037d340 with size: 0.004089 MiB 00:06:46.966 element at address: 0x20000037e400 with size: 0.004089 MiB 00:06:46.966 element at address: 0x200000380a00 with size: 0.004089 MiB 00:06:46.966 element at address: 0x200000381ac0 with size: 0.004089 MiB 00:06:46.966 element at address: 0x2000003840c0 with size: 0.004089 MiB 00:06:46.966 element at address: 0x200000385180 with size: 0.004089 MiB 00:06:46.966 element at address: 0x200000387780 with size: 0.004089 MiB 00:06:46.966 element at address: 0x200000388840 with size: 0.004089 MiB 00:06:46.967 element at address: 0x20000038ae40 with size: 0.004089 MiB 00:06:46.967 element at address: 0x20000038bf00 with size: 0.004089 MiB 00:06:46.967 element at address: 0x20000038e500 with size: 0.004089 MiB 00:06:46.967 element at address: 0x20000038f5c0 with size: 0.004089 MiB 00:06:46.967 element at address: 
0x200000391bc0 with size: 0.004089 MiB 00:06:46.967 element at address: 0x200000392c80 with size: 0.004089 MiB 00:06:46.967 element at address: 0x200000395280 with size: 0.004089 MiB 00:06:46.967 element at address: 0x200000396340 with size: 0.004089 MiB 00:06:46.967 element at address: 0x200000398940 with size: 0.004089 MiB 00:06:46.967 element at address: 0x200000399a00 with size: 0.004089 MiB 00:06:46.967 element at address: 0x20000039c000 with size: 0.004089 MiB 00:06:46.967 element at address: 0x20000039d0c0 with size: 0.004089 MiB 00:06:46.967 element at address: 0x20000039f6c0 with size: 0.004089 MiB 00:06:46.967 element at address: 0x2000003a0780 with size: 0.004089 MiB 00:06:46.967 element at address: 0x2000003a2d80 with size: 0.004089 MiB 00:06:46.967 element at address: 0x2000003a3e40 with size: 0.004089 MiB 00:06:46.967 element at address: 0x2000003a6440 with size: 0.004089 MiB 00:06:46.967 element at address: 0x2000003a7500 with size: 0.004089 MiB 00:06:46.967 element at address: 0x2000003a9b00 with size: 0.004089 MiB 00:06:46.967 element at address: 0x2000003aabc0 with size: 0.004089 MiB 00:06:46.967 element at address: 0x2000003ad1c0 with size: 0.004089 MiB 00:06:46.967 element at address: 0x2000003ae280 with size: 0.004089 MiB 00:06:46.967 element at address: 0x2000003b0880 with size: 0.004089 MiB 00:06:46.967 element at address: 0x2000003b1940 with size: 0.004089 MiB 00:06:46.967 element at address: 0x2000003b3f40 with size: 0.004089 MiB 00:06:46.967 element at address: 0x2000003b5000 with size: 0.004089 MiB 00:06:46.967 element at address: 0x2000003b7600 with size: 0.004089 MiB 00:06:46.967 element at address: 0x2000003b86c0 with size: 0.004089 MiB 00:06:46.967 element at address: 0x2000003bacc0 with size: 0.004089 MiB 00:06:46.967 element at address: 0x2000003bbd80 with size: 0.004089 MiB 00:06:46.967 element at address: 0x2000003be380 with size: 0.004089 MiB 00:06:46.967 element at address: 0x2000003bf440 with size: 0.004089 MiB 00:06:46.967 
element at address: 0x2000003c1a40 with size: 0.004089 MiB 00:06:46.967 element at address: 0x2000003c2b00 with size: 0.004089 MiB 00:06:46.967 element at address: 0x2000003c5100 with size: 0.004089 MiB 00:06:46.967 element at address: 0x2000003c61c0 with size: 0.004089 MiB 00:06:46.967 element at address: 0x2000003c87c0 with size: 0.004089 MiB 00:06:46.967 element at address: 0x2000003c9880 with size: 0.004089 MiB 00:06:46.967 element at address: 0x2000003cbe80 with size: 0.004089 MiB 00:06:46.967 element at address: 0x2000003ccf40 with size: 0.004089 MiB 00:06:46.967 element at address: 0x2000003cf540 with size: 0.004089 MiB 00:06:46.967 element at address: 0x2000003d0600 with size: 0.004089 MiB 00:06:46.967 element at address: 0x2000003d2c00 with size: 0.004089 MiB 00:06:46.967 element at address: 0x2000003d3cc0 with size: 0.004089 MiB 00:06:46.967 element at address: 0x2000003d6ac0 with size: 0.004089 MiB 00:06:46.967 element at address: 0x2000003d7b80 with size: 0.004089 MiB 00:06:46.967 element at address: 0x20000b1ff040 with size: 0.000427 MiB 00:06:46.967 element at address: 0x200000200000 with size: 0.000366 MiB 00:06:46.967 element at address: 0x20000020b480 with size: 0.000366 MiB 00:06:46.967 element at address: 0x2000137ff040 with size: 0.000305 MiB 00:06:46.967 element at address: 0x200000200180 with size: 0.000244 MiB 00:06:46.967 element at address: 0x200000200280 with size: 0.000244 MiB 00:06:46.967 element at address: 0x200000200380 with size: 0.000244 MiB 00:06:46.967 element at address: 0x200000200480 with size: 0.000244 MiB 00:06:46.967 element at address: 0x200000200580 with size: 0.000244 MiB 00:06:46.967 element at address: 0x200000200680 with size: 0.000244 MiB 00:06:46.967 element at address: 0x200000200780 with size: 0.000244 MiB 00:06:46.967 element at address: 0x200000200880 with size: 0.000244 MiB 00:06:46.967 element at address: 0x200000200980 with size: 0.000244 MiB 00:06:46.967 element at address: 0x200000200a80 with size: 0.000244 
MiB 00:06:46.967 element at address: 0x200000200b80 with size: 0.000244 MiB 00:06:46.967 [repetitive per-element heap dump elided: several hundred further elements of 0.000244 MiB each, at addresses 0x200000200c80 through 0x20002846fe80] 00:06:46.969 list of memzone associated elements. 
size: 602.289795 MiB 00:06:46.969 element at address: 0x20001b0954c0 with size: 211.416809 MiB 00:06:46.969 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:06:46.969 element at address: 0x20002846ff80 with size: 157.562622 MiB 00:06:46.969 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:06:46.969 element at address: 0x2000139fab40 with size: 84.020691 MiB 00:06:46.969 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_1340567_0 00:06:46.969 element at address: 0x2000009ff340 with size: 48.003113 MiB 00:06:46.969 associated memzone info: size: 48.002930 MiB name: MP_evtpool_1340567_0 00:06:46.969 element at address: 0x200003fff340 with size: 48.003113 MiB 00:06:46.969 associated memzone info: size: 48.002930 MiB name: MP_msgpool_1340567_0 00:06:46.969 element at address: 0x200019bbe900 with size: 20.255615 MiB 00:06:46.969 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:06:46.969 element at address: 0x2000323feb00 with size: 18.005127 MiB 00:06:46.969 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:06:46.969 element at address: 0x2000005ffdc0 with size: 2.000549 MiB 00:06:46.969 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_1340567 00:06:46.969 element at address: 0x200003bffdc0 with size: 2.000549 MiB 00:06:46.969 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_1340567 00:06:46.969 element at address: 0x200000233000 with size: 1.008179 MiB 00:06:46.969 associated memzone info: size: 1.007996 MiB name: MP_evtpool_1340567 00:06:46.969 element at address: 0x2000192fde00 with size: 1.008179 MiB 00:06:46.969 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:06:46.969 element at address: 0x200019abc780 with size: 1.008179 MiB 00:06:46.969 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:06:46.969 element at address: 0x200018efde00 with size: 1.008179 MiB 00:06:46.969 
associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:06:46.970 element at address: 0x2000138f89c0 with size: 1.008179 MiB 00:06:46.970 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:06:46.970 element at address: 0x200003eff100 with size: 1.000549 MiB 00:06:46.970 associated memzone info: size: 1.000366 MiB name: RG_ring_0_1340567 00:06:46.970 element at address: 0x200003affb80 with size: 1.000549 MiB 00:06:46.970 associated memzone info: size: 1.000366 MiB name: RG_ring_1_1340567 00:06:46.970 element at address: 0x2000196ffd40 with size: 1.000549 MiB 00:06:46.970 associated memzone info: size: 1.000366 MiB name: RG_ring_4_1340567 00:06:46.970 element at address: 0x2000322fe8c0 with size: 1.000549 MiB 00:06:46.970 associated memzone info: size: 1.000366 MiB name: RG_ring_5_1340567 00:06:46.970 element at address: 0x200003a7f940 with size: 0.500549 MiB 00:06:46.970 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_1340567 00:06:46.970 element at address: 0x20001927dbc0 with size: 0.500549 MiB 00:06:46.970 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:06:46.970 element at address: 0x200013878680 with size: 0.500549 MiB 00:06:46.970 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:06:46.970 element at address: 0x200019a7c540 with size: 0.250549 MiB 00:06:46.970 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:06:46.970 element at address: 0x200000210740 with size: 0.125549 MiB 00:06:46.970 associated memzone info: size: 0.125366 MiB name: RG_ring_2_1340567 00:06:46.970 element at address: 0x200018ef5bc0 with size: 0.031799 MiB 00:06:46.970 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:06:46.970 element at address: 0x200028468840 with size: 0.023804 MiB 00:06:46.970 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:06:46.970 element at address: 0x20000020c500 with size: 0.016174 
MiB 00:06:46.970 associated memzone info: size: 0.015991 MiB name: RG_ring_3_1340567 00:06:46.970 element at address: 0x20002846e9c0 with size: 0.002502 MiB 00:06:46.970 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:06:46.970 element at address: 0x2000003d60c0 with size: 0.001343 MiB 00:06:46.970 associated memzone info: size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:06:46.970 element at address: 0x2000003d68c0 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b1:01.0_qat 00:06:46.970 element at address: 0x2000003d2a00 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b1:01.1_qat 00:06:46.970 element at address: 0x2000003cf340 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b1:01.2_qat 00:06:46.970 element at address: 0x2000003cbc80 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b1:01.3_qat 00:06:46.970 element at address: 0x2000003c85c0 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b1:01.4_qat 00:06:46.970 element at address: 0x2000003c4f00 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b1:01.5_qat 00:06:46.970 element at address: 0x2000003c1840 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b1:01.6_qat 00:06:46.970 element at address: 0x2000003be180 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b1:01.7_qat 00:06:46.970 element at address: 0x2000003baac0 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b1:02.0_qat 00:06:46.970 element at address: 0x2000003b7400 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b1:02.1_qat 00:06:46.970 element at address: 0x2000003b3d40 with size: 0.000488 MiB 00:06:46.970 
associated memzone info: size: 0.000305 MiB name: 0000:b1:02.2_qat 00:06:46.970 element at address: 0x2000003b0680 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b1:02.3_qat 00:06:46.970 element at address: 0x2000003acfc0 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b1:02.4_qat 00:06:46.970 element at address: 0x2000003a9900 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b1:02.5_qat 00:06:46.970 element at address: 0x2000003a6240 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b1:02.6_qat 00:06:46.970 element at address: 0x2000003a2b80 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b1:02.7_qat 00:06:46.970 element at address: 0x20000039f4c0 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b3:01.0_qat 00:06:46.970 element at address: 0x20000039be00 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b3:01.1_qat 00:06:46.970 element at address: 0x200000398740 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b3:01.2_qat 00:06:46.970 element at address: 0x200000395080 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b3:01.3_qat 00:06:46.970 element at address: 0x2000003919c0 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b3:01.4_qat 00:06:46.970 element at address: 0x20000038e300 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b3:01.5_qat 00:06:46.970 element at address: 0x20000038ac40 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b3:01.6_qat 00:06:46.970 element at address: 0x200000387580 with size: 0.000488 MiB 00:06:46.970 associated memzone 
info: size: 0.000305 MiB name: 0000:b3:01.7_qat 00:06:46.970 element at address: 0x200000383ec0 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b3:02.0_qat 00:06:46.970 element at address: 0x200000380800 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b3:02.1_qat 00:06:46.970 element at address: 0x20000037d140 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b3:02.2_qat 00:06:46.970 element at address: 0x200000379a80 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b3:02.3_qat 00:06:46.970 element at address: 0x2000003763c0 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b3:02.4_qat 00:06:46.970 element at address: 0x200000372d00 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b3:02.5_qat 00:06:46.970 element at address: 0x20000036f640 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b3:02.6_qat 00:06:46.970 element at address: 0x20000036bf80 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b3:02.7_qat 00:06:46.970 element at address: 0x2000003688c0 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b5:01.0_qat 00:06:46.970 element at address: 0x200000365200 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b5:01.1_qat 00:06:46.970 element at address: 0x200000361b40 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b5:01.2_qat 00:06:46.970 element at address: 0x20000035e480 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b5:01.3_qat 00:06:46.970 element at address: 0x20000035adc0 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 
MiB name: 0000:b5:01.4_qat 00:06:46.970 element at address: 0x200000357700 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b5:01.5_qat 00:06:46.970 element at address: 0x200000354040 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b5:01.6_qat 00:06:46.970 element at address: 0x200000350980 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b5:01.7_qat 00:06:46.970 element at address: 0x20000034d2c0 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b5:02.0_qat 00:06:46.970 element at address: 0x200000349c00 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b5:02.1_qat 00:06:46.970 element at address: 0x200000346540 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b5:02.2_qat 00:06:46.970 element at address: 0x200000342e80 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b5:02.3_qat 00:06:46.970 element at address: 0x20000033f7c0 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b5:02.4_qat 00:06:46.970 element at address: 0x20000033c100 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b5:02.5_qat 00:06:46.970 element at address: 0x200000338a40 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b5:02.6_qat 00:06:46.970 element at address: 0x200000335380 with size: 0.000488 MiB 00:06:46.970 associated memzone info: size: 0.000305 MiB name: 0000:b5:02.7_qat 00:06:46.970 element at address: 0x2000003d6740 with size: 0.000366 MiB 00:06:46.970 associated memzone info: size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:06:46.970 element at address: 0x200000231c80 with size: 0.000366 MiB 00:06:46.970 associated memzone info: size: 0.000183 MiB name: 
MP_msgpool_1340567 00:06:46.970 element at address: 0x2000137ffd80 with size: 0.000366 MiB 00:06:46.970 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_1340567 00:06:46.970 element at address: 0x20002846f500 with size: 0.000366 MiB 00:06:46.970 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:06:46.970 element at address: 0x2000003d5fc0 with size: 0.000244 MiB 00:06:46.970 associated memzone info: size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:06:46.970 08:19:59 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:06:46.970 08:19:59 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 1340567 00:06:46.970 08:19:59 dpdk_mem_utility -- common/autotest_common.sh@948 -- # '[' -z 1340567 ']' 00:06:46.970 08:19:59 dpdk_mem_utility -- common/autotest_common.sh@952 -- # kill -0 1340567 00:06:46.970 08:19:59 dpdk_mem_utility -- common/autotest_common.sh@953 -- # uname 00:06:46.970 08:19:59 dpdk_mem_utility -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:46.971 08:19:59 dpdk_mem_utility -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1340567 00:06:46.971 08:19:59 dpdk_mem_utility -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:46.971 08:19:59 dpdk_mem_utility -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:46.971 08:19:59 dpdk_mem_utility -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1340567' 00:06:46.971 killing process with pid 1340567 00:06:46.971 08:19:59 dpdk_mem_utility -- common/autotest_common.sh@967 -- # kill 1340567 00:06:46.971 08:19:59 dpdk_mem_utility -- common/autotest_common.sh@972 -- # wait 1340567 00:06:49.507 00:06:49.507 real 0m4.098s 00:06:49.507 user 0m4.032s 00:06:49.507 sys 0m0.544s 00:06:49.507 08:20:01 dpdk_mem_utility -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:49.507 08:20:01 dpdk_mem_utility -- common/autotest_common.sh@10 -- # 
set +x 00:06:49.507 ************************************ 00:06:49.507 END TEST dpdk_mem_utility 00:06:49.507 ************************************ 00:06:49.507 08:20:01 -- common/autotest_common.sh@1142 -- # return 0 00:06:49.507 08:20:01 -- spdk/autotest.sh@181 -- # run_test event /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:06:49.507 08:20:01 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:49.507 08:20:01 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:49.507 08:20:01 -- common/autotest_common.sh@10 -- # set +x 00:06:49.507 ************************************ 00:06:49.507 START TEST event 00:06:49.507 ************************************ 00:06:49.507 08:20:01 event -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:06:49.507 * Looking for test storage... 00:06:49.507 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event 00:06:49.507 08:20:01 event -- event/event.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:06:49.507 08:20:01 event -- bdev/nbd_common.sh@6 -- # set -e 00:06:49.507 08:20:01 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:49.507 08:20:01 event -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:06:49.507 08:20:01 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:49.507 08:20:01 event -- common/autotest_common.sh@10 -- # set +x 00:06:49.507 ************************************ 00:06:49.507 START TEST event_perf 00:06:49.507 ************************************ 00:06:49.507 08:20:01 event.event_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:49.507 Running I/O for 1 seconds...[2024-07-23 08:20:01.942468] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 
24.03.0 initialization... 00:06:49.507 [2024-07-23 08:20:01.942549] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1341379 ] 00:06:49.766 [2024-07-23 08:20:02.066763] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:50.024 [2024-07-23 08:20:02.297520] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:50.024 [2024-07-23 08:20:02.297593] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:06:50.024 [2024-07-23 08:20:02.297680] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:50.024 [2024-07-23 08:20:02.297690] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:06:51.401 Running I/O for 1 seconds... 00:06:51.401 lcore 0: 199186 00:06:51.401 lcore 1: 199183 00:06:51.401 lcore 2: 199184 00:06:51.401 lcore 3: 199184 00:06:51.401 done. 
00:06:51.401 00:06:51.401 real 0m1.813s 00:06:51.401 user 0m4.641s 00:06:51.401 sys 0m0.164s 00:06:51.401 08:20:03 event.event_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:51.401 08:20:03 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:06:51.401 ************************************ 00:06:51.401 END TEST event_perf 00:06:51.401 ************************************ 00:06:51.401 08:20:03 event -- common/autotest_common.sh@1142 -- # return 0 00:06:51.401 08:20:03 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:51.401 08:20:03 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:06:51.401 08:20:03 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:51.401 08:20:03 event -- common/autotest_common.sh@10 -- # set +x 00:06:51.401 ************************************ 00:06:51.401 START TEST event_reactor 00:06:51.401 ************************************ 00:06:51.401 08:20:03 event.event_reactor -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:51.401 [2024-07-23 08:20:03.832113] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:06:51.401 [2024-07-23 08:20:03.832187] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1341865 ] 00:06:51.661 [2024-07-23 08:20:03.952714] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:51.661 [2024-07-23 08:20:04.167185] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.566 test_start 00:06:53.566 oneshot 00:06:53.566 tick 100 00:06:53.566 tick 100 00:06:53.566 tick 250 00:06:53.566 tick 100 00:06:53.566 tick 100 00:06:53.566 tick 100 00:06:53.566 tick 250 00:06:53.566 tick 500 00:06:53.566 tick 100 00:06:53.566 tick 100 00:06:53.566 tick 250 00:06:53.566 tick 100 00:06:53.566 tick 100 00:06:53.566 test_end 00:06:53.566 00:06:53.566 real 0m1.786s 00:06:53.566 user 0m1.627s 00:06:53.566 sys 0m0.150s 00:06:53.566 08:20:05 event.event_reactor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:53.566 08:20:05 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:06:53.566 ************************************ 00:06:53.566 END TEST event_reactor 00:06:53.566 ************************************ 00:06:53.566 08:20:05 event -- common/autotest_common.sh@1142 -- # return 0 00:06:53.566 08:20:05 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:53.566 08:20:05 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:06:53.566 08:20:05 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:53.566 08:20:05 event -- common/autotest_common.sh@10 -- # set +x 00:06:53.566 ************************************ 00:06:53.566 START TEST event_reactor_perf 00:06:53.566 ************************************ 00:06:53.566 08:20:05 event.event_reactor_perf -- common/autotest_common.sh@1123 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:53.566 [2024-07-23 08:20:05.674283] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:06:53.566 [2024-07-23 08:20:05.674356] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1342172 ] 00:06:53.566 [2024-07-23 08:20:05.792487] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:53.566 [2024-07-23 08:20:06.002553] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:54.942 test_start 00:06:54.942 test_end 00:06:54.942 Performance: 395202 events per second 00:06:54.942 00:06:54.942 real 0m1.766s 00:06:54.942 user 0m1.619s 00:06:54.943 sys 0m0.138s 00:06:54.943 08:20:07 event.event_reactor_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:54.943 08:20:07 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:06:54.943 ************************************ 00:06:54.943 END TEST event_reactor_perf 00:06:54.943 ************************************ 00:06:54.943 08:20:07 event -- common/autotest_common.sh@1142 -- # return 0 00:06:54.943 08:20:07 event -- event/event.sh@49 -- # uname -s 00:06:54.943 08:20:07 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:06:54.943 08:20:07 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:54.943 08:20:07 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:54.943 08:20:07 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:54.943 08:20:07 event -- common/autotest_common.sh@10 -- # set +x 00:06:55.201 ************************************ 00:06:55.201 START TEST event_scheduler 00:06:55.201 ************************************ 
00:06:55.201 08:20:07 event.event_scheduler -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:55.201 * Looking for test storage... 00:06:55.201 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler 00:06:55.201 08:20:07 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:06:55.201 08:20:07 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=1342489 00:06:55.201 08:20:07 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:06:55.201 08:20:07 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:06:55.201 08:20:07 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 1342489 00:06:55.201 08:20:07 event.event_scheduler -- common/autotest_common.sh@829 -- # '[' -z 1342489 ']' 00:06:55.201 08:20:07 event.event_scheduler -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:55.201 08:20:07 event.event_scheduler -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:55.201 08:20:07 event.event_scheduler -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:55.201 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:55.201 08:20:07 event.event_scheduler -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:55.201 08:20:07 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:55.201 [2024-07-23 08:20:07.628145] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:06:55.201 [2024-07-23 08:20:07.628238] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1342489 ] 00:06:55.459 [2024-07-23 08:20:07.748708] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:55.459 [2024-07-23 08:20:07.957010] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:55.459 [2024-07-23 08:20:07.957082] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:55.459 [2024-07-23 08:20:07.957139] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:06:55.459 [2024-07-23 08:20:07.957149] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:06:56.027 08:20:08 event.event_scheduler -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:56.027 08:20:08 event.event_scheduler -- common/autotest_common.sh@862 -- # return 0 00:06:56.027 08:20:08 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:06:56.027 08:20:08 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:56.027 08:20:08 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:56.027 [2024-07-23 08:20:08.415096] dpdk_governor.c: 173:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings 00:06:56.027 [2024-07-23 08:20:08.415124] scheduler_dynamic.c: 270:init: *NOTICE*: Unable to initialize dpdk governor 00:06:56.027 [2024-07-23 08:20:08.415138] scheduler_dynamic.c: 416:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:06:56.027 [2024-07-23 08:20:08.415148] scheduler_dynamic.c: 418:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:06:56.027 [2024-07-23 08:20:08.415157] scheduler_dynamic.c: 420:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:06:56.027 08:20:08 
event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:56.027 08:20:08 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:06:56.027 08:20:08 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:56.027 08:20:08 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:56.596 [2024-07-23 08:20:08.811258] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 00:06:56.596 08:20:08 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:56.596 08:20:08 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:06:56.596 08:20:08 event.event_scheduler -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:56.596 08:20:08 event.event_scheduler -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:56.596 08:20:08 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:56.596 ************************************ 00:06:56.596 START TEST scheduler_create_thread 00:06:56.596 ************************************ 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1123 -- # scheduler_create_thread 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:56.596 2 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n 
active_pinned -m 0x2 -a 100 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:56.596 3 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:56.596 4 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:56.596 5 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:56.596 6 00:06:56.596 08:20:08 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:56.596 7 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:56.596 8 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:56.596 9 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:06:56.596 08:20:08 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:56.596 10 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:56.596 08:20:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:57.975 08:20:10 event.event_scheduler.scheduler_create_thread -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:57.975 08:20:10 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:06:57.975 08:20:10 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:06:57.975 08:20:10 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:57.975 08:20:10 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:59.350 08:20:11 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:59.350 00:06:59.350 real 0m2.626s 00:06:59.350 user 0m0.025s 00:06:59.350 sys 0m0.004s 00:06:59.350 08:20:11 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:59.350 08:20:11 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:59.350 ************************************ 00:06:59.350 END TEST scheduler_create_thread 00:06:59.350 ************************************ 00:06:59.350 08:20:11 event.event_scheduler -- common/autotest_common.sh@1142 -- # return 0 00:06:59.350 08:20:11 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:06:59.350 08:20:11 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 1342489 00:06:59.350 08:20:11 event.event_scheduler -- common/autotest_common.sh@948 -- # '[' -z 1342489 ']' 00:06:59.350 08:20:11 event.event_scheduler -- common/autotest_common.sh@952 -- # kill -0 1342489 00:06:59.351 08:20:11 event.event_scheduler -- common/autotest_common.sh@953 -- # uname 00:06:59.351 08:20:11 event.event_scheduler -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:59.351 08:20:11 event.event_scheduler -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1342489 00:06:59.351 08:20:11 event.event_scheduler -- 
common/autotest_common.sh@954 -- # process_name=reactor_2 00:06:59.351 08:20:11 event.event_scheduler -- common/autotest_common.sh@958 -- # '[' reactor_2 = sudo ']' 00:06:59.351 08:20:11 event.event_scheduler -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1342489' 00:06:59.351 killing process with pid 1342489 00:06:59.351 08:20:11 event.event_scheduler -- common/autotest_common.sh@967 -- # kill 1342489 00:06:59.351 08:20:11 event.event_scheduler -- common/autotest_common.sh@972 -- # wait 1342489 00:06:59.351 [2024-07-23 08:20:11.852238] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 00:07:00.765 00:07:00.765 real 0m5.723s 00:07:00.765 user 0m11.710s 00:07:00.765 sys 0m0.487s 00:07:00.765 08:20:13 event.event_scheduler -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:00.765 08:20:13 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:00.765 ************************************ 00:07:00.765 END TEST event_scheduler 00:07:00.765 ************************************ 00:07:00.765 08:20:13 event -- common/autotest_common.sh@1142 -- # return 0 00:07:00.765 08:20:13 event -- event/event.sh@51 -- # modprobe -n nbd 00:07:00.765 08:20:13 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:07:00.765 08:20:13 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:00.765 08:20:13 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:00.765 08:20:13 event -- common/autotest_common.sh@10 -- # set +x 00:07:00.765 ************************************ 00:07:00.765 START TEST app_repeat 00:07:00.765 ************************************ 00:07:00.765 08:20:13 event.app_repeat -- common/autotest_common.sh@1123 -- # app_repeat_test 00:07:00.765 08:20:13 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:00.765 08:20:13 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:00.765 08:20:13 
event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:07:00.765 08:20:13 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:00.765 08:20:13 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:07:00.765 08:20:13 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:07:00.765 08:20:13 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:07:00.765 08:20:13 event.app_repeat -- event/event.sh@19 -- # repeat_pid=1343735 00:07:00.765 08:20:13 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:07:00.765 08:20:13 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:07:00.765 08:20:13 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 1343735' 00:07:00.765 Process app_repeat pid: 1343735 00:07:00.765 08:20:13 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:00.765 08:20:13 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:07:00.765 spdk_app_start Round 0 00:07:00.765 08:20:13 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1343735 /var/tmp/spdk-nbd.sock 00:07:00.765 08:20:13 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 1343735 ']' 00:07:00.765 08:20:13 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:00.765 08:20:13 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:00.765 08:20:13 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:00.765 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:07:00.765 08:20:13 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:00.765 08:20:13 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:01.023 [2024-07-23 08:20:13.312546] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:07:01.023 [2024-07-23 08:20:13.312647] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1343735 ] 00:07:01.023 [2024-07-23 08:20:13.435653] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:01.281 [2024-07-23 08:20:13.649347] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:01.281 [2024-07-23 08:20:13.649361] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:01.847 08:20:14 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:01.847 08:20:14 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:01.847 08:20:14 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:01.847 Malloc0 00:07:01.847 08:20:14 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:02.106 Malloc1 00:07:02.106 08:20:14 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:02.106 08:20:14 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:02.106 08:20:14 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:02.106 08:20:14 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:02.106 08:20:14 event.app_repeat -- bdev/nbd_common.sh@92 
-- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:02.106 08:20:14 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:02.106 08:20:14 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:02.106 08:20:14 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:02.106 08:20:14 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:02.106 08:20:14 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:02.106 08:20:14 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:02.106 08:20:14 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:02.106 08:20:14 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:02.106 08:20:14 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:02.106 08:20:14 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:02.106 08:20:14 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:02.364 /dev/nbd0 00:07:02.364 08:20:14 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:02.364 08:20:14 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:02.364 08:20:14 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:07:02.364 08:20:14 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:02.364 08:20:14 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:02.364 08:20:14 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:02.364 08:20:14 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:07:02.364 08:20:14 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:02.364 08:20:14 event.app_repeat -- 
common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:02.364 08:20:14 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:02.364 08:20:14 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:02.364 1+0 records in 00:07:02.364 1+0 records out 00:07:02.364 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00019261 s, 21.3 MB/s 00:07:02.364 08:20:14 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:02.364 08:20:14 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:02.364 08:20:14 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:02.364 08:20:14 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:02.364 08:20:14 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:02.365 08:20:14 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:02.365 08:20:14 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:02.365 08:20:14 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:02.623 /dev/nbd1 00:07:02.623 08:20:15 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:02.623 08:20:15 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:02.623 08:20:15 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:07:02.623 08:20:15 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:02.623 08:20:15 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:02.623 08:20:15 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:02.623 08:20:15 event.app_repeat -- 
common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:07:02.623 08:20:15 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:02.623 08:20:15 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:02.623 08:20:15 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:02.623 08:20:15 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:02.623 1+0 records in 00:07:02.623 1+0 records out 00:07:02.623 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000164134 s, 25.0 MB/s 00:07:02.623 08:20:15 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:02.623 08:20:15 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:02.623 08:20:15 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:02.623 08:20:15 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:02.623 08:20:15 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:02.623 08:20:15 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:02.623 08:20:15 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:02.623 08:20:15 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:02.623 08:20:15 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:02.623 08:20:15 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:02.882 { 00:07:02.882 "nbd_device": "/dev/nbd0", 00:07:02.882 "bdev_name": "Malloc0" 00:07:02.882 }, 00:07:02.882 { 00:07:02.882 
"nbd_device": "/dev/nbd1", 00:07:02.882 "bdev_name": "Malloc1" 00:07:02.882 } 00:07:02.882 ]' 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:02.882 { 00:07:02.882 "nbd_device": "/dev/nbd0", 00:07:02.882 "bdev_name": "Malloc0" 00:07:02.882 }, 00:07:02.882 { 00:07:02.882 "nbd_device": "/dev/nbd1", 00:07:02.882 "bdev_name": "Malloc1" 00:07:02.882 } 00:07:02.882 ]' 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:02.882 /dev/nbd1' 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:02.882 /dev/nbd1' 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:02.882 256+0 records in 00:07:02.882 256+0 
records out 00:07:02.882 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0103656 s, 101 MB/s 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:02.882 256+0 records in 00:07:02.882 256+0 records out 00:07:02.882 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0156164 s, 67.1 MB/s 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:02.882 256+0 records in 00:07:02.882 256+0 records out 00:07:02.882 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0190818 s, 55.0 MB/s 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@82 -- # for 
i in "${nbd_list[@]}" 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:02.882 08:20:15 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:03.141 08:20:15 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:03.141 08:20:15 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:03.141 08:20:15 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:03.141 08:20:15 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:03.141 08:20:15 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:03.141 08:20:15 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:03.141 08:20:15 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:03.141 08:20:15 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:03.141 08:20:15 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:03.141 08:20:15 event.app_repeat -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:03.399 08:20:15 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:03.399 08:20:15 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:03.399 08:20:15 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:03.399 08:20:15 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:03.399 08:20:15 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:03.399 08:20:15 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:03.399 08:20:15 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:03.399 08:20:15 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:03.399 08:20:15 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:03.399 08:20:15 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:03.399 08:20:15 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:03.657 08:20:15 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:03.657 08:20:15 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:03.657 08:20:15 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:03.657 08:20:15 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:03.657 08:20:15 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:03.657 08:20:15 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:03.657 08:20:15 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:03.657 08:20:15 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:03.657 08:20:15 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:03.657 08:20:15 event.app_repeat -- bdev/nbd_common.sh@104 -- # 
count=0 00:07:03.657 08:20:15 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:03.657 08:20:15 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:03.657 08:20:15 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:03.916 08:20:16 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:05.294 [2024-07-23 08:20:17.745325] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:05.553 [2024-07-23 08:20:17.945988] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:05.553 [2024-07-23 08:20:17.945989] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:05.811 [2024-07-23 08:20:18.172752] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:05.811 [2024-07-23 08:20:18.172797] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:07.187 08:20:19 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:07.187 08:20:19 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:07:07.187 spdk_app_start Round 1 00:07:07.187 08:20:19 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1343735 /var/tmp/spdk-nbd.sock 00:07:07.187 08:20:19 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 1343735 ']' 00:07:07.188 08:20:19 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:07.188 08:20:19 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:07.188 08:20:19 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:07.188 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:07:07.188 08:20:19 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:07.188 08:20:19 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:07.188 08:20:19 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:07.188 08:20:19 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:07.188 08:20:19 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:07.446 Malloc0 00:07:07.446 08:20:19 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:07.446 Malloc1 00:07:07.705 08:20:19 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:07.705 08:20:19 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:07.705 08:20:19 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:07.705 08:20:19 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:07.705 08:20:19 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:07.705 08:20:19 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:07.705 08:20:19 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:07.705 08:20:19 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:07.705 08:20:19 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:07.705 08:20:19 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:07.705 08:20:19 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:07.705 08:20:19 event.app_repeat -- bdev/nbd_common.sh@11 -- 
# local nbd_list 00:07:07.705 08:20:19 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:07.705 08:20:19 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:07.705 08:20:19 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:07.705 08:20:19 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:07.705 /dev/nbd0 00:07:07.705 08:20:20 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:07.705 08:20:20 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:07.705 08:20:20 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:07:07.705 08:20:20 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:07.705 08:20:20 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:07.705 08:20:20 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:07.705 08:20:20 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:07:07.705 08:20:20 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:07.705 08:20:20 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:07.705 08:20:20 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:07.705 08:20:20 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:07.705 1+0 records in 00:07:07.705 1+0 records out 00:07:07.705 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000226509 s, 18.1 MB/s 00:07:07.705 08:20:20 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:07.705 08:20:20 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:07.705 08:20:20 event.app_repeat -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:07.705 08:20:20 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:07.705 08:20:20 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:07.705 08:20:20 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:07.705 08:20:20 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:07.705 08:20:20 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:07.963 /dev/nbd1 00:07:07.963 08:20:20 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:07.963 08:20:20 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:07.963 08:20:20 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:07:07.963 08:20:20 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:07.963 08:20:20 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:07.963 08:20:20 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:07.963 08:20:20 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:07:07.963 08:20:20 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:07.963 08:20:20 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:07.963 08:20:20 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:07.963 08:20:20 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:07.963 1+0 records in 00:07:07.963 1+0 records out 00:07:07.963 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000195277 s, 21.0 MB/s 00:07:07.963 08:20:20 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:07.963 08:20:20 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:07.963 08:20:20 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:07.963 08:20:20 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:07.963 08:20:20 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:07.963 08:20:20 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:07.963 08:20:20 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:07.963 08:20:20 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:07.963 08:20:20 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:07.963 08:20:20 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:08.223 { 00:07:08.223 "nbd_device": "/dev/nbd0", 00:07:08.223 "bdev_name": "Malloc0" 00:07:08.223 }, 00:07:08.223 { 00:07:08.223 "nbd_device": "/dev/nbd1", 00:07:08.223 "bdev_name": "Malloc1" 00:07:08.223 } 00:07:08.223 ]' 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:08.223 { 00:07:08.223 "nbd_device": "/dev/nbd0", 00:07:08.223 "bdev_name": "Malloc0" 00:07:08.223 }, 00:07:08.223 { 00:07:08.223 "nbd_device": "/dev/nbd1", 00:07:08.223 "bdev_name": "Malloc1" 00:07:08.223 } 00:07:08.223 ]' 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:08.223 /dev/nbd1' 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:08.223 /dev/nbd1' 00:07:08.223 
08:20:20 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:08.223 256+0 records in 00:07:08.223 256+0 records out 00:07:08.223 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0101622 s, 103 MB/s 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:08.223 256+0 records in 00:07:08.223 256+0 records out 00:07:08.223 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0163111 s, 64.3 MB/s 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:08.223 256+0 records in 00:07:08.223 256+0 records out 00:07:08.223 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.018609 s, 56.3 MB/s 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 
00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:08.223 08:20:20 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:08.482 08:20:20 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:08.482 08:20:20 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:08.482 08:20:20 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:08.482 08:20:20 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:08.482 08:20:20 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:08.482 08:20:20 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:08.482 08:20:20 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:08.482 08:20:20 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:08.482 08:20:20 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:08.482 08:20:20 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:08.741 08:20:21 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:08.741 08:20:21 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:08.741 08:20:21 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:08.741 08:20:21 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:08.741 08:20:21 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:08.741 08:20:21 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:08.741 08:20:21 event.app_repeat -- bdev/nbd_common.sh@41 
-- # break 00:07:08.741 08:20:21 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:08.741 08:20:21 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:08.741 08:20:21 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:08.741 08:20:21 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:09.000 08:20:21 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:09.000 08:20:21 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:09.000 08:20:21 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:09.000 08:20:21 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:09.000 08:20:21 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:09.000 08:20:21 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:09.000 08:20:21 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:09.000 08:20:21 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:09.000 08:20:21 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:09.000 08:20:21 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:09.000 08:20:21 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:09.000 08:20:21 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:09.000 08:20:21 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:09.259 08:20:21 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:10.637 [2024-07-23 08:20:23.106099] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:10.902 [2024-07-23 08:20:23.310722] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:10.902 [2024-07-23 08:20:23.310731] reactor.c: 941:reactor_run: 
*NOTICE*: Reactor started on core 1 00:07:11.161 [2024-07-23 08:20:23.541572] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:11.161 [2024-07-23 08:20:23.541624] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:12.536 08:20:24 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:12.536 08:20:24 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:07:12.536 spdk_app_start Round 2 00:07:12.537 08:20:24 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1343735 /var/tmp/spdk-nbd.sock 00:07:12.537 08:20:24 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 1343735 ']' 00:07:12.537 08:20:24 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:12.537 08:20:24 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:12.537 08:20:24 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:12.537 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:07:12.537 08:20:24 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:12.537 08:20:24 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:12.537 08:20:24 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:12.537 08:20:24 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:12.537 08:20:24 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:12.795 Malloc0 00:07:12.795 08:20:25 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:13.055 Malloc1 00:07:13.055 08:20:25 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:13.055 08:20:25 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:13.055 08:20:25 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:13.055 08:20:25 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:13.055 08:20:25 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:13.055 08:20:25 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:13.055 08:20:25 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:13.055 08:20:25 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:13.055 08:20:25 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:13.055 08:20:25 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:13.055 08:20:25 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:13.055 08:20:25 event.app_repeat -- bdev/nbd_common.sh@11 -- 
# local nbd_list
00:07:13.055 08:20:25 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i
00:07:13.055 08:20:25 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:07:13.055 08:20:25 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:07:13.055 08:20:25 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0
00:07:13.055 /dev/nbd0
00:07:13.055 08:20:25 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0
00:07:13.315 08:20:25 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0
00:07:13.315 08:20:25 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0
00:07:13.315 08:20:25 event.app_repeat -- common/autotest_common.sh@867 -- # local i
00:07:13.315 08:20:25 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:07:13.315 08:20:25 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:07:13.315 08:20:25 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions
00:07:13.315 08:20:25 event.app_repeat -- common/autotest_common.sh@871 -- # break
00:07:13.315 08:20:25 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:07:13.315 08:20:25 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:07:13.315 08:20:25 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct
00:07:13.315 1+0 records in
00:07:13.315 1+0 records out
00:07:13.315 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000180792 s, 22.7 MB/s
00:07:13.315 08:20:25 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest
00:07:13.315 08:20:25 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096
00:07:13.315 08:20:25 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest
00:07:13.315 08:20:25 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:07:13.315 08:20:25 event.app_repeat -- common/autotest_common.sh@887 -- # return 0
00:07:13.315 08:20:25 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:07:13.315 08:20:25 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:07:13.315 08:20:25 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1
00:07:13.315 /dev/nbd1
00:07:13.315 08:20:25 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1
00:07:13.315 08:20:25 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1
00:07:13.315 08:20:25 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1
00:07:13.315 08:20:25 event.app_repeat -- common/autotest_common.sh@867 -- # local i
00:07:13.315 08:20:25 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:07:13.315 08:20:25 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:07:13.315 08:20:25 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions
00:07:13.315 08:20:25 event.app_repeat -- common/autotest_common.sh@871 -- # break
00:07:13.315 08:20:25 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:07:13.315 08:20:25 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:07:13.315 08:20:25 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct
00:07:13.315 1+0 records in
00:07:13.315 1+0 records out
00:07:13.315 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000341772 s, 12.0 MB/s
00:07:13.315 08:20:25 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest
00:07:13.315 08:20:25 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096
00:07:13.315 08:20:25 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest
00:07:13.315 08:20:25 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:07:13.315 08:20:25 event.app_repeat -- common/autotest_common.sh@887 -- # return 0
00:07:13.315 08:20:25 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:07:13.315 08:20:25 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:07:13.315 08:20:25 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:07:13.315 08:20:25 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:07:13.315 08:20:25 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:07:13.574 08:20:25 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[
00:07:13.574 {
00:07:13.574 "nbd_device": "/dev/nbd0",
00:07:13.574 "bdev_name": "Malloc0"
00:07:13.574 },
00:07:13.574 {
00:07:13.574 "nbd_device": "/dev/nbd1",
00:07:13.574 "bdev_name": "Malloc1"
00:07:13.574 }
00:07:13.574 ]'
00:07:13.574 08:20:25 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[
00:07:13.574 {
00:07:13.574 "nbd_device": "/dev/nbd0",
00:07:13.574 "bdev_name": "Malloc0"
00:07:13.574 },
00:07:13.574 {
00:07:13.574 "nbd_device": "/dev/nbd1",
00:07:13.574 "bdev_name": "Malloc1"
00:07:13.574 }
00:07:13.574 ]'
00:07:13.574 08:20:25 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:07:13.574 08:20:25 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0
00:07:13.574 /dev/nbd1'
00:07:13.574 08:20:26 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0
00:07:13.574 /dev/nbd1'
00:07:13.574 08:20:26 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:07:13.574 08:20:26 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2
00:07:13.574 08:20:26 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2
00:07:13.574 08:20:26 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2
00:07:13.574 08:20:26 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']'
00:07:13.574 08:20:26 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write
00:07:13.574 08:20:26 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:07:13.574 08:20:26 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list
00:07:13.574 08:20:26 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write
00:07:13.574 08:20:26 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest
00:07:13.574 08:20:26 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']'
00:07:13.574 08:20:26 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256
00:07:13.574 256+0 records in
00:07:13.574 256+0 records out
00:07:13.574 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.010349 s, 101 MB/s
00:07:13.575 08:20:26 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:07:13.575 08:20:26 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct
00:07:13.575 256+0 records in
00:07:13.575 256+0 records out
00:07:13.575 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0159852 s, 65.6 MB/s
00:07:13.575 08:20:26 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:07:13.575 08:20:26 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct
00:07:13.575 256+0 records in
00:07:13.575 256+0 records out
00:07:13.575 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0190331 s, 55.1 MB/s
00:07:13.575 08:20:26 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify
00:07:13.575 08:20:26 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:07:13.575 08:20:26 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list
00:07:13.575 08:20:26 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify
00:07:13.575 08:20:26 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest
00:07:13.575 08:20:26 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']'
00:07:13.575 08:20:26 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']'
00:07:13.575 08:20:26 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:07:13.575 08:20:26 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0
00:07:13.575 08:20:26 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:07:13.575 08:20:26 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1
00:07:13.575 08:20:26 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest
00:07:13.575 08:20:26 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1'
00:07:13.575 08:20:26 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:07:13.575 08:20:26 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:07:13.575 08:20:26 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list
00:07:13.575 08:20:26 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i
00:07:13.575 08:20:26 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:07:13.575 08:20:26 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
00:07:13.834 08:20:26 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:07:13.834 08:20:26 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:07:13.834 08:20:26 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:07:13.834 08:20:26 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:07:13.834 08:20:26 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:07:13.834 08:20:26 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:07:13.834 08:20:26 event.app_repeat -- bdev/nbd_common.sh@41 -- # break
00:07:13.834 08:20:26 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0
00:07:13.834 08:20:26 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:07:13.834 08:20:26 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1
00:07:14.093 08:20:26 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1
00:07:14.093 08:20:26 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1
00:07:14.093 08:20:26 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1
00:07:14.093 08:20:26 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:07:14.093 08:20:26 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:07:14.093 08:20:26 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions
00:07:14.093 08:20:26 event.app_repeat -- bdev/nbd_common.sh@41 -- # break
00:07:14.093 08:20:26 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0
00:07:14.093 08:20:26 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:07:14.093 08:20:26 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:07:14.093 08:20:26 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:07:14.352 08:20:26 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]'
00:07:14.352 08:20:26 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:07:14.352 08:20:26 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]'
00:07:14.352 08:20:26 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=
00:07:14.352 08:20:26 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo ''
00:07:14.352 08:20:26 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:07:14.352 08:20:26 event.app_repeat -- bdev/nbd_common.sh@65 -- # true
00:07:14.352 08:20:26 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0
00:07:14.352 08:20:26 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0
00:07:14.352 08:20:26 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0
00:07:14.352 08:20:26 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']'
00:07:14.352 08:20:26 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0
00:07:14.352 08:20:26 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM
00:07:14.610 08:20:27 event.app_repeat -- event/event.sh@35 -- # sleep 3
00:07:15.987 [2024-07-23 08:20:28.470062] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:07:16.247 [2024-07-23 08:20:28.677968] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:07:16.247 [2024-07-23 08:20:28.677968] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:07:16.506 [2024-07-23 08:20:28.907444] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered.
00:07:16.506 [2024-07-23 08:20:28.907488] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered.
00:07:17.883 08:20:30 event.app_repeat -- event/event.sh@38 -- # waitforlisten 1343735 /var/tmp/spdk-nbd.sock
00:07:17.883 08:20:30 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 1343735 ']'
00:07:17.883 08:20:30 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock
00:07:17.883 08:20:30 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100
00:07:17.883 08:20:30 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...'
00:07:17.883 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...
00:07:17.883 08:20:30 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable
00:07:17.883 08:20:30 event.app_repeat -- common/autotest_common.sh@10 -- # set +x
00:07:17.883 08:20:30 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:07:17.883 08:20:30 event.app_repeat -- common/autotest_common.sh@862 -- # return 0
00:07:17.883 08:20:30 event.app_repeat -- event/event.sh@39 -- # killprocess 1343735
00:07:17.883 08:20:30 event.app_repeat -- common/autotest_common.sh@948 -- # '[' -z 1343735 ']'
00:07:17.883 08:20:30 event.app_repeat -- common/autotest_common.sh@952 -- # kill -0 1343735
00:07:17.883 08:20:30 event.app_repeat -- common/autotest_common.sh@953 -- # uname
00:07:17.883 08:20:30 event.app_repeat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:07:17.883 08:20:30 event.app_repeat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1343735
00:07:17.883 08:20:30 event.app_repeat -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:07:17.883 08:20:30 event.app_repeat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:07:17.883 08:20:30 event.app_repeat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1343735'
00:07:17.883 killing process with pid 1343735
00:07:17.883 08:20:30 event.app_repeat -- common/autotest_common.sh@967 -- # kill 1343735
00:07:17.883 08:20:30 event.app_repeat -- common/autotest_common.sh@972 -- # wait 1343735
00:07:19.261 spdk_app_start is called in Round 0.
00:07:19.261 Shutdown signal received, stop current app iteration
00:07:19.261 Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 reinitialization...
00:07:19.261 spdk_app_start is called in Round 1.
00:07:19.261 Shutdown signal received, stop current app iteration
00:07:19.261 Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 reinitialization...
00:07:19.261 spdk_app_start is called in Round 2.
00:07:19.261 Shutdown signal received, stop current app iteration
00:07:19.261 Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 reinitialization...
00:07:19.261 spdk_app_start is called in Round 3.
00:07:19.261 Shutdown signal received, stop current app iteration
00:07:19.261 08:20:31 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT
00:07:19.261 08:20:31 event.app_repeat -- event/event.sh@42 -- # return 0
00:07:19.261
00:07:19.261 real 0m18.243s
00:07:19.261 user 0m36.872s
00:07:19.261 sys 0m2.426s
00:07:19.261 08:20:31 event.app_repeat -- common/autotest_common.sh@1124 -- # xtrace_disable
00:07:19.261 08:20:31 event.app_repeat -- common/autotest_common.sh@10 -- # set +x
00:07:19.261 ************************************
00:07:19.261 END TEST app_repeat
00:07:19.261 ************************************
00:07:19.261 08:20:31 event -- common/autotest_common.sh@1142 -- # return 0
00:07:19.261 08:20:31 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 ))
00:07:19.261
00:07:19.261 real 0m29.736s
00:07:19.261 user 0m56.624s
00:07:19.261 sys 0m3.640s
00:07:19.261 08:20:31 event -- common/autotest_common.sh@1124 -- # xtrace_disable
00:07:19.261 08:20:31 event -- common/autotest_common.sh@10 -- # set +x
00:07:19.261 ************************************
00:07:19.261 END TEST event
00:07:19.261 ************************************
00:07:19.261 08:20:31 -- common/autotest_common.sh@1142 -- # return 0
00:07:19.261 08:20:31 -- spdk/autotest.sh@182 -- # run_test thread /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh
00:07:19.261 08:20:31 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:07:19.261 08:20:31 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:07:19.261 08:20:31 -- common/autotest_common.sh@10 -- # set +x
00:07:19.261 ************************************
00:07:19.261 START TEST thread
00:07:19.261 ************************************
00:07:19.261 08:20:31 thread -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh
00:07:19.261 * Looking for test storage...
00:07:19.261 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread
00:07:19.261 08:20:31 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1
00:07:19.261 08:20:31 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']'
00:07:19.261 08:20:31 thread -- common/autotest_common.sh@1105 -- # xtrace_disable
00:07:19.261 08:20:31 thread -- common/autotest_common.sh@10 -- # set +x
00:07:19.261 ************************************
00:07:19.261 START TEST thread_poller_perf
00:07:19.261 ************************************
00:07:19.261 08:20:31 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1
00:07:19.261 [2024-07-23 08:20:31.741169] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization...
00:07:19.261 [2024-07-23 08:20:31.741249] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1347344 ]
00:07:19.520 [2024-07-23 08:20:31.870022] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:19.779 [2024-07-23 08:20:32.086953] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:07:19.779 Running 1000 pollers for 1 seconds with 1 microseconds period.
00:07:21.156 ======================================
00:07:21.156 busy:2109749300 (cyc)
00:07:21.156 total_run_count: 398000
00:07:21.156 tsc_hz: 2100000000 (cyc)
00:07:21.156 ======================================
00:07:21.156 poller_cost: 5300 (cyc), 2523 (nsec)
00:07:21.156
00:07:21.156 real 0m1.824s
00:07:21.156 user 0m1.662s
00:07:21.156 sys 0m0.154s
00:07:21.156 08:20:33 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable
00:07:21.156 08:20:33 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x
00:07:21.156 ************************************
00:07:21.156 END TEST thread_poller_perf
00:07:21.156 ************************************
00:07:21.156 08:20:33 thread -- common/autotest_common.sh@1142 -- # return 0
00:07:21.156 08:20:33 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1
00:07:21.156 08:20:33 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']'
00:07:21.156 08:20:33 thread -- common/autotest_common.sh@1105 -- # xtrace_disable
00:07:21.156 08:20:33 thread -- common/autotest_common.sh@10 -- # set +x
00:07:21.156 ************************************
00:07:21.156 START TEST thread_poller_perf
00:07:21.156 ************************************
00:07:21.156 08:20:33 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1
00:07:21.156 [2024-07-23 08:20:33.639445] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization...
00:07:21.156 [2024-07-23 08:20:33.639522] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1347782 ]
00:07:21.415 [2024-07-23 08:20:33.760033] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:21.674 [2024-07-23 08:20:33.964936] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:07:21.674 Running 1000 pollers for 1 seconds with 0 microseconds period.
00:07:23.052 ======================================
00:07:23.052 busy:2102258510 (cyc)
00:07:23.052 total_run_count: 5090000
00:07:23.052 tsc_hz: 2100000000 (cyc)
00:07:23.052 ======================================
00:07:23.052 poller_cost: 413 (cyc), 196 (nsec)
00:07:23.052
00:07:23.052 real 0m1.772s
00:07:23.052 user 0m1.615s
00:07:23.052 sys 0m0.149s
00:07:23.052 08:20:35 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable
00:07:23.052 08:20:35 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x
00:07:23.052 ************************************
00:07:23.052 END TEST thread_poller_perf
00:07:23.052 ************************************
00:07:23.052 08:20:35 thread -- common/autotest_common.sh@1142 -- # return 0
00:07:23.052 08:20:35 thread -- thread/thread.sh@17 -- # [[ y != \y ]]
00:07:23.052
00:07:23.052 real 0m3.809s
00:07:23.052 user 0m3.379s
00:07:23.052 sys 0m0.429s
00:07:23.052 08:20:35 thread -- common/autotest_common.sh@1124 -- # xtrace_disable
00:07:23.052 08:20:35 thread -- common/autotest_common.sh@10 -- # set +x
00:07:23.052 ************************************
00:07:23.052 END TEST thread
00:07:23.052 ************************************
00:07:23.052 08:20:35 -- common/autotest_common.sh@1142 -- # return 0
00:07:23.052 08:20:35 -- spdk/autotest.sh@183 -- # run_test accel /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh
00:07:23.052 08:20:35 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:07:23.052 08:20:35 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:07:23.052 08:20:35 -- common/autotest_common.sh@10 -- # set +x
00:07:23.052 ************************************
00:07:23.052 START TEST accel
00:07:23.052 ************************************
00:07:23.052 08:20:35 accel -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh
00:07:23.052 * Looking for test storage...
00:07:23.052 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel
00:07:23.052 08:20:35 accel -- accel/accel.sh@81 -- # declare -A expected_opcs
00:07:23.052 08:20:35 accel -- accel/accel.sh@82 -- # get_expected_opcs
00:07:23.052 08:20:35 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR
00:07:23.052 08:20:35 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=1348118
00:07:23.052 08:20:35 accel -- accel/accel.sh@63 -- # waitforlisten 1348118
00:07:23.052 08:20:35 accel -- common/autotest_common.sh@829 -- # '[' -z 1348118 ']'
00:07:23.052 08:20:35 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:07:23.052 08:20:35 accel -- common/autotest_common.sh@834 -- # local max_retries=100
00:07:23.052 08:20:35 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:07:23.052 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:07:23.052 08:20:35 accel -- common/autotest_common.sh@838 -- # xtrace_disable
00:07:23.052 08:20:35 accel -- common/autotest_common.sh@10 -- # set +x
00:07:23.052 08:20:35 accel -- accel/accel.sh@61 -- # build_accel_config
00:07:23.052 08:20:35 accel -- accel/accel.sh@31 -- # accel_json_cfg=()
00:07:23.052 08:20:35 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63
00:07:23.052 08:20:35 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:07:23.052 08:20:35 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:07:23.052 08:20:35 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:07:23.052 08:20:35 accel -- accel/accel.sh@36 -- # [[ -n '' ]]
00:07:23.052 08:20:35 accel -- accel/accel.sh@40 -- # local IFS=,
00:07:23.052 08:20:35 accel -- accel/accel.sh@41 -- # jq -r .
00:07:23.312 [2024-07-23 08:20:35.614891] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization...
00:07:23.312 [2024-07-23 08:20:35.615000] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1348118 ]
00:07:23.312 [2024-07-23 08:20:35.737064] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:23.571 [2024-07-23 08:20:35.953371] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:07:24.510 08:20:36 accel -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:07:24.510 08:20:36 accel -- common/autotest_common.sh@862 -- # return 0
00:07:24.510 08:20:36 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]]
00:07:24.510 08:20:36 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]]
00:07:24.510 08:20:36 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]]
00:07:24.510 08:20:36 accel -- accel/accel.sh@68 -- # [[ -n '' ]]
00:07:24.510 08:20:36 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]"))
00:07:24.510 08:20:36 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments
00:07:24.510 08:20:36 accel -- accel/accel.sh@70 -- # jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]'
00:07:24.510 08:20:36 accel -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:24.510 08:20:36 accel -- common/autotest_common.sh@10 -- # set +x
00:07:24.510 08:20:36 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:24.510 08:20:36 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}"
00:07:24.510 08:20:36 accel -- accel/accel.sh@72 -- # IFS==
00:07:24.510 08:20:36 accel -- accel/accel.sh@72 -- # read -r opc module
00:07:24.510 08:20:36 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software
00:07:24.510 08:20:36 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}"
00:07:24.510 08:20:36 accel -- accel/accel.sh@72 -- # IFS==
00:07:24.510 08:20:36 accel -- accel/accel.sh@72 -- # read -r opc module
00:07:24.510 08:20:36 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software
00:07:24.510 08:20:36 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}"
00:07:24.510 08:20:36 accel -- accel/accel.sh@72 -- # IFS==
00:07:24.510 08:20:36 accel -- accel/accel.sh@72 -- # read -r opc module
00:07:24.510 08:20:36 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software
00:07:24.510 08:20:36 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}"
00:07:24.510 08:20:36 accel -- accel/accel.sh@72 -- # IFS==
00:07:24.510 08:20:36 accel -- accel/accel.sh@72 -- # read -r opc module
00:07:24.510 08:20:36 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software
00:07:24.510 08:20:36 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}"
00:07:24.510 08:20:36 accel -- accel/accel.sh@72 -- # IFS==
00:07:24.510 08:20:36 accel -- accel/accel.sh@72 -- # read -r opc module
00:07:24.510 08:20:36 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software
00:07:24.510 08:20:36 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}"
00:07:24.510 08:20:36 accel -- accel/accel.sh@72 -- # IFS==
00:07:24.510 08:20:36 accel -- accel/accel.sh@72 -- # read -r opc module
00:07:24.510 08:20:36 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software
00:07:24.510 08:20:36 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}"
00:07:24.510 08:20:36 accel -- accel/accel.sh@72 -- # IFS==
00:07:24.510 08:20:36 accel -- accel/accel.sh@72 -- # read -r opc module
00:07:24.510 08:20:36 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software
00:07:24.510 08:20:36 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}"
00:07:24.510 08:20:36 accel -- accel/accel.sh@72 -- # IFS==
00:07:24.510 08:20:36 accel -- accel/accel.sh@72 -- # read -r opc module
00:07:24.510 08:20:36 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software
00:07:24.510 08:20:36 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}"
00:07:24.510 08:20:36 accel -- accel/accel.sh@72 -- # IFS==
00:07:24.510 08:20:36 accel -- accel/accel.sh@72 -- # read -r opc module
00:07:24.510 08:20:36 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software
00:07:24.510 08:20:36 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}"
00:07:24.510 08:20:36 accel -- accel/accel.sh@72 -- # IFS==
00:07:24.510 08:20:36 accel -- accel/accel.sh@72 -- # read -r opc module
00:07:24.510 08:20:36 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software
00:07:24.510 08:20:36 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}"
00:07:24.510 08:20:36 accel -- accel/accel.sh@72 -- # IFS==
00:07:24.510 08:20:36 accel -- accel/accel.sh@72 -- # read -r opc module
00:07:24.510 08:20:36 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software
00:07:24.510 08:20:36 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}"
00:07:24.510 08:20:36 accel -- accel/accel.sh@72 -- # IFS==
00:07:24.510 08:20:36 accel -- accel/accel.sh@72 -- # read -r opc module
00:07:24.510 08:20:36 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software
00:07:24.510 08:20:36 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}"
00:07:24.510 08:20:36 accel -- accel/accel.sh@72 -- # IFS==
00:07:24.510 08:20:36 accel -- accel/accel.sh@72 -- # read -r opc module
00:07:24.510 08:20:36 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software
00:07:24.510 08:20:36 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}"
00:07:24.510 08:20:36 accel -- accel/accel.sh@72 -- # IFS==
00:07:24.510 08:20:36 accel -- accel/accel.sh@72 -- # read -r opc module
00:07:24.510 08:20:36 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software
00:07:24.510 08:20:36 accel -- accel/accel.sh@75 -- # killprocess 1348118
00:07:24.510 08:20:36 accel -- common/autotest_common.sh@948 -- # '[' -z 1348118 ']'
00:07:24.510 08:20:36 accel -- common/autotest_common.sh@952 -- # kill -0 1348118
00:07:24.510 08:20:36 accel -- common/autotest_common.sh@953 -- # uname
00:07:24.510 08:20:36 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:07:24.510 08:20:36 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1348118
00:07:24.510 08:20:36 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:07:24.510 08:20:36 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:07:24.510 08:20:36 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1348118'
00:07:24.510 killing process with pid 1348118
00:07:24.510 08:20:36 accel -- common/autotest_common.sh@967 -- # kill 1348118
00:07:24.510 08:20:36 accel -- common/autotest_common.sh@972 -- # wait 1348118
00:07:27.047 08:20:39 accel -- accel/accel.sh@76 -- # trap - ERR
00:07:27.047 08:20:39 accel -- accel/accel.sh@89 -- # run_test accel_help accel_perf -h
00:07:27.047 08:20:39 accel -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']'
00:07:27.047 08:20:39 accel -- common/autotest_common.sh@1105 -- # xtrace_disable
00:07:27.047 08:20:39 accel -- common/autotest_common.sh@10 -- # set +x
00:07:27.047 08:20:39 accel.accel_help -- common/autotest_common.sh@1123 -- # accel_perf -h
00:07:27.047 08:20:39 accel.accel_help -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h
00:07:27.047 08:20:39 accel.accel_help -- accel/accel.sh@12 -- # build_accel_config
00:07:27.047 08:20:39 accel.accel_help -- accel/accel.sh@31 -- # accel_json_cfg=()
00:07:27.047 08:20:39 accel.accel_help -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:07:27.047 08:20:39 accel.accel_help -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:07:27.047 08:20:39 accel.accel_help -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:07:27.047 08:20:39 accel.accel_help -- accel/accel.sh@36 -- # [[ -n '' ]]
00:07:27.047 08:20:39 accel.accel_help -- accel/accel.sh@40 -- # local IFS=,
00:07:27.047 08:20:39 accel.accel_help -- accel/accel.sh@41 -- # jq -r .
00:07:27.047 08:20:39 accel.accel_help -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:27.048 08:20:39 accel.accel_help -- common/autotest_common.sh@10 -- # set +x 00:07:27.048 08:20:39 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:27.048 08:20:39 accel -- accel/accel.sh@91 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:07:27.048 08:20:39 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:27.048 08:20:39 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:27.048 08:20:39 accel -- common/autotest_common.sh@10 -- # set +x 00:07:27.307 ************************************ 00:07:27.307 START TEST accel_missing_filename 00:07:27.307 ************************************ 00:07:27.307 08:20:39 accel.accel_missing_filename -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress 00:07:27.307 08:20:39 accel.accel_missing_filename -- common/autotest_common.sh@648 -- # local es=0 00:07:27.307 08:20:39 accel.accel_missing_filename -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress 00:07:27.307 08:20:39 accel.accel_missing_filename -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:27.307 08:20:39 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:27.307 08:20:39 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:27.307 08:20:39 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:27.307 08:20:39 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress 00:07:27.307 08:20:39 accel.accel_missing_filename -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:07:27.307 08:20:39 accel.accel_missing_filename -- accel/accel.sh@12 -- # build_accel_config 00:07:27.307 08:20:39 
accel.accel_missing_filename -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:27.307 08:20:39 accel.accel_missing_filename -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:27.307 08:20:39 accel.accel_missing_filename -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:27.307 08:20:39 accel.accel_missing_filename -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:27.307 08:20:39 accel.accel_missing_filename -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:27.307 08:20:39 accel.accel_missing_filename -- accel/accel.sh@40 -- # local IFS=, 00:07:27.307 08:20:39 accel.accel_missing_filename -- accel/accel.sh@41 -- # jq -r . 00:07:27.307 [2024-07-23 08:20:39.618914] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:07:27.307 [2024-07-23 08:20:39.619009] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1348942 ] 00:07:27.307 [2024-07-23 08:20:39.753974] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:27.573 [2024-07-23 08:20:39.971357] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:27.832 [2024-07-23 08:20:40.219420] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:28.399 [2024-07-23 08:20:40.766424] accel_perf.c:1463:main: *ERROR*: ERROR starting application 00:07:28.966 A filename is required. 
00:07:28.967 08:20:41 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # es=234 00:07:28.967 08:20:41 accel.accel_missing_filename -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:28.967 08:20:41 accel.accel_missing_filename -- common/autotest_common.sh@660 -- # es=106 00:07:28.967 08:20:41 accel.accel_missing_filename -- common/autotest_common.sh@661 -- # case "$es" in 00:07:28.967 08:20:41 accel.accel_missing_filename -- common/autotest_common.sh@668 -- # es=1 00:07:28.967 08:20:41 accel.accel_missing_filename -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:28.967 00:07:28.967 real 0m1.615s 00:07:28.967 user 0m1.426s 00:07:28.967 sys 0m0.210s 00:07:28.967 08:20:41 accel.accel_missing_filename -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:28.967 08:20:41 accel.accel_missing_filename -- common/autotest_common.sh@10 -- # set +x 00:07:28.967 ************************************ 00:07:28.967 END TEST accel_missing_filename 00:07:28.967 ************************************ 00:07:28.967 08:20:41 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:28.967 08:20:41 accel -- accel/accel.sh@93 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:28.967 08:20:41 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:07:28.967 08:20:41 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:28.967 08:20:41 accel -- common/autotest_common.sh@10 -- # set +x 00:07:28.967 ************************************ 00:07:28.967 START TEST accel_compress_verify 00:07:28.967 ************************************ 00:07:28.967 08:20:41 accel.accel_compress_verify -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:28.967 08:20:41 accel.accel_compress_verify -- common/autotest_common.sh@648 -- # local es=0 00:07:28.967 08:20:41 
accel.accel_compress_verify -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:28.967 08:20:41 accel.accel_compress_verify -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:28.967 08:20:41 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:28.967 08:20:41 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:28.967 08:20:41 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:28.967 08:20:41 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:28.967 08:20:41 accel.accel_compress_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:28.967 08:20:41 accel.accel_compress_verify -- accel/accel.sh@12 -- # build_accel_config 00:07:28.967 08:20:41 accel.accel_compress_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:28.967 08:20:41 accel.accel_compress_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:28.967 08:20:41 accel.accel_compress_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:28.967 08:20:41 accel.accel_compress_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:28.967 08:20:41 accel.accel_compress_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:28.967 08:20:41 accel.accel_compress_verify -- accel/accel.sh@40 -- # local IFS=, 00:07:28.967 08:20:41 accel.accel_compress_verify -- accel/accel.sh@41 -- # jq -r . 00:07:28.967 [2024-07-23 08:20:41.303683] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:07:28.967 [2024-07-23 08:20:41.303760] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1349397 ] 00:07:28.967 [2024-07-23 08:20:41.426135] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:29.225 [2024-07-23 08:20:41.639661] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:29.484 [2024-07-23 08:20:41.891035] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:30.051 [2024-07-23 08:20:42.453499] accel_perf.c:1463:main: *ERROR*: ERROR starting application 00:07:30.620 00:07:30.620 Compression does not support the verify option, aborting. 00:07:30.620 08:20:42 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # es=161 00:07:30.620 08:20:42 accel.accel_compress_verify -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:30.620 08:20:42 accel.accel_compress_verify -- common/autotest_common.sh@660 -- # es=33 00:07:30.620 08:20:42 accel.accel_compress_verify -- common/autotest_common.sh@661 -- # case "$es" in 00:07:30.620 08:20:42 accel.accel_compress_verify -- common/autotest_common.sh@668 -- # es=1 00:07:30.620 08:20:42 accel.accel_compress_verify -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:30.620 00:07:30.620 real 0m1.616s 00:07:30.620 user 0m1.414s 00:07:30.620 sys 0m0.209s 00:07:30.620 08:20:42 accel.accel_compress_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:30.620 08:20:42 accel.accel_compress_verify -- common/autotest_common.sh@10 -- # set +x 00:07:30.620 ************************************ 00:07:30.620 END TEST accel_compress_verify 00:07:30.620 ************************************ 00:07:30.620 08:20:42 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:30.620 08:20:42 accel -- accel/accel.sh@95 -- # run_test accel_wrong_workload NOT 
accel_perf -t 1 -w foobar 00:07:30.620 08:20:42 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:30.620 08:20:42 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:30.620 08:20:42 accel -- common/autotest_common.sh@10 -- # set +x 00:07:30.620 ************************************ 00:07:30.620 START TEST accel_wrong_workload 00:07:30.620 ************************************ 00:07:30.620 08:20:42 accel.accel_wrong_workload -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w foobar 00:07:30.620 08:20:42 accel.accel_wrong_workload -- common/autotest_common.sh@648 -- # local es=0 00:07:30.620 08:20:42 accel.accel_wrong_workload -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:07:30.620 08:20:42 accel.accel_wrong_workload -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:30.620 08:20:42 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:30.620 08:20:42 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:30.620 08:20:42 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:30.620 08:20:42 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w foobar 00:07:30.620 08:20:42 accel.accel_wrong_workload -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:07:30.620 08:20:42 accel.accel_wrong_workload -- accel/accel.sh@12 -- # build_accel_config 00:07:30.620 08:20:42 accel.accel_wrong_workload -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:30.620 08:20:42 accel.accel_wrong_workload -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:30.620 08:20:42 accel.accel_wrong_workload -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:30.620 08:20:42 accel.accel_wrong_workload -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:30.620 08:20:42 accel.accel_wrong_workload -- 
accel/accel.sh@36 -- # [[ -n '' ]] 00:07:30.620 08:20:42 accel.accel_wrong_workload -- accel/accel.sh@40 -- # local IFS=, 00:07:30.620 08:20:42 accel.accel_wrong_workload -- accel/accel.sh@41 -- # jq -r . 00:07:30.620 Unsupported workload type: foobar 00:07:30.620 [2024-07-23 08:20:42.973007] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:07:30.620 accel_perf options: 00:07:30.620 [-h help message] 00:07:30.620 [-q queue depth per core] 00:07:30.620 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:07:30.620 [-T number of threads per core 00:07:30.620 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:07:30.620 [-t time in seconds] 00:07:30.620 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:07:30.620 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:07:30.620 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:07:30.620 [-l for compress/decompress workloads, name of uncompressed input file 00:07:30.620 [-S for crc32c workload, use this seed value (default 0) 00:07:30.620 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:07:30.620 [-f for fill workload, use this BYTE value (default 255) 00:07:30.620 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:07:30.620 [-y verify result if this switch is on] 00:07:30.620 [-a tasks to allocate per core (default: same value as -q)] 00:07:30.620 Can be used to spread operations across a wider range of memory. 
00:07:30.620 08:20:42 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # es=1 00:07:30.620 08:20:42 accel.accel_wrong_workload -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:30.620 08:20:42 accel.accel_wrong_workload -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:07:30.620 08:20:42 accel.accel_wrong_workload -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:30.620 00:07:30.620 real 0m0.068s 00:07:30.620 user 0m0.070s 00:07:30.620 sys 0m0.039s 00:07:30.620 08:20:42 accel.accel_wrong_workload -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:30.620 08:20:42 accel.accel_wrong_workload -- common/autotest_common.sh@10 -- # set +x 00:07:30.620 ************************************ 00:07:30.620 END TEST accel_wrong_workload 00:07:30.620 ************************************ 00:07:30.620 08:20:43 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:30.620 08:20:43 accel -- accel/accel.sh@97 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:07:30.620 08:20:43 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:07:30.620 08:20:43 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:30.620 08:20:43 accel -- common/autotest_common.sh@10 -- # set +x 00:07:30.620 ************************************ 00:07:30.620 START TEST accel_negative_buffers 00:07:30.620 ************************************ 00:07:30.620 08:20:43 accel.accel_negative_buffers -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:07:30.620 08:20:43 accel.accel_negative_buffers -- common/autotest_common.sh@648 -- # local es=0 00:07:30.620 08:20:43 accel.accel_negative_buffers -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:07:30.620 08:20:43 accel.accel_negative_buffers -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:30.620 08:20:43 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 
00:07:30.620 08:20:43 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:30.620 08:20:43 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:30.620 08:20:43 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w xor -y -x -1 00:07:30.620 08:20:43 accel.accel_negative_buffers -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1 00:07:30.620 08:20:43 accel.accel_negative_buffers -- accel/accel.sh@12 -- # build_accel_config 00:07:30.620 08:20:43 accel.accel_negative_buffers -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:30.620 08:20:43 accel.accel_negative_buffers -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:30.620 08:20:43 accel.accel_negative_buffers -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:30.620 08:20:43 accel.accel_negative_buffers -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:30.620 08:20:43 accel.accel_negative_buffers -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:30.620 08:20:43 accel.accel_negative_buffers -- accel/accel.sh@40 -- # local IFS=, 00:07:30.620 08:20:43 accel.accel_negative_buffers -- accel/accel.sh@41 -- # jq -r . 00:07:30.620 -x option must be non-negative. 00:07:30.620 [2024-07-23 08:20:43.100717] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:07:30.620 accel_perf options: 00:07:30.620 [-h help message] 00:07:30.620 [-q queue depth per core] 00:07:30.620 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:07:30.620 [-T number of threads per core 00:07:30.620 [-o transfer size in bytes (default: 4KiB. 
For compress/decompress, 0 means the input file size)] 00:07:30.620 [-t time in seconds] 00:07:30.620 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:07:30.620 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:07:30.620 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:07:30.621 [-l for compress/decompress workloads, name of uncompressed input file 00:07:30.621 [-S for crc32c workload, use this seed value (default 0) 00:07:30.621 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:07:30.621 [-f for fill workload, use this BYTE value (default 255) 00:07:30.621 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:07:30.621 [-y verify result if this switch is on] 00:07:30.621 [-a tasks to allocate per core (default: same value as -q)] 00:07:30.621 Can be used to spread operations across a wider range of memory. 
00:07:30.621 08:20:43 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # es=1 00:07:30.621 08:20:43 accel.accel_negative_buffers -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:30.621 08:20:43 accel.accel_negative_buffers -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:07:30.621 08:20:43 accel.accel_negative_buffers -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:30.621 00:07:30.621 real 0m0.071s 00:07:30.621 user 0m0.072s 00:07:30.621 sys 0m0.040s 00:07:30.621 08:20:43 accel.accel_negative_buffers -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:30.621 08:20:43 accel.accel_negative_buffers -- common/autotest_common.sh@10 -- # set +x 00:07:30.621 ************************************ 00:07:30.621 END TEST accel_negative_buffers 00:07:30.621 ************************************ 00:07:30.883 08:20:43 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:30.883 08:20:43 accel -- accel/accel.sh@101 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:07:30.883 08:20:43 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:30.883 08:20:43 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:30.883 08:20:43 accel -- common/autotest_common.sh@10 -- # set +x 00:07:30.883 ************************************ 00:07:30.883 START TEST accel_crc32c 00:07:30.883 ************************************ 00:07:30.883 08:20:43 accel.accel_crc32c -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -S 32 -y 00:07:30.883 08:20:43 accel.accel_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:07:30.883 08:20:43 accel.accel_crc32c -- accel/accel.sh@17 -- # local accel_module 00:07:30.883 08:20:43 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:30.883 08:20:43 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:30.883 08:20:43 accel.accel_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:07:30.883 08:20:43 accel.accel_crc32c -- 
accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:07:30.883 08:20:43 accel.accel_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:07:30.883 08:20:43 accel.accel_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:30.883 08:20:43 accel.accel_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:30.883 08:20:43 accel.accel_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:30.883 08:20:43 accel.accel_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:30.883 08:20:43 accel.accel_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:30.883 08:20:43 accel.accel_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:07:30.883 08:20:43 accel.accel_crc32c -- accel/accel.sh@41 -- # jq -r . 00:07:30.883 [2024-07-23 08:20:43.235470] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:07:30.883 [2024-07-23 08:20:43.235542] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1349729 ] 00:07:30.883 [2024-07-23 08:20:43.358333] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:31.143 [2024-07-23 08:20:43.575622] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:31.403 08:20:43 accel.accel_crc32c -- 
accel/accel.sh@19 -- # read -r var val 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@20 -- # val=0x1 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@20 -- # val=crc32c 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@23 -- # accel_opc=crc32c 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r 
var val 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@20 -- # val=software 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@20 -- # val=1 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:31.403 
08:20:43 accel.accel_crc32c -- accel/accel.sh@20 -- # val=Yes 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:31.403 08:20:43 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:33.305 08:20:45 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:33.305 08:20:45 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:33.305 08:20:45 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:33.305 08:20:45 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:33.305 08:20:45 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:33.305 08:20:45 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:33.305 08:20:45 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:33.305 08:20:45 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:33.305 08:20:45 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:33.305 08:20:45 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:33.305 08:20:45 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:33.305 08:20:45 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:33.305 08:20:45 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:33.305 08:20:45 accel.accel_crc32c -- accel/accel.sh@21 -- # case 
"$var" in 00:07:33.305 08:20:45 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:33.305 08:20:45 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:33.305 08:20:45 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:33.305 08:20:45 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:33.305 08:20:45 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:33.305 08:20:45 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:33.305 08:20:45 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:33.305 08:20:45 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:33.305 08:20:45 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:33.305 08:20:45 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:33.565 08:20:45 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:33.565 08:20:45 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:07:33.565 08:20:45 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:33.565 00:07:33.565 real 0m2.643s 00:07:33.565 user 0m2.436s 00:07:33.565 sys 0m0.195s 00:07:33.565 08:20:45 accel.accel_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:33.565 08:20:45 accel.accel_crc32c -- common/autotest_common.sh@10 -- # set +x 00:07:33.565 ************************************ 00:07:33.565 END TEST accel_crc32c 00:07:33.565 ************************************ 00:07:33.565 08:20:45 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:33.565 08:20:45 accel -- accel/accel.sh@102 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:07:33.565 08:20:45 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:33.565 08:20:45 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:33.565 08:20:45 accel -- common/autotest_common.sh@10 -- # set +x 00:07:33.565 ************************************ 00:07:33.565 START TEST accel_crc32c_C2 00:07:33.565 
************************************ 00:07:33.565 08:20:45 accel.accel_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -y -C 2 00:07:33.565 08:20:45 accel.accel_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:07:33.565 08:20:45 accel.accel_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:07:33.565 08:20:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:33.565 08:20:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:33.565 08:20:45 accel.accel_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:07:33.565 08:20:45 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:07:33.565 08:20:45 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:07:33.565 08:20:45 accel.accel_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:33.565 08:20:45 accel.accel_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:33.565 08:20:45 accel.accel_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:33.565 08:20:45 accel.accel_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:33.565 08:20:45 accel.accel_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:33.565 08:20:45 accel.accel_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:07:33.565 08:20:45 accel.accel_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:07:33.565 [2024-07-23 08:20:45.936399] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:07:33.565 [2024-07-23 08:20:45.936469] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1350250 ] 00:07:33.565 [2024-07-23 08:20:46.058783] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:33.823 [2024-07-23 08:20:46.289191] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.082 08:20:46 
accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=crc32c 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:34.082 08:20:46 accel.accel_crc32c_C2 
-- accel/accel.sh@20 -- # val=32 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:34.082 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:34.083 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.083 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:34.083 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:34.083 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:34.083 
08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.083 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:34.083 08:20:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:36.616 08:20:48 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:36.616 08:20:48 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.616 08:20:48 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:36.616 08:20:48 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:36.616 08:20:48 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:36.616 08:20:48 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.616 08:20:48 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:36.616 08:20:48 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:36.616 08:20:48 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:36.616 08:20:48 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.616 08:20:48 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:36.616 08:20:48 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:36.616 08:20:48 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:36.616 08:20:48 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.616 08:20:48 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:36.616 08:20:48 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:36.616 08:20:48 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:36.616 08:20:48 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.616 08:20:48 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:36.616 08:20:48 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:36.616 08:20:48 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:36.616 08:20:48 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case 
"$var" in 00:07:36.616 08:20:48 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:36.616 08:20:48 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:36.616 08:20:48 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:36.616 08:20:48 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:07:36.616 08:20:48 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:36.616 00:07:36.616 real 0m2.668s 00:07:36.616 user 0m2.444s 00:07:36.616 sys 0m0.215s 00:07:36.616 08:20:48 accel.accel_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:36.616 08:20:48 accel.accel_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:07:36.616 ************************************ 00:07:36.616 END TEST accel_crc32c_C2 00:07:36.616 ************************************ 00:07:36.616 08:20:48 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:36.616 08:20:48 accel -- accel/accel.sh@103 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:07:36.616 08:20:48 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:36.616 08:20:48 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:36.616 08:20:48 accel -- common/autotest_common.sh@10 -- # set +x 00:07:36.616 ************************************ 00:07:36.616 START TEST accel_copy 00:07:36.616 ************************************ 00:07:36.616 08:20:48 accel.accel_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy -y 00:07:36.616 08:20:48 accel.accel_copy -- accel/accel.sh@16 -- # local accel_opc 00:07:36.616 08:20:48 accel.accel_copy -- accel/accel.sh@17 -- # local accel_module 00:07:36.616 08:20:48 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:36.616 08:20:48 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:36.616 08:20:48 accel.accel_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:07:36.616 08:20:48 accel.accel_copy -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:07:36.616 08:20:48 accel.accel_copy -- accel/accel.sh@12 -- # build_accel_config 00:07:36.616 08:20:48 accel.accel_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:36.616 08:20:48 accel.accel_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:36.616 08:20:48 accel.accel_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:36.616 08:20:48 accel.accel_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:36.616 08:20:48 accel.accel_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:36.616 08:20:48 accel.accel_copy -- accel/accel.sh@40 -- # local IFS=, 00:07:36.616 08:20:48 accel.accel_copy -- accel/accel.sh@41 -- # jq -r . 00:07:36.616 [2024-07-23 08:20:48.659994] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:07:36.616 [2024-07-23 08:20:48.660063] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1350772 ] 00:07:36.616 [2024-07-23 08:20:48.781801] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:36.616 [2024-07-23 08:20:49.017017] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:36.875 08:20:49 
accel.accel_copy -- accel/accel.sh@20 -- # val=0x1 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@20 -- # val=copy 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@23 -- # accel_opc=copy 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@20 -- # val=software 00:07:36.875 08:20:49 
accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@22 -- # accel_module=software 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@20 -- # val=1 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@20 -- # val=Yes 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:36.875 08:20:49 
accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:36.875 08:20:49 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:38.782 08:20:51 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:38.782 08:20:51 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:38.782 08:20:51 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:38.782 08:20:51 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:38.782 08:20:51 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:38.782 08:20:51 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:38.782 08:20:51 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:38.782 08:20:51 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:38.782 08:20:51 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:38.782 08:20:51 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:38.782 08:20:51 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:38.782 08:20:51 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:38.782 08:20:51 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:38.782 08:20:51 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:38.782 08:20:51 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:38.782 08:20:51 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:38.782 08:20:51 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:38.782 08:20:51 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:38.782 08:20:51 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:38.782 08:20:51 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:38.782 08:20:51 accel.accel_copy -- 
accel/accel.sh@20 -- # val= 00:07:38.782 08:20:51 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:38.782 08:20:51 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:38.782 08:20:51 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:38.782 08:20:51 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:38.782 08:20:51 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n copy ]] 00:07:38.782 08:20:51 accel.accel_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:38.782 00:07:38.782 real 0m2.635s 00:07:38.782 user 0m0.010s 00:07:38.782 sys 0m0.002s 00:07:38.782 08:20:51 accel.accel_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:38.782 08:20:51 accel.accel_copy -- common/autotest_common.sh@10 -- # set +x 00:07:38.782 ************************************ 00:07:38.782 END TEST accel_copy 00:07:38.782 ************************************ 00:07:38.782 08:20:51 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:38.782 08:20:51 accel -- accel/accel.sh@104 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:38.782 08:20:51 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:07:38.782 08:20:51 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:38.782 08:20:51 accel -- common/autotest_common.sh@10 -- # set +x 00:07:39.042 ************************************ 00:07:39.042 START TEST accel_fill 00:07:39.042 ************************************ 00:07:39.042 08:20:51 accel.accel_fill -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:39.042 08:20:51 accel.accel_fill -- accel/accel.sh@16 -- # local accel_opc 00:07:39.042 08:20:51 accel.accel_fill -- accel/accel.sh@17 -- # local accel_module 00:07:39.042 08:20:51 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:39.042 08:20:51 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:39.042 08:20:51 accel.accel_fill -- accel/accel.sh@15 -- # 
accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:39.042 08:20:51 accel.accel_fill -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:39.042 08:20:51 accel.accel_fill -- accel/accel.sh@12 -- # build_accel_config 00:07:39.042 08:20:51 accel.accel_fill -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:39.042 08:20:51 accel.accel_fill -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:39.042 08:20:51 accel.accel_fill -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:39.042 08:20:51 accel.accel_fill -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:39.042 08:20:51 accel.accel_fill -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:39.042 08:20:51 accel.accel_fill -- accel/accel.sh@40 -- # local IFS=, 00:07:39.042 08:20:51 accel.accel_fill -- accel/accel.sh@41 -- # jq -r . 00:07:39.042 [2024-07-23 08:20:51.358430] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:07:39.042 [2024-07-23 08:20:51.358517] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1351294 ] 00:07:39.042 [2024-07-23 08:20:51.477939] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:39.301 [2024-07-23 08:20:51.720367] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:39.560 08:20:52 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:39.561 08:20:52 accel.accel_fill -- 
accel/accel.sh@19 -- # IFS=: 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@20 -- # val=0x1 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@20 -- # val=fill 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@23 -- # accel_opc=fill 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@20 -- # val=0x80 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:39.561 08:20:52 accel.accel_fill -- 
accel/accel.sh@19 -- # read -r var val 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@20 -- # val=software 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@22 -- # accel_module=software 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@20 -- # val=1 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@20 -- # val='1 seconds' 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:39.561 08:20:52 
accel.accel_fill -- accel/accel.sh@20 -- # val=Yes 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:39.561 08:20:52 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:41.465 08:20:53 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:41.465 08:20:53 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:41.465 08:20:53 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:41.465 08:20:53 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:41.465 08:20:53 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:41.465 08:20:53 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:41.465 08:20:53 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:41.465 08:20:53 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:41.465 08:20:53 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:41.465 08:20:53 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:41.465 08:20:53 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:41.465 08:20:53 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:41.465 08:20:53 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:41.465 08:20:53 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:41.465 08:20:53 accel.accel_fill -- 
accel/accel.sh@19 -- # IFS=: 00:07:41.465 08:20:53 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:41.465 08:20:53 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:41.465 08:20:53 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:41.465 08:20:53 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:41.465 08:20:53 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:41.465 08:20:53 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:41.465 08:20:53 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:41.465 08:20:53 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:41.465 08:20:53 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:41.724 08:20:53 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:41.724 08:20:53 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n fill ]] 00:07:41.724 08:20:53 accel.accel_fill -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:41.724 00:07:41.724 real 0m2.678s 00:07:41.724 user 0m2.409s 00:07:41.724 sys 0m0.213s 00:07:41.724 08:20:53 accel.accel_fill -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:41.724 08:20:53 accel.accel_fill -- common/autotest_common.sh@10 -- # set +x 00:07:41.724 ************************************ 00:07:41.724 END TEST accel_fill 00:07:41.724 ************************************ 00:07:41.724 08:20:54 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:41.724 08:20:54 accel -- accel/accel.sh@105 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:07:41.724 08:20:54 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:41.724 08:20:54 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:41.724 08:20:54 accel -- common/autotest_common.sh@10 -- # set +x 00:07:41.724 ************************************ 00:07:41.724 START TEST accel_copy_crc32c 00:07:41.724 ************************************ 00:07:41.724 08:20:54 accel.accel_copy_crc32c -- 
common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y 00:07:41.724 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:07:41.724 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@17 -- # local accel_module 00:07:41.724 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:41.724 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:41.724 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:07:41.724 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:07:41.724 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:07:41.724 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:41.724 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:41.724 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:41.724 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:41.724 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:41.724 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:07:41.724 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@41 -- # jq -r . 00:07:41.724 [2024-07-23 08:20:54.093786] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:07:41.724 [2024-07-23 08:20:54.093860] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1351820 ] 00:07:41.724 [2024-07-23 08:20:54.217837] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:41.998 [2024-07-23 08:20:54.454173] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:42.280 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:42.280 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:42.280 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:42.280 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:42.280 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:42.280 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:42.280 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:42.280 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:42.280 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0x1 00:07:42.280 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:42.280 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 
00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=copy_crc32c 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- 
accel/accel.sh@20 -- # val=software 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=1 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=Yes 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:42.281 08:20:54 
accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:42.281 08:20:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.187 08:20:56 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:44.187 08:20:56 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.187 08:20:56 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.187 08:20:56 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.187 08:20:56 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:44.187 08:20:56 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.187 08:20:56 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.187 08:20:56 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.187 08:20:56 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:44.187 08:20:56 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.187 08:20:56 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.187 08:20:56 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.187 08:20:56 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:44.187 08:20:56 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.187 08:20:56 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.187 08:20:56 
accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.187 08:20:56 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:44.187 08:20:56 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.187 08:20:56 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.187 08:20:56 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.187 08:20:56 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:44.187 08:20:56 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.187 08:20:56 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.187 08:20:56 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.446 08:20:56 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:44.446 08:20:56 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:07:44.446 08:20:56 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:44.446 00:07:44.446 real 0m2.660s 00:07:44.446 user 0m0.012s 00:07:44.446 sys 0m0.001s 00:07:44.446 08:20:56 accel.accel_copy_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:44.446 08:20:56 accel.accel_copy_crc32c -- common/autotest_common.sh@10 -- # set +x 00:07:44.446 ************************************ 00:07:44.446 END TEST accel_copy_crc32c 00:07:44.446 ************************************ 00:07:44.446 08:20:56 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:44.446 08:20:56 accel -- accel/accel.sh@106 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:07:44.446 08:20:56 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:44.446 08:20:56 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:44.446 08:20:56 accel -- common/autotest_common.sh@10 -- # set +x 00:07:44.446 ************************************ 00:07:44.446 START TEST accel_copy_crc32c_C2 00:07:44.446 
************************************ 00:07:44.446 08:20:56 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:07:44.446 08:20:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:07:44.446 08:20:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:07:44.446 08:20:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:44.447 08:20:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:44.447 08:20:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:07:44.447 08:20:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:07:44.447 08:20:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:07:44.447 08:20:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:44.447 08:20:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:44.447 08:20:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:44.447 08:20:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:44.447 08:20:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:44.447 08:20:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:07:44.447 08:20:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:07:44.447 [2024-07-23 08:20:56.827257] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:07:44.447 [2024-07-23 08:20:56.827338] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1352343 ] 00:07:44.447 [2024-07-23 08:20:56.958376] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:44.706 [2024-07-23 08:20:57.184002] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:44.966 08:20:57 
accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=copy_crc32c 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='8192 bytes' 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 
00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # 
val=Yes 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:44.966 08:20:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:46.871 08:20:59 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:46.871 08:20:59 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.871 08:20:59 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:46.871 08:20:59 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:46.871 08:20:59 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:46.871 08:20:59 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.871 08:20:59 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:46.871 08:20:59 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:46.871 08:20:59 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:46.871 08:20:59 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.871 08:20:59 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:46.871 08:20:59 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:46.871 
08:20:59 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:46.871 08:20:59 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.871 08:20:59 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:46.871 08:20:59 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:46.871 08:20:59 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:46.871 08:20:59 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.871 08:20:59 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:46.871 08:20:59 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:46.871 08:20:59 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:46.871 08:20:59 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.871 08:20:59 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:46.871 08:20:59 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:46.871 08:20:59 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:46.871 08:20:59 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:07:46.871 08:20:59 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:46.871 00:07:46.871 real 0m2.608s 00:07:46.871 user 0m0.011s 00:07:46.871 sys 0m0.003s 00:07:46.871 08:20:59 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:46.871 08:20:59 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:07:46.871 ************************************ 00:07:46.871 END TEST accel_copy_crc32c_C2 00:07:46.871 ************************************ 00:07:47.129 08:20:59 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:47.129 08:20:59 accel -- accel/accel.sh@107 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:07:47.129 08:20:59 accel -- common/autotest_common.sh@1099 -- # 
'[' 7 -le 1 ']' 00:07:47.129 08:20:59 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:47.129 08:20:59 accel -- common/autotest_common.sh@10 -- # set +x 00:07:47.129 ************************************ 00:07:47.129 START TEST accel_dualcast 00:07:47.129 ************************************ 00:07:47.129 08:20:59 accel.accel_dualcast -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dualcast -y 00:07:47.129 08:20:59 accel.accel_dualcast -- accel/accel.sh@16 -- # local accel_opc 00:07:47.129 08:20:59 accel.accel_dualcast -- accel/accel.sh@17 -- # local accel_module 00:07:47.129 08:20:59 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:47.129 08:20:59 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:47.129 08:20:59 accel.accel_dualcast -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:07:47.129 08:20:59 accel.accel_dualcast -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:07:47.129 08:20:59 accel.accel_dualcast -- accel/accel.sh@12 -- # build_accel_config 00:07:47.129 08:20:59 accel.accel_dualcast -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:47.129 08:20:59 accel.accel_dualcast -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:47.129 08:20:59 accel.accel_dualcast -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:47.129 08:20:59 accel.accel_dualcast -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:47.129 08:20:59 accel.accel_dualcast -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:47.129 08:20:59 accel.accel_dualcast -- accel/accel.sh@40 -- # local IFS=, 00:07:47.129 08:20:59 accel.accel_dualcast -- accel/accel.sh@41 -- # jq -r . 00:07:47.129 [2024-07-23 08:20:59.474470] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:07:47.129 [2024-07-23 08:20:59.474540] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1352867 ] 00:07:47.129 [2024-07-23 08:20:59.595675] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:47.388 [2024-07-23 08:20:59.842536] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@20 -- # val=0x1 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:47.647 08:21:00 accel.accel_dualcast -- 
accel/accel.sh@19 -- # IFS=: 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@20 -- # val=dualcast 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@23 -- # accel_opc=dualcast 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@20 -- # val=software 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@22 -- # accel_module=software 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 
00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@20 -- # val=1 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@20 -- # val='1 seconds' 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@20 -- # val=Yes 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:47.647 08:21:00 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:50.179 08:21:02 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:50.179 08:21:02 accel.accel_dualcast -- accel/accel.sh@21 -- # case 
"$var" in 00:07:50.179 08:21:02 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:50.179 08:21:02 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:50.179 08:21:02 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:50.179 08:21:02 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:50.179 08:21:02 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:50.179 08:21:02 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:50.179 08:21:02 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:50.179 08:21:02 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:50.179 08:21:02 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:50.179 08:21:02 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:50.179 08:21:02 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:50.179 08:21:02 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:50.179 08:21:02 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:50.179 08:21:02 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:50.179 08:21:02 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:50.179 08:21:02 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:50.179 08:21:02 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:50.179 08:21:02 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:50.179 08:21:02 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:50.179 08:21:02 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:50.179 08:21:02 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:50.179 08:21:02 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:50.179 08:21:02 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:50.179 08:21:02 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n dualcast ]] 00:07:50.179 08:21:02 accel.accel_dualcast -- accel/accel.sh@27 
-- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:50.179 00:07:50.179 real 0m2.673s 00:07:50.179 user 0m2.453s 00:07:50.179 sys 0m0.204s 00:07:50.179 08:21:02 accel.accel_dualcast -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:50.179 08:21:02 accel.accel_dualcast -- common/autotest_common.sh@10 -- # set +x 00:07:50.179 ************************************ 00:07:50.179 END TEST accel_dualcast 00:07:50.179 ************************************ 00:07:50.179 08:21:02 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:50.179 08:21:02 accel -- accel/accel.sh@108 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:07:50.179 08:21:02 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:50.179 08:21:02 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:50.179 08:21:02 accel -- common/autotest_common.sh@10 -- # set +x 00:07:50.179 ************************************ 00:07:50.179 START TEST accel_compare 00:07:50.179 ************************************ 00:07:50.179 08:21:02 accel.accel_compare -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compare -y 00:07:50.179 08:21:02 accel.accel_compare -- accel/accel.sh@16 -- # local accel_opc 00:07:50.179 08:21:02 accel.accel_compare -- accel/accel.sh@17 -- # local accel_module 00:07:50.179 08:21:02 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:50.179 08:21:02 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:50.179 08:21:02 accel.accel_compare -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:07:50.179 08:21:02 accel.accel_compare -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:07:50.179 08:21:02 accel.accel_compare -- accel/accel.sh@12 -- # build_accel_config 00:07:50.179 08:21:02 accel.accel_compare -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:50.179 08:21:02 accel.accel_compare -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:50.179 
08:21:02 accel.accel_compare -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:50.179 08:21:02 accel.accel_compare -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:50.179 08:21:02 accel.accel_compare -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:50.179 08:21:02 accel.accel_compare -- accel/accel.sh@40 -- # local IFS=, 00:07:50.179 08:21:02 accel.accel_compare -- accel/accel.sh@41 -- # jq -r . 00:07:50.179 [2024-07-23 08:21:02.197986] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:07:50.179 [2024-07-23 08:21:02.198067] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1353389 ] 00:07:50.179 [2024-07-23 08:21:02.322735] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:50.179 [2024-07-23 08:21:02.569722] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:50.438 08:21:02 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:50.438 08:21:02 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:50.438 08:21:02 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:50.438 08:21:02 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:50.438 08:21:02 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:50.438 08:21:02 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:50.438 08:21:02 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:50.438 08:21:02 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:50.438 08:21:02 accel.accel_compare -- accel/accel.sh@20 -- # val=0x1 00:07:50.438 08:21:02 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:50.438 08:21:02 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:50.438 08:21:02 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:50.438 08:21:02 
accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:50.438 08:21:02 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:50.438 08:21:02 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:50.438 08:21:02 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:50.438 08:21:02 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:50.438 08:21:02 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:50.438 08:21:02 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:50.438 08:21:02 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:50.438 08:21:02 accel.accel_compare -- accel/accel.sh@20 -- # val=compare 00:07:50.438 08:21:02 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:50.438 08:21:02 accel.accel_compare -- accel/accel.sh@23 -- # accel_opc=compare 00:07:50.438 08:21:02 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:50.438 08:21:02 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:50.438 08:21:02 accel.accel_compare -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:50.438 08:21:02 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:50.438 08:21:02 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:50.438 08:21:02 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:50.439 08:21:02 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:50.439 08:21:02 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:50.439 08:21:02 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:50.439 08:21:02 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:50.439 08:21:02 accel.accel_compare -- accel/accel.sh@20 -- # val=software 00:07:50.439 08:21:02 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:50.439 08:21:02 accel.accel_compare -- accel/accel.sh@22 -- # accel_module=software 00:07:50.439 08:21:02 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:50.439 
08:21:02 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:50.439 08:21:02 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:07:50.439 08:21:02 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:50.439 08:21:02 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:50.439 08:21:02 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:50.439 08:21:02 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:07:50.439 08:21:02 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:50.439 08:21:02 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:50.439 08:21:02 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:50.439 08:21:02 accel.accel_compare -- accel/accel.sh@20 -- # val=1 00:07:50.439 08:21:02 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:50.439 08:21:02 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:50.439 08:21:02 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:50.439 08:21:02 accel.accel_compare -- accel/accel.sh@20 -- # val='1 seconds' 00:07:50.439 08:21:02 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:50.439 08:21:02 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:50.439 08:21:02 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:50.439 08:21:02 accel.accel_compare -- accel/accel.sh@20 -- # val=Yes 00:07:50.439 08:21:02 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:50.439 08:21:02 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:50.439 08:21:02 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:50.439 08:21:02 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:50.439 08:21:02 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:50.439 08:21:02 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:50.439 08:21:02 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:50.439 08:21:02 
accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:50.439 08:21:02 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:50.439 08:21:02 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:50.439 08:21:02 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:52.342 08:21:04 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:52.342 08:21:04 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:52.342 08:21:04 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:52.342 08:21:04 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:52.342 08:21:04 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:52.342 08:21:04 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:52.342 08:21:04 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:52.342 08:21:04 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:52.342 08:21:04 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:52.342 08:21:04 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:52.342 08:21:04 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:52.343 08:21:04 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:52.343 08:21:04 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:52.343 08:21:04 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:52.343 08:21:04 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:52.343 08:21:04 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:52.343 08:21:04 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:52.343 08:21:04 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:52.343 08:21:04 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:52.343 08:21:04 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:52.343 08:21:04 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:52.343 08:21:04 accel.accel_compare -- accel/accel.sh@21 
-- # case "$var" in 00:07:52.343 08:21:04 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:52.343 08:21:04 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:52.343 08:21:04 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:52.343 08:21:04 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n compare ]] 00:07:52.343 08:21:04 accel.accel_compare -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:52.343 00:07:52.343 real 0m2.610s 00:07:52.343 user 0m0.011s 00:07:52.343 sys 0m0.001s 00:07:52.343 08:21:04 accel.accel_compare -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:52.343 08:21:04 accel.accel_compare -- common/autotest_common.sh@10 -- # set +x 00:07:52.343 ************************************ 00:07:52.343 END TEST accel_compare 00:07:52.343 ************************************ 00:07:52.343 08:21:04 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:52.343 08:21:04 accel -- accel/accel.sh@109 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:07:52.343 08:21:04 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:52.343 08:21:04 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:52.343 08:21:04 accel -- common/autotest_common.sh@10 -- # set +x 00:07:52.343 ************************************ 00:07:52.343 START TEST accel_xor 00:07:52.343 ************************************ 00:07:52.343 08:21:04 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y 00:07:52.343 08:21:04 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:07:52.343 08:21:04 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:07:52.343 08:21:04 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:52.343 08:21:04 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:52.343 08:21:04 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:07:52.343 08:21:04 accel.accel_xor -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:07:52.343 08:21:04 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:07:52.343 08:21:04 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:52.343 08:21:04 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:52.343 08:21:04 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:52.343 08:21:04 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:52.343 08:21:04 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:52.343 08:21:04 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:07:52.343 08:21:04 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:07:52.602 [2024-07-23 08:21:04.874269] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:07:52.602 [2024-07-23 08:21:04.874345] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1353919 ] 00:07:52.602 [2024-07-23 08:21:05.000421] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:52.881 [2024-07-23 08:21:05.216983] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:53.140 08:21:05 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:53.140 08:21:05 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.140 08:21:05 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.140 08:21:05 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.140 08:21:05 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:53.140 08:21:05 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.141 08:21:05 accel.accel_xor -- 
accel/accel.sh@20 -- # val=0x1 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@20 -- # val=2 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 
00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.141 08:21:05 
accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.141 08:21:05 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 
00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:55.047 00:07:55.047 real 0m2.621s 00:07:55.047 user 0m2.410s 00:07:55.047 sys 0m0.192s 00:07:55.047 08:21:07 accel.accel_xor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:55.047 08:21:07 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:07:55.047 ************************************ 00:07:55.047 END TEST accel_xor 00:07:55.047 ************************************ 00:07:55.047 08:21:07 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:55.047 08:21:07 accel -- accel/accel.sh@110 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:07:55.047 08:21:07 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:55.047 08:21:07 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:55.047 08:21:07 accel -- common/autotest_common.sh@10 -- # set +x 00:07:55.047 ************************************ 00:07:55.047 START TEST accel_xor 00:07:55.047 ************************************ 00:07:55.047 08:21:07 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y -x 3 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 
00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:07:55.047 08:21:07 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:07:55.047 [2024-07-23 08:21:07.559732] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:07:55.047 [2024-07-23 08:21:07.559804] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1354437 ] 00:07:55.307 [2024-07-23 08:21:07.686664] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:55.566 [2024-07-23 08:21:07.933376] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var 
val 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@20 -- # val=3 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.826 
08:21:08 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:55.826 08:21:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.827 08:21:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.827 08:21:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.827 08:21:08 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:07:55.827 08:21:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.827 08:21:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.827 08:21:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.827 08:21:08 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:07:55.827 08:21:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.827 08:21:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.827 08:21:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.827 08:21:08 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:07:55.827 08:21:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.827 08:21:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.827 08:21:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.827 08:21:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.827 08:21:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.827 08:21:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.827 08:21:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.827 08:21:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.827 08:21:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.827 08:21:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.827 08:21:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.732 08:21:10 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:57.732 08:21:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.732 08:21:10 accel.accel_xor -- accel/accel.sh@19 
-- # IFS=: 00:07:57.732 08:21:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.732 08:21:10 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:57.732 08:21:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.732 08:21:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.732 08:21:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.732 08:21:10 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:57.732 08:21:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.732 08:21:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.732 08:21:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.732 08:21:10 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:57.732 08:21:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.732 08:21:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.732 08:21:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.732 08:21:10 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:57.732 08:21:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.732 08:21:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.732 08:21:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.732 08:21:10 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:57.732 08:21:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.732 08:21:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.732 08:21:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.732 08:21:10 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:57.732 08:21:10 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:07:57.732 08:21:10 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:57.732 00:07:57.732 real 0m2.672s 00:07:57.732 user 0m2.426s 00:07:57.732 sys 0m0.211s 00:07:57.732 08:21:10 accel.accel_xor -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:07:57.732 08:21:10 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:07:57.732 ************************************ 00:07:57.732 END TEST accel_xor 00:07:57.732 ************************************ 00:07:57.732 08:21:10 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:57.732 08:21:10 accel -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:07:57.732 08:21:10 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:07:57.732 08:21:10 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:57.732 08:21:10 accel -- common/autotest_common.sh@10 -- # set +x 00:07:57.732 ************************************ 00:07:57.732 START TEST accel_dif_verify 00:07:57.732 ************************************ 00:07:57.732 08:21:10 accel.accel_dif_verify -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_verify 00:07:57.732 08:21:10 accel.accel_dif_verify -- accel/accel.sh@16 -- # local accel_opc 00:07:57.732 08:21:10 accel.accel_dif_verify -- accel/accel.sh@17 -- # local accel_module 00:07:57.732 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:57.732 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:57.732 08:21:10 accel.accel_dif_verify -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:07:57.732 08:21:10 accel.accel_dif_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:07:57.732 08:21:10 accel.accel_dif_verify -- accel/accel.sh@12 -- # build_accel_config 00:07:57.732 08:21:10 accel.accel_dif_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:57.732 08:21:10 accel.accel_dif_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:57.732 08:21:10 accel.accel_dif_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:57.732 08:21:10 accel.accel_dif_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 
00:07:57.732 08:21:10 accel.accel_dif_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:57.732 08:21:10 accel.accel_dif_verify -- accel/accel.sh@40 -- # local IFS=, 00:07:57.732 08:21:10 accel.accel_dif_verify -- accel/accel.sh@41 -- # jq -r . 00:07:57.991 [2024-07-23 08:21:10.288023] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:07:57.991 [2024-07-23 08:21:10.288110] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1354960 ] 00:07:57.991 [2024-07-23 08:21:10.411802] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:58.249 [2024-07-23 08:21:10.636655] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:58.507 08:21:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:58.507 08:21:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:58.507 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:58.507 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:58.507 08:21:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:58.507 08:21:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:58.507 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:58.507 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:58.507 08:21:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=0x1 00:07:58.507 08:21:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:58.507 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:58.507 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:58.507 08:21:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:58.507 08:21:10 accel.accel_dif_verify -- accel/accel.sh@21 
-- # case "$var" in 00:07:58.507 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:58.507 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:58.507 08:21:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:58.507 08:21:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=dif_verify 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@23 -- # accel_opc=dif_verify 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='512 bytes' 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:58.508 08:21:10 accel.accel_dif_verify -- 
accel/accel.sh@20 -- # val='8 bytes' 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=software 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@22 -- # accel_module=software 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=1 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:58.508 08:21:10 accel.accel_dif_verify -- 
accel/accel.sh@19 -- # read -r var val 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='1 seconds' 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=No 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:58.508 08:21:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:00.413 08:21:12 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:00.413 08:21:12 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:00.413 08:21:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:00.413 08:21:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:00.413 08:21:12 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:00.413 08:21:12 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:00.413 08:21:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:00.413 08:21:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r 
var val 00:08:00.413 08:21:12 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:00.413 08:21:12 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:00.413 08:21:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:00.413 08:21:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:00.413 08:21:12 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:00.413 08:21:12 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:00.413 08:21:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:00.413 08:21:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:00.413 08:21:12 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:00.413 08:21:12 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:00.413 08:21:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:00.413 08:21:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:00.413 08:21:12 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:00.413 08:21:12 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:00.413 08:21:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:00.413 08:21:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:00.413 08:21:12 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:00.413 08:21:12 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n dif_verify ]] 00:08:00.413 08:21:12 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:00.413 00:08:00.413 real 0m2.631s 00:08:00.413 user 0m0.012s 00:08:00.413 sys 0m0.001s 00:08:00.413 08:21:12 accel.accel_dif_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:00.413 08:21:12 accel.accel_dif_verify -- common/autotest_common.sh@10 -- # set +x 00:08:00.413 ************************************ 00:08:00.413 END TEST accel_dif_verify 00:08:00.413 
************************************ 00:08:00.413 08:21:12 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:00.413 08:21:12 accel -- accel/accel.sh@112 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:08:00.413 08:21:12 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:08:00.413 08:21:12 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:00.413 08:21:12 accel -- common/autotest_common.sh@10 -- # set +x 00:08:00.413 ************************************ 00:08:00.413 START TEST accel_dif_generate 00:08:00.413 ************************************ 00:08:00.671 08:21:12 accel.accel_dif_generate -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate 00:08:00.671 08:21:12 accel.accel_dif_generate -- accel/accel.sh@16 -- # local accel_opc 00:08:00.671 08:21:12 accel.accel_dif_generate -- accel/accel.sh@17 -- # local accel_module 00:08:00.671 08:21:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:00.671 08:21:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:00.671 08:21:12 accel.accel_dif_generate -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:08:00.671 08:21:12 accel.accel_dif_generate -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:08:00.672 08:21:12 accel.accel_dif_generate -- accel/accel.sh@12 -- # build_accel_config 00:08:00.672 08:21:12 accel.accel_dif_generate -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:00.672 08:21:12 accel.accel_dif_generate -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:00.672 08:21:12 accel.accel_dif_generate -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:00.672 08:21:12 accel.accel_dif_generate -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:00.672 08:21:12 accel.accel_dif_generate -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:00.672 08:21:12 accel.accel_dif_generate -- accel/accel.sh@40 -- # local IFS=, 00:08:00.672 08:21:12 
accel.accel_dif_generate -- accel/accel.sh@41 -- # jq -r . 00:08:00.672 [2024-07-23 08:21:12.976974] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:08:00.672 [2024-07-23 08:21:12.977047] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1355482 ] 00:08:00.672 [2024-07-23 08:21:13.107895] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:00.930 [2024-07-23 08:21:13.342037] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=0x1 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:01.189 08:21:13 accel.accel_dif_generate -- 
accel/accel.sh@19 -- # read -r var val 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=dif_generate 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@23 -- # accel_opc=dif_generate 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='512 bytes' 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='8 bytes' 00:08:01.189 08:21:13 accel.accel_dif_generate -- 
accel/accel.sh@21 -- # case "$var" in 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=software 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@22 -- # accel_module=software 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=1 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:01.189 
08:21:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='1 seconds' 00:08:01.189 08:21:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:01.190 08:21:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:01.190 08:21:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:01.190 08:21:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=No 00:08:01.190 08:21:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:01.190 08:21:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:01.190 08:21:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:01.190 08:21:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:01.190 08:21:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:01.190 08:21:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:01.190 08:21:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:01.190 08:21:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:01.190 08:21:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:01.190 08:21:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:01.190 08:21:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:03.091 08:21:15 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:03.091 08:21:15 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:03.091 08:21:15 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:03.091 08:21:15 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:03.091 08:21:15 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:03.091 08:21:15 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:03.091 08:21:15 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:03.091 08:21:15 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var 
val 00:08:03.091 08:21:15 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:03.091 08:21:15 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:03.091 08:21:15 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:03.091 08:21:15 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:03.091 08:21:15 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:03.091 08:21:15 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:03.092 08:21:15 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:03.092 08:21:15 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:03.092 08:21:15 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:03.092 08:21:15 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:03.092 08:21:15 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:03.092 08:21:15 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:03.092 08:21:15 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:03.092 08:21:15 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:03.092 08:21:15 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:03.092 08:21:15 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:03.092 08:21:15 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:03.092 08:21:15 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n dif_generate ]] 00:08:03.092 08:21:15 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:03.092 00:08:03.092 real 0m2.626s 00:08:03.092 user 0m2.425s 00:08:03.092 sys 0m0.192s 00:08:03.092 08:21:15 accel.accel_dif_generate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:03.092 08:21:15 accel.accel_dif_generate -- common/autotest_common.sh@10 -- # set +x 00:08:03.092 ************************************ 00:08:03.092 END TEST 
accel_dif_generate 00:08:03.092 ************************************ 00:08:03.092 08:21:15 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:03.092 08:21:15 accel -- accel/accel.sh@113 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:08:03.092 08:21:15 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:08:03.092 08:21:15 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:03.092 08:21:15 accel -- common/autotest_common.sh@10 -- # set +x 00:08:03.092 ************************************ 00:08:03.092 START TEST accel_dif_generate_copy 00:08:03.092 ************************************ 00:08:03.092 08:21:15 accel.accel_dif_generate_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate_copy 00:08:03.092 08:21:15 accel.accel_dif_generate_copy -- accel/accel.sh@16 -- # local accel_opc 00:08:03.092 08:21:15 accel.accel_dif_generate_copy -- accel/accel.sh@17 -- # local accel_module 00:08:03.092 08:21:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:03.092 08:21:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:03.092 08:21:15 accel.accel_dif_generate_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:08:03.092 08:21:15 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:08:03.092 08:21:15 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # build_accel_config 00:08:03.092 08:21:15 accel.accel_dif_generate_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:03.092 08:21:15 accel.accel_dif_generate_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:03.092 08:21:15 accel.accel_dif_generate_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:03.092 08:21:15 accel.accel_dif_generate_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:03.351 08:21:15 accel.accel_dif_generate_copy -- 
accel/accel.sh@36 -- # [[ -n '' ]] 00:08:03.351 08:21:15 accel.accel_dif_generate_copy -- accel/accel.sh@40 -- # local IFS=, 00:08:03.351 08:21:15 accel.accel_dif_generate_copy -- accel/accel.sh@41 -- # jq -r . 00:08:03.351 [2024-07-23 08:21:15.655862] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:08:03.351 [2024-07-23 08:21:15.655936] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1356004 ] 00:08:03.351 [2024-07-23 08:21:15.787811] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:03.610 [2024-07-23 08:21:16.017066] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:03.869 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:03.869 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:03.869 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:03.869 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:03.869 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=0x1 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 
00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=dif_generate_copy 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@23 -- # accel_opc=dif_generate_copy 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case 
"$var" in 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=software 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@22 -- # accel_module=software 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=1 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 
00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=No 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:03.870 08:21:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.404 08:21:18 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:06.404 08:21:18 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.404 08:21:18 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.404 08:21:18 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.404 08:21:18 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:06.404 08:21:18 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.404 08:21:18 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.404 08:21:18 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.404 08:21:18 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:06.404 08:21:18 
accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.404 08:21:18 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.404 08:21:18 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.404 08:21:18 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:06.404 08:21:18 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.404 08:21:18 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.404 08:21:18 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.404 08:21:18 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:06.404 08:21:18 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.404 08:21:18 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.404 08:21:18 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.404 08:21:18 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:06.404 08:21:18 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.404 08:21:18 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.404 08:21:18 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.404 08:21:18 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:06.404 08:21:18 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]] 00:08:06.404 08:21:18 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:06.404 00:08:06.404 real 0m2.711s 00:08:06.404 user 0m2.446s 00:08:06.404 sys 0m0.208s 00:08:06.404 08:21:18 accel.accel_dif_generate_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:06.404 08:21:18 accel.accel_dif_generate_copy -- common/autotest_common.sh@10 -- # set +x 00:08:06.404 ************************************ 00:08:06.404 END TEST 
accel_dif_generate_copy 00:08:06.404 ************************************ 00:08:06.404 08:21:18 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:06.404 08:21:18 accel -- accel/accel.sh@115 -- # [[ y == y ]] 00:08:06.404 08:21:18 accel -- accel/accel.sh@116 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:06.405 08:21:18 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:08:06.405 08:21:18 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:06.405 08:21:18 accel -- common/autotest_common.sh@10 -- # set +x 00:08:06.405 ************************************ 00:08:06.405 START TEST accel_comp 00:08:06.405 ************************************ 00:08:06.405 08:21:18 accel.accel_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:06.405 08:21:18 accel.accel_comp -- accel/accel.sh@16 -- # local accel_opc 00:08:06.405 08:21:18 accel.accel_comp -- accel/accel.sh@17 -- # local accel_module 00:08:06.405 08:21:18 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:06.405 08:21:18 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:06.405 08:21:18 accel.accel_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:06.405 08:21:18 accel.accel_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:06.405 08:21:18 accel.accel_comp -- accel/accel.sh@12 -- # build_accel_config 00:08:06.405 08:21:18 accel.accel_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:06.405 08:21:18 accel.accel_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:06.405 08:21:18 accel.accel_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:06.405 08:21:18 
accel.accel_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:06.405 08:21:18 accel.accel_comp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:06.405 08:21:18 accel.accel_comp -- accel/accel.sh@40 -- # local IFS=, 00:08:06.405 08:21:18 accel.accel_comp -- accel/accel.sh@41 -- # jq -r . 00:08:06.405 [2024-07-23 08:21:18.422875] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:08:06.405 [2024-07-23 08:21:18.422960] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1356535 ] 00:08:06.405 [2024-07-23 08:21:18.547600] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:06.405 [2024-07-23 08:21:18.776025] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@20 -- # val=0x1 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:06.664 08:21:19 
accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@20 -- # val=compress 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@20 -- # val=software 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@22 -- # accel_module=software 
00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@20 -- # val=1 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@20 -- # val=No 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:06.664 08:21:19 
accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:06.664 08:21:19 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:06.665 08:21:19 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:06.665 08:21:19 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:06.665 08:21:19 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:08.567 08:21:21 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:08.567 08:21:21 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.567 08:21:21 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:08.567 08:21:21 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:08.567 08:21:21 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:08.567 08:21:21 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.567 08:21:21 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:08.567 08:21:21 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:08.567 08:21:21 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:08.567 08:21:21 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.567 08:21:21 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:08.567 08:21:21 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:08.567 08:21:21 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:08.567 08:21:21 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.567 08:21:21 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:08.567 08:21:21 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:08.567 08:21:21 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:08.567 08:21:21 accel.accel_comp -- 
accel/accel.sh@21 -- # case "$var" in 00:08:08.568 08:21:21 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:08.568 08:21:21 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:08.568 08:21:21 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:08.568 08:21:21 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:08.568 08:21:21 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:08.568 08:21:21 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:08.568 08:21:21 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:08.568 08:21:21 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:08:08.568 08:21:21 accel.accel_comp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:08.568 00:08:08.568 real 0m2.668s 00:08:08.568 user 0m2.447s 00:08:08.568 sys 0m0.207s 00:08:08.568 08:21:21 accel.accel_comp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:08.568 08:21:21 accel.accel_comp -- common/autotest_common.sh@10 -- # set +x 00:08:08.568 ************************************ 00:08:08.568 END TEST accel_comp 00:08:08.568 ************************************ 00:08:08.568 08:21:21 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:08.568 08:21:21 accel -- accel/accel.sh@117 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:08.568 08:21:21 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:08:08.568 08:21:21 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:08.568 08:21:21 accel -- common/autotest_common.sh@10 -- # set +x 00:08:08.827 ************************************ 00:08:08.827 START TEST accel_decomp 00:08:08.827 ************************************ 00:08:08.827 08:21:21 accel.accel_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:08.827 08:21:21 
accel.accel_decomp -- accel/accel.sh@16 -- # local accel_opc 00:08:08.827 08:21:21 accel.accel_decomp -- accel/accel.sh@17 -- # local accel_module 00:08:08.827 08:21:21 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:08.827 08:21:21 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:08.827 08:21:21 accel.accel_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:08.827 08:21:21 accel.accel_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:08.827 08:21:21 accel.accel_decomp -- accel/accel.sh@12 -- # build_accel_config 00:08:08.827 08:21:21 accel.accel_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:08.827 08:21:21 accel.accel_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:08.827 08:21:21 accel.accel_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:08.827 08:21:21 accel.accel_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:08.827 08:21:21 accel.accel_decomp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:08.827 08:21:21 accel.accel_decomp -- accel/accel.sh@40 -- # local IFS=, 00:08:08.827 08:21:21 accel.accel_decomp -- accel/accel.sh@41 -- # jq -r . 00:08:08.827 [2024-07-23 08:21:21.147515] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:08:08.827 [2024-07-23 08:21:21.147604] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1357055 ] 00:08:08.827 [2024-07-23 08:21:21.285840] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:09.086 [2024-07-23 08:21:21.515979] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@20 -- # val=0x1 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:09.345 
08:21:21 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@20 -- # val=decompress 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@20 -- # val=software 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@22 -- # accel_module=software 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:09.345 08:21:21 
accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@20 -- # val=1 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@20 -- # val=Yes 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:09.345 08:21:21 accel.accel_decomp -- 
accel/accel.sh@19 -- # IFS=: 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:09.345 08:21:21 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:11.251 08:21:23 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:11.251 08:21:23 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:11.251 08:21:23 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:11.251 08:21:23 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:11.251 08:21:23 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:11.251 08:21:23 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:11.251 08:21:23 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:11.251 08:21:23 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:11.251 08:21:23 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:11.251 08:21:23 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:11.251 08:21:23 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:11.251 08:21:23 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:11.251 08:21:23 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:11.251 08:21:23 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:11.251 08:21:23 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:11.251 08:21:23 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:11.251 08:21:23 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:11.251 08:21:23 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:11.251 08:21:23 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:11.251 08:21:23 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:11.251 08:21:23 
accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:11.251 08:21:23 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:11.251 08:21:23 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:11.251 08:21:23 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:11.509 08:21:23 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:11.509 08:21:23 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:11.509 08:21:23 accel.accel_decomp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:11.509 00:08:11.509 real 0m2.670s 00:08:11.509 user 0m2.442s 00:08:11.509 sys 0m0.217s 00:08:11.509 08:21:23 accel.accel_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:11.509 08:21:23 accel.accel_decomp -- common/autotest_common.sh@10 -- # set +x 00:08:11.509 ************************************ 00:08:11.509 END TEST accel_decomp 00:08:11.509 ************************************ 00:08:11.509 08:21:23 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:11.509 08:21:23 accel -- accel/accel.sh@118 -- # run_test accel_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:11.510 08:21:23 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:11.510 08:21:23 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:11.510 08:21:23 accel -- common/autotest_common.sh@10 -- # set +x 00:08:11.510 ************************************ 00:08:11.510 START TEST accel_decomp_full 00:08:11.510 ************************************ 00:08:11.510 08:21:23 accel.accel_decomp_full -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:11.510 08:21:23 accel.accel_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:08:11.510 08:21:23 accel.accel_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:08:11.510 
08:21:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:11.510 08:21:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:11.510 08:21:23 accel.accel_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:11.510 08:21:23 accel.accel_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:11.510 08:21:23 accel.accel_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:08:11.510 08:21:23 accel.accel_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:11.510 08:21:23 accel.accel_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:11.510 08:21:23 accel.accel_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:11.510 08:21:23 accel.accel_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:11.510 08:21:23 accel.accel_decomp_full -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:11.510 08:21:23 accel.accel_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:08:11.510 08:21:23 accel.accel_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:08:11.510 [2024-07-23 08:21:23.884778] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:08:11.510 [2024-07-23 08:21:23.884865] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1357596 ] 00:08:11.510 [2024-07-23 08:21:24.010007] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:11.767 [2024-07-23 08:21:24.233754] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 
00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=software 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@22 -- # accel_module=software 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:12.028 08:21:24 accel.accel_decomp_full -- 
accel/accel.sh@19 -- # read -r var val 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=1 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:12.028 08:21:24 accel.accel_decomp_full -- 
accel/accel.sh@19 -- # IFS=: 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:12.028 08:21:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:14.573 08:21:26 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:14.574 08:21:26 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:14.574 08:21:26 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:14.574 08:21:26 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:14.574 08:21:26 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:14.574 08:21:26 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:14.574 08:21:26 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:14.574 08:21:26 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:14.574 08:21:26 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:14.574 08:21:26 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:14.574 08:21:26 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:14.574 08:21:26 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:14.574 08:21:26 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:14.574 08:21:26 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:14.574 08:21:26 accel.accel_decomp_full -- accel/accel.sh@19 
-- # IFS=: 00:08:14.574 08:21:26 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:14.574 08:21:26 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:14.574 08:21:26 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:14.574 08:21:26 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:14.574 08:21:26 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:14.574 08:21:26 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:14.574 08:21:26 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:14.574 08:21:26 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:14.574 08:21:26 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:14.574 08:21:26 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:14.574 08:21:26 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:14.574 08:21:26 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:14.574 00:08:14.574 real 0m2.645s 00:08:14.574 user 0m0.011s 00:08:14.574 sys 0m0.001s 00:08:14.574 08:21:26 accel.accel_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:14.574 08:21:26 accel.accel_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:08:14.574 ************************************ 00:08:14.574 END TEST accel_decomp_full 00:08:14.574 ************************************ 00:08:14.574 08:21:26 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:14.574 08:21:26 accel -- accel/accel.sh@119 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:14.574 08:21:26 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:14.574 08:21:26 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:14.574 08:21:26 accel -- common/autotest_common.sh@10 -- # set +x 00:08:14.574 
************************************ 00:08:14.574 START TEST accel_decomp_mcore 00:08:14.574 ************************************ 00:08:14.574 08:21:26 accel.accel_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:14.574 08:21:26 accel.accel_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:14.574 08:21:26 accel.accel_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:14.574 08:21:26 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:14.574 08:21:26 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.574 08:21:26 accel.accel_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:14.574 08:21:26 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:14.574 08:21:26 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:14.574 08:21:26 accel.accel_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:14.574 08:21:26 accel.accel_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:14.574 08:21:26 accel.accel_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:14.574 08:21:26 accel.accel_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:14.574 08:21:26 accel.accel_decomp_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:14.574 08:21:26 accel.accel_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:14.574 08:21:26 accel.accel_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:14.574 [2024-07-23 08:21:26.594874] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:08:14.574 [2024-07-23 08:21:26.594967] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1358157 ] 00:08:14.574 [2024-07-23 08:21:26.726857] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:14.574 [2024-07-23 08:21:26.955549] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:14.574 [2024-07-23 08:21:26.955672] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:14.574 [2024-07-23 08:21:26.955691] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:14.574 [2024-07-23 08:21:26.955702] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:14.833 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:14.833 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:14.833 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:14.833 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.833 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:14.833 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.834 08:21:27 
accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=software 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@22 -- # accel_module=software 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:14.834 08:21:27 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:16.740 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:16.740 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:16.740 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:16.740 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:16.740 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:16.740 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:16.740 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:16.740 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:16.740 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:16.740 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case 
"$var" in 00:08:16.740 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:16.740 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:16.740 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:16.740 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:16.740 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:16.740 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:16.740 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:16.740 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:16.740 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:16.740 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:16.740 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:16.740 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:16.740 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:16.740 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:16.740 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:16.740 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:16.740 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:16.740 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:16.740 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:16.740 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:16.740 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:16.740 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:16.740 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:16.740 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # 
case "$var" in 00:08:16.740 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:16.740 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:16.999 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:16.999 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:16.999 08:21:29 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:16.999 00:08:16.999 real 0m2.720s 00:08:16.999 user 0m8.109s 00:08:16.999 sys 0m0.226s 00:08:16.999 08:21:29 accel.accel_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:16.999 08:21:29 accel.accel_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:16.999 ************************************ 00:08:16.999 END TEST accel_decomp_mcore 00:08:16.999 ************************************ 00:08:16.999 08:21:29 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:16.999 08:21:29 accel -- accel/accel.sh@120 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:16.999 08:21:29 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:16.999 08:21:29 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:16.999 08:21:29 accel -- common/autotest_common.sh@10 -- # set +x 00:08:16.999 ************************************ 00:08:16.999 START TEST accel_decomp_full_mcore 00:08:16.999 ************************************ 00:08:16.999 08:21:29 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:16.999 08:21:29 accel.accel_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:16.999 08:21:29 accel.accel_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:16.999 08:21:29 
accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:16.999 08:21:29 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:16.999 08:21:29 accel.accel_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:16.999 08:21:29 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:16.999 08:21:29 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:16.999 08:21:29 accel.accel_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:17.000 08:21:29 accel.accel_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:17.000 08:21:29 accel.accel_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:17.000 08:21:29 accel.accel_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:17.000 08:21:29 accel.accel_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:17.000 08:21:29 accel.accel_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:17.000 08:21:29 accel.accel_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:17.000 [2024-07-23 08:21:29.384992] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:08:17.000 [2024-07-23 08:21:29.385079] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1358720 ] 00:08:17.000 [2024-07-23 08:21:29.513961] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:17.259 [2024-07-23 08:21:29.731996] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:17.259 [2024-07-23 08:21:29.732072] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:17.259 [2024-07-23 08:21:29.732133] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:17.259 [2024-07-23 08:21:29.732155] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:17.518 08:21:29 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:17.518 08:21:29 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:17.518 08:21:29 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:17.518 08:21:29 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:17.518 08:21:29 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:17.518 08:21:29 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:17.518 08:21:29 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:17.518 08:21:29 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:17.518 08:21:30 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:17.518 08:21:30 
accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=software 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=software 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- 
# IFS=: 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:17.518 08:21:30 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.051 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:20.051 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:20.051 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.051 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.051 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:20.051 08:21:32 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:20.051 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.051 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.051 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:20.051 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:20.051 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.052 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.052 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:20.052 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:20.052 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.052 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.052 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:20.052 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:20.052 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.052 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.052 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:20.052 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:20.052 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.052 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.052 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:20.052 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:20.052 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.052 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@19 
-- # read -r var val 00:08:20.052 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:20.052 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:20.052 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.052 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.052 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:20.052 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:20.052 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.052 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.052 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:20.052 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:20.052 08:21:32 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:20.052 00:08:20.052 real 0m2.730s 00:08:20.052 user 0m8.173s 00:08:20.052 sys 0m0.240s 00:08:20.052 08:21:32 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:20.052 08:21:32 accel.accel_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:20.052 ************************************ 00:08:20.052 END TEST accel_decomp_full_mcore 00:08:20.052 ************************************ 00:08:20.052 08:21:32 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:20.052 08:21:32 accel -- accel/accel.sh@121 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:20.052 08:21:32 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:20.052 08:21:32 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:20.052 08:21:32 accel -- common/autotest_common.sh@10 -- # set +x 00:08:20.052 
************************************ 00:08:20.052 START TEST accel_decomp_mthread 00:08:20.052 ************************************ 00:08:20.052 08:21:32 accel.accel_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:20.052 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:20.052 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:20.052 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:20.052 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:20.052 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:20.052 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:20.052 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:20.052 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:20.052 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:20.052 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:20.052 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:20.052 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:20.052 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:20.052 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:20.052 [2024-07-23 08:21:32.178295] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:08:20.052 [2024-07-23 08:21:32.178379] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1359306 ] 00:08:20.052 [2024-07-23 08:21:32.299455] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:20.052 [2024-07-23 08:21:32.508673] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:20.311 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:20.311 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:20.311 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:20.311 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:20.311 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:20.311 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:20.311 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:20.311 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:20.311 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:20.311 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:20.311 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:20.311 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:20.311 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:20.311 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:20.311 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:20.311 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:20.311 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:20.311 08:21:32 
accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:20.311 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:20.311 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:20.311 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:20.311 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:20.311 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:20.311 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:20.311 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:20.311 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:20.311 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:20.311 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=software 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@22 -- # 
accel_module=software 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:20.312 
08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:20.312 08:21:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:22.853 08:21:34 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:22.853 08:21:34 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:22.853 08:21:34 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:22.853 08:21:34 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:22.853 08:21:34 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:22.853 08:21:34 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:22.853 08:21:34 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:22.853 08:21:34 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:22.853 08:21:34 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:22.853 08:21:34 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:22.853 08:21:34 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:22.853 08:21:34 accel.accel_decomp_mthread 
-- accel/accel.sh@19 -- # read -r var val 00:08:22.853 08:21:34 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:22.853 08:21:34 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:22.853 08:21:34 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:22.853 08:21:34 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:22.853 08:21:34 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:22.853 08:21:34 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:22.853 08:21:34 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:22.853 08:21:34 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:22.853 08:21:34 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:22.853 08:21:34 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:22.853 08:21:34 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:22.853 08:21:34 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:22.853 08:21:34 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:22.853 08:21:34 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:22.853 08:21:34 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:22.853 08:21:34 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:22.853 08:21:34 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:22.853 08:21:34 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:22.853 08:21:34 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:22.853 00:08:22.853 real 0m2.644s 00:08:22.853 user 0m2.432s 00:08:22.853 sys 0m0.204s 00:08:22.853 08:21:34 accel.accel_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:22.853 08:21:34 accel.accel_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 
00:08:22.853 ************************************ 00:08:22.853 END TEST accel_decomp_mthread 00:08:22.853 ************************************ 00:08:22.853 08:21:34 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:22.853 08:21:34 accel -- accel/accel.sh@122 -- # run_test accel_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:22.854 08:21:34 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:22.854 08:21:34 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:22.854 08:21:34 accel -- common/autotest_common.sh@10 -- # set +x 00:08:22.854 ************************************ 00:08:22.854 START TEST accel_decomp_full_mthread 00:08:22.854 ************************************ 00:08:22.854 08:21:34 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:22.854 08:21:34 accel.accel_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:22.854 08:21:34 accel.accel_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:22.854 08:21:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:22.854 08:21:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:22.854 08:21:34 accel.accel_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:22.854 08:21:34 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:22.854 08:21:34 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:22.854 08:21:34 accel.accel_decomp_full_mthread -- 
accel/accel.sh@31 -- # accel_json_cfg=() 00:08:22.854 08:21:34 accel.accel_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:22.854 08:21:34 accel.accel_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:22.854 08:21:34 accel.accel_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:22.854 08:21:34 accel.accel_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:22.854 08:21:34 accel.accel_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:22.854 08:21:34 accel.accel_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:22.854 [2024-07-23 08:21:34.889952] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:08:22.854 [2024-07-23 08:21:34.890036] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1359844 ] 00:08:22.854 [2024-07-23 08:21:35.010674] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:22.854 [2024-07-23 08:21:35.218867] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.113 08:21:35 
accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 
bytes' 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=software 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=software 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 
00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # 
IFS=: 00:08:23.113 08:21:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.015 08:21:37 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:25.015 08:21:37 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.015 08:21:37 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.015 08:21:37 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.015 08:21:37 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:25.015 08:21:37 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.015 08:21:37 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.015 08:21:37 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.015 08:21:37 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:25.015 08:21:37 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.015 08:21:37 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.015 08:21:37 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.015 08:21:37 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:25.015 08:21:37 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.015 08:21:37 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.015 08:21:37 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.015 08:21:37 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:25.015 08:21:37 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.015 08:21:37 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.015 08:21:37 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.015 08:21:37 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 
00:08:25.015 08:21:37 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.015 08:21:37 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.015 08:21:37 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.015 08:21:37 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:25.015 08:21:37 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.015 08:21:37 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.015 08:21:37 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.015 08:21:37 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:25.015 08:21:37 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:25.015 08:21:37 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:25.015 00:08:25.015 real 0m2.667s 00:08:25.015 user 0m2.464s 00:08:25.015 sys 0m0.197s 00:08:25.015 08:21:37 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:25.015 08:21:37 accel.accel_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:25.015 ************************************ 00:08:25.015 END TEST accel_decomp_full_mthread 00:08:25.015 ************************************ 00:08:25.274 08:21:37 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:25.274 08:21:37 accel -- accel/accel.sh@124 -- # [[ y == y ]] 00:08:25.274 08:21:37 accel -- accel/accel.sh@125 -- # COMPRESSDEV=1 00:08:25.274 08:21:37 accel -- accel/accel.sh@126 -- # get_expected_opcs 00:08:25.274 08:21:37 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:25.274 08:21:37 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=1360358 00:08:25.274 08:21:37 accel -- accel/accel.sh@63 -- # waitforlisten 1360358 00:08:25.274 08:21:37 accel -- common/autotest_common.sh@829 -- 
# '[' -z 1360358 ']' 00:08:25.274 08:21:37 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:25.274 08:21:37 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:08:25.274 08:21:37 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:25.274 08:21:37 accel -- accel/accel.sh@61 -- # build_accel_config 00:08:25.274 08:21:37 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:25.274 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:25.274 08:21:37 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:25.274 08:21:37 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:25.274 08:21:37 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:25.274 08:21:37 accel -- common/autotest_common.sh@10 -- # set +x 00:08:25.274 08:21:37 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:25.274 08:21:37 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:25.274 08:21:37 accel -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:25.274 08:21:37 accel -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:25.274 08:21:37 accel -- accel/accel.sh@40 -- # local IFS=, 00:08:25.274 08:21:37 accel -- accel/accel.sh@41 -- # jq -r . 00:08:25.274 [2024-07-23 08:21:37.628864] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:08:25.274 [2024-07-23 08:21:37.628954] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1360358 ] 00:08:25.274 [2024-07-23 08:21:37.753058] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:25.533 [2024-07-23 08:21:37.957047] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:26.469 [2024-07-23 08:21:38.908589] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:27.846 08:21:40 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:27.846 08:21:40 accel -- common/autotest_common.sh@862 -- # return 0 00:08:27.846 08:21:40 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:08:27.846 08:21:40 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:08:27.846 08:21:40 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:08:27.846 08:21:40 accel -- accel/accel.sh@68 -- # [[ -n 1 ]] 00:08:27.846 08:21:40 accel -- accel/accel.sh@68 -- # check_save_config compressdev_scan_accel_module 00:08:27.846 08:21:40 accel -- accel/accel.sh@56 -- # rpc_cmd save_config 00:08:27.846 08:21:40 accel -- accel/accel.sh@56 -- # jq -r '.subsystems[] | select(.subsystem=="accel").config[]' 00:08:27.846 08:21:40 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:27.846 08:21:40 accel -- accel/accel.sh@56 -- # grep compressdev_scan_accel_module 00:08:27.846 08:21:40 accel -- common/autotest_common.sh@10 -- # set +x 00:08:27.846 08:21:40 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:27.846 "method": "compressdev_scan_accel_module", 00:08:27.846 08:21:40 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:08:27.846 08:21:40 accel -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:08:27.846 08:21:40 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:08:27.846 08:21:40 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:27.846 08:21:40 accel -- common/autotest_common.sh@10 -- # set +x 00:08:27.846 08:21:40 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:27.846 08:21:40 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:27.846 08:21:40 accel -- accel/accel.sh@72 -- # IFS== 00:08:27.846 08:21:40 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:27.846 08:21:40 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:27.846 08:21:40 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:27.846 08:21:40 accel -- accel/accel.sh@72 -- # IFS== 00:08:27.846 08:21:40 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:27.846 08:21:40 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:27.846 08:21:40 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:27.846 08:21:40 accel -- accel/accel.sh@72 -- # IFS== 00:08:27.846 08:21:40 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:27.846 08:21:40 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:27.846 08:21:40 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:27.846 08:21:40 accel -- accel/accel.sh@72 -- # IFS== 00:08:27.846 08:21:40 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:27.846 08:21:40 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:27.846 08:21:40 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:27.846 08:21:40 accel -- accel/accel.sh@72 -- # IFS== 00:08:27.846 08:21:40 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:27.846 08:21:40 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:27.846 08:21:40 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:27.846 08:21:40 
accel -- accel/accel.sh@72 -- # IFS== 00:08:27.846 08:21:40 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:27.846 08:21:40 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:27.846 08:21:40 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:27.846 08:21:40 accel -- accel/accel.sh@72 -- # IFS== 00:08:27.846 08:21:40 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:27.846 08:21:40 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:08:27.846 08:21:40 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:27.846 08:21:40 accel -- accel/accel.sh@72 -- # IFS== 00:08:27.846 08:21:40 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:27.846 08:21:40 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:08:27.846 08:21:40 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:27.846 08:21:40 accel -- accel/accel.sh@72 -- # IFS== 00:08:27.846 08:21:40 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:27.846 08:21:40 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:27.846 08:21:40 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:27.846 08:21:40 accel -- accel/accel.sh@72 -- # IFS== 00:08:27.846 08:21:40 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:27.846 08:21:40 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:27.846 08:21:40 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:27.846 08:21:40 accel -- accel/accel.sh@72 -- # IFS== 00:08:27.846 08:21:40 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:27.846 08:21:40 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:27.846 08:21:40 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:27.846 08:21:40 accel -- accel/accel.sh@72 -- # IFS== 00:08:27.846 08:21:40 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:27.846 08:21:40 accel -- accel/accel.sh@73 
-- # expected_opcs["$opc"]=software 00:08:27.846 08:21:40 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:27.846 08:21:40 accel -- accel/accel.sh@72 -- # IFS== 00:08:27.846 08:21:40 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:27.846 08:21:40 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:27.846 08:21:40 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:27.846 08:21:40 accel -- accel/accel.sh@72 -- # IFS== 00:08:27.846 08:21:40 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:27.846 08:21:40 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:27.846 08:21:40 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:27.846 08:21:40 accel -- accel/accel.sh@72 -- # IFS== 00:08:27.846 08:21:40 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:27.846 08:21:40 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:27.846 08:21:40 accel -- accel/accel.sh@75 -- # killprocess 1360358 00:08:27.846 08:21:40 accel -- common/autotest_common.sh@948 -- # '[' -z 1360358 ']' 00:08:27.846 08:21:40 accel -- common/autotest_common.sh@952 -- # kill -0 1360358 00:08:27.846 08:21:40 accel -- common/autotest_common.sh@953 -- # uname 00:08:27.846 08:21:40 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:27.846 08:21:40 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1360358 00:08:27.846 08:21:40 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:27.846 08:21:40 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:27.846 08:21:40 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1360358' 00:08:27.846 killing process with pid 1360358 00:08:27.846 08:21:40 accel -- common/autotest_common.sh@967 -- # kill 1360358 00:08:27.846 08:21:40 accel -- common/autotest_common.sh@972 -- # wait 1360358 00:08:30.381 08:21:42 accel -- accel/accel.sh@76 -- # trap - 
ERR 00:08:30.381 08:21:42 accel -- accel/accel.sh@127 -- # run_test accel_cdev_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:30.381 08:21:42 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:08:30.381 08:21:42 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:30.381 08:21:42 accel -- common/autotest_common.sh@10 -- # set +x 00:08:30.381 ************************************ 00:08:30.381 START TEST accel_cdev_comp 00:08:30.381 ************************************ 00:08:30.381 08:21:42 accel.accel_cdev_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:30.381 08:21:42 accel.accel_cdev_comp -- accel/accel.sh@16 -- # local accel_opc 00:08:30.381 08:21:42 accel.accel_cdev_comp -- accel/accel.sh@17 -- # local accel_module 00:08:30.381 08:21:42 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:30.381 08:21:42 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:30.381 08:21:42 accel.accel_cdev_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:30.381 08:21:42 accel.accel_cdev_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:30.381 08:21:42 accel.accel_cdev_comp -- accel/accel.sh@12 -- # build_accel_config 00:08:30.381 08:21:42 accel.accel_cdev_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:30.381 08:21:42 accel.accel_cdev_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:30.381 08:21:42 accel.accel_cdev_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:30.381 08:21:42 accel.accel_cdev_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:30.381 08:21:42 accel.accel_cdev_comp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:30.381 
08:21:42 accel.accel_cdev_comp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:30.381 08:21:42 accel.accel_cdev_comp -- accel/accel.sh@40 -- # local IFS=, 00:08:30.381 08:21:42 accel.accel_cdev_comp -- accel/accel.sh@41 -- # jq -r . 00:08:30.381 [2024-07-23 08:21:42.503063] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:08:30.381 [2024-07-23 08:21:42.503145] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1361180 ] 00:08:30.381 [2024-07-23 08:21:42.628375] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:30.381 [2024-07-23 08:21:42.859522] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:31.317 [2024-07-23 08:21:43.825775] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:31.317 [2024-07-23 08:21:43.827871] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e0000147e0 PMD being used: compress_qat 00:08:31.317 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:31.317 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:31.317 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:31.317 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:31.317 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:31.317 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:31.317 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:31.317 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:31.317 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:31.317 [2024-07-23 08:21:43.833622] accel_dpdk_compressdev.c: 
690:_set_pmd: *NOTICE*: Channel 0x60e0000148c0 PMD being used: compress_qat 00:08:31.317 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:31.317 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:31.317 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:31.317 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=0x1 00:08:31.317 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:31.317 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:31.317 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:31.317 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:31.317 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:31.317 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:31.317 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=compress 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:31.576 08:21:43 
accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=1 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@21 -- 
# case "$var" in 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=No 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:31.576 08:21:43 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:32.952 08:21:45 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:32.952 08:21:45 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:32.952 08:21:45 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:32.952 08:21:45 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:32.952 08:21:45 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:32.952 08:21:45 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:32.952 08:21:45 
accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:32.952 08:21:45 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:32.952 08:21:45 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:32.952 08:21:45 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:32.952 08:21:45 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:32.952 08:21:45 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:32.952 08:21:45 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:32.952 08:21:45 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:32.952 08:21:45 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:32.952 08:21:45 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:32.952 08:21:45 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:32.952 08:21:45 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:32.952 08:21:45 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:32.952 08:21:45 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:32.952 08:21:45 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:32.952 08:21:45 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:32.952 08:21:45 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:32.952 08:21:45 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:32.952 08:21:45 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:32.952 08:21:45 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:08:32.952 08:21:45 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:32.952 00:08:32.952 real 0m3.001s 00:08:32.952 user 0m2.561s 00:08:32.952 sys 0m0.423s 00:08:32.952 08:21:45 accel.accel_cdev_comp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:32.952 08:21:45 accel.accel_cdev_comp -- 
common/autotest_common.sh@10 -- # set +x 00:08:32.952 ************************************ 00:08:32.952 END TEST accel_cdev_comp 00:08:32.952 ************************************ 00:08:33.210 08:21:45 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:33.210 08:21:45 accel -- accel/accel.sh@128 -- # run_test accel_cdev_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:33.210 08:21:45 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:08:33.210 08:21:45 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:33.210 08:21:45 accel -- common/autotest_common.sh@10 -- # set +x 00:08:33.210 ************************************ 00:08:33.210 START TEST accel_cdev_decomp 00:08:33.210 ************************************ 00:08:33.210 08:21:45 accel.accel_cdev_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:33.210 08:21:45 accel.accel_cdev_decomp -- accel/accel.sh@16 -- # local accel_opc 00:08:33.210 08:21:45 accel.accel_cdev_decomp -- accel/accel.sh@17 -- # local accel_module 00:08:33.210 08:21:45 accel.accel_cdev_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:33.210 08:21:45 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:33.210 08:21:45 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:33.210 08:21:45 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:33.210 08:21:45 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # build_accel_config 00:08:33.210 08:21:45 accel.accel_cdev_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:33.210 08:21:45 accel.accel_cdev_decomp -- 
accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:33.210 08:21:45 accel.accel_cdev_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:33.210 08:21:45 accel.accel_cdev_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:33.210 08:21:45 accel.accel_cdev_decomp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:33.210 08:21:45 accel.accel_cdev_decomp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:33.210 08:21:45 accel.accel_cdev_decomp -- accel/accel.sh@40 -- # local IFS=, 00:08:33.210 08:21:45 accel.accel_cdev_decomp -- accel/accel.sh@41 -- # jq -r . 00:08:33.210 [2024-07-23 08:21:45.556960] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:08:33.210 [2024-07-23 08:21:45.557045] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1361779 ] 00:08:33.210 [2024-07-23 08:21:45.685356] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:33.469 [2024-07-23 08:21:45.897964] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:34.402 [2024-07-23 08:21:46.787124] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:34.402 [2024-07-23 08:21:46.789261] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e0000147e0 PMD being used: compress_qat 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # 
case "$var" in 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:34.402 [2024-07-23 08:21:46.795253] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e0000148c0 PMD being used: compress_qat 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=0x1 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=decompress 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 
00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@20 
-- # val=32 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=1 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=Yes 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:34.402 08:21:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:36.301 08:21:48 accel.accel_cdev_decomp -- accel/accel.sh@20 -- 
# val= 00:08:36.301 08:21:48 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:36.301 08:21:48 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:36.301 08:21:48 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:36.301 08:21:48 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:36.301 08:21:48 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:36.301 08:21:48 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:36.301 08:21:48 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:36.301 08:21:48 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:36.301 08:21:48 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:36.301 08:21:48 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:36.301 08:21:48 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:36.301 08:21:48 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:36.301 08:21:48 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:36.301 08:21:48 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:36.301 08:21:48 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:36.301 08:21:48 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:36.301 08:21:48 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:36.301 08:21:48 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:36.301 08:21:48 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:36.301 08:21:48 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:36.301 08:21:48 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:36.301 08:21:48 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:36.301 08:21:48 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:36.301 08:21:48 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n 
dpdk_compressdev ]] 00:08:36.301 08:21:48 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:36.302 08:21:48 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:36.302 00:08:36.302 real 0m2.919s 00:08:36.302 user 0m2.509s 00:08:36.302 sys 0m0.411s 00:08:36.302 08:21:48 accel.accel_cdev_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:36.302 08:21:48 accel.accel_cdev_decomp -- common/autotest_common.sh@10 -- # set +x 00:08:36.302 ************************************ 00:08:36.302 END TEST accel_cdev_decomp 00:08:36.302 ************************************ 00:08:36.302 08:21:48 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:36.302 08:21:48 accel -- accel/accel.sh@129 -- # run_test accel_cdev_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:36.302 08:21:48 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:36.302 08:21:48 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:36.302 08:21:48 accel -- common/autotest_common.sh@10 -- # set +x 00:08:36.302 ************************************ 00:08:36.302 START TEST accel_cdev_decomp_full 00:08:36.302 ************************************ 00:08:36.302 08:21:48 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:36.302 08:21:48 accel.accel_cdev_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:08:36.302 08:21:48 accel.accel_cdev_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:08:36.302 08:21:48 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:36.302 08:21:48 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:36.302 08:21:48 accel.accel_cdev_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:36.302 08:21:48 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:36.302 08:21:48 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:08:36.302 08:21:48 accel.accel_cdev_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:36.302 08:21:48 accel.accel_cdev_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:36.302 08:21:48 accel.accel_cdev_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:36.302 08:21:48 accel.accel_cdev_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:36.302 08:21:48 accel.accel_cdev_decomp_full -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:36.302 08:21:48 accel.accel_cdev_decomp_full -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:36.302 08:21:48 accel.accel_cdev_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:08:36.302 08:21:48 accel.accel_cdev_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:08:36.302 [2024-07-23 08:21:48.558430] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:08:36.302 [2024-07-23 08:21:48.558502] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1362444 ] 00:08:36.302 [2024-07-23 08:21:48.682555] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:36.560 [2024-07-23 08:21:48.893253] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:37.494 [2024-07-23 08:21:49.818050] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:37.494 [2024-07-23 08:21:49.820112] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e0000147e0 PMD being used: compress_qat 00:08:37.494 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:37.494 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:37.494 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:37.494 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:37.494 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:37.494 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:37.494 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:37.495 [2024-07-23 08:21:49.825474] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e0000148c0 PMD being used: compress_qat 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:37.495 
08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- 
accel/accel.sh@21 -- # case "$var" in 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=1 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:37.495 08:21:49 
accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:37.495 08:21:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:39.393 08:21:51 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:39.393 08:21:51 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:39.393 08:21:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:39.393 08:21:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:39.393 08:21:51 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 
00:08:39.393 08:21:51 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:39.393 08:21:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:39.393 08:21:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:39.393 08:21:51 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:39.393 08:21:51 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:39.393 08:21:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:39.393 08:21:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:39.393 08:21:51 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:39.393 08:21:51 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:39.393 08:21:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:39.393 08:21:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:39.393 08:21:51 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:39.393 08:21:51 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:39.393 08:21:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:39.393 08:21:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:39.393 08:21:51 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:39.393 08:21:51 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:39.393 08:21:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:39.393 08:21:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:39.393 08:21:51 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:39.393 08:21:51 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:39.393 08:21:51 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ dpdk_compressdev == 
\d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:39.393 00:08:39.393 real 0m2.958s 00:08:39.393 user 0m2.521s 00:08:39.393 sys 0m0.419s 00:08:39.393 08:21:51 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:39.393 08:21:51 accel.accel_cdev_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:08:39.393 ************************************ 00:08:39.393 END TEST accel_cdev_decomp_full 00:08:39.393 ************************************ 00:08:39.393 08:21:51 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:39.393 08:21:51 accel -- accel/accel.sh@130 -- # run_test accel_cdev_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:39.393 08:21:51 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:39.393 08:21:51 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:39.393 08:21:51 accel -- common/autotest_common.sh@10 -- # set +x 00:08:39.393 ************************************ 00:08:39.393 START TEST accel_cdev_decomp_mcore 00:08:39.393 ************************************ 00:08:39.393 08:21:51 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:39.393 08:21:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:39.393 08:21:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:39.393 08:21:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:39.393 08:21:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:39.393 08:21:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:39.393 08:21:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:39.393 08:21:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:39.393 08:21:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:39.393 08:21:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:39.393 08:21:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:39.393 08:21:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:39.393 08:21:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:39.393 08:21:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:39.393 08:21:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:39.393 08:21:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:39.393 [2024-07-23 08:21:51.585126] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:08:39.393 [2024-07-23 08:21:51.585218] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1362979 ] 00:08:39.394 [2024-07-23 08:21:51.706282] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:39.652 [2024-07-23 08:21:51.923916] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:39.652 [2024-07-23 08:21:51.923993] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:39.652 [2024-07-23 08:21:51.924009] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:39.652 [2024-07-23 08:21:51.924020] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:40.587 [2024-07-23 08:21:52.864959] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:40.587 [2024-07-23 08:21:52.867090] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e00002b840 PMD being used: compress_qat 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:40.587 [2024-07-23 08:21:52.874789] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e000010100 PMD being used: compress_qat 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:40.587 [2024-07-23 08:21:52.876435] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e000017100 PMD being used: compress_qat 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 
00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:40.587 [2024-07-23 08:21:52.879680] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e000020160 PMD being used: compress_qat 00:08:40.587 [2024-07-23 08:21:52.879775] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e00002b920 PMD being used: compress_qat 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:40.587 
08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:40.587 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:40.588 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:40.588 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:40.588 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:40.588 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:40.588 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:40.588 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:40.588 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:40.588 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:40.588 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:40.588 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:40.588 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:40.588 08:21:52 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:40.588 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:40.588 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:40.588 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:40.588 08:21:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 
-- # case "$var" in 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:42.491 00:08:42.491 real 0m3.095s 
00:08:42.491 user 0m9.371s 00:08:42.491 sys 0m0.448s 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:42.491 08:21:54 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:42.491 ************************************ 00:08:42.491 END TEST accel_cdev_decomp_mcore 00:08:42.491 ************************************ 00:08:42.491 08:21:54 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:42.491 08:21:54 accel -- accel/accel.sh@131 -- # run_test accel_cdev_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:42.491 08:21:54 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:42.491 08:21:54 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:42.491 08:21:54 accel -- common/autotest_common.sh@10 -- # set +x 00:08:42.491 ************************************ 00:08:42.491 START TEST accel_cdev_decomp_full_mcore 00:08:42.491 ************************************ 00:08:42.491 08:21:54 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:42.491 08:21:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:42.491 08:21:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:42.491 08:21:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:42.491 08:21:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:42.491 08:21:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:42.491 08:21:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:42.491 08:21:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:42.491 08:21:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:42.491 08:21:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:42.491 08:21:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:42.491 08:21:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:42.491 08:21:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:42.491 08:21:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:42.491 08:21:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:42.491 08:21:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:42.491 [2024-07-23 08:21:54.745209] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:08:42.491 [2024-07-23 08:21:54.745281] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1363518 ] 00:08:42.491 [2024-07-23 08:21:54.867167] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:42.750 [2024-07-23 08:21:55.103668] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:42.750 [2024-07-23 08:21:55.103688] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:42.750 [2024-07-23 08:21:55.103708] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:42.750 [2024-07-23 08:21:55.103714] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:43.686 [2024-07-23 08:21:56.062876] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:43.686 [2024-07-23 08:21:56.065073] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e00002b840 PMD being used: compress_qat 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- 
accel/accel.sh@21 -- # case "$var" in 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:43.686 [2024-07-23 08:21:56.072034] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e000010100 PMD being used: compress_qat 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:43.686 [2024-07-23 08:21:56.073600] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e000017100 PMD being used: compress_qat 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- 
accel/accel.sh@19 -- # IFS=: 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:43.686 [2024-07-23 08:21:56.076530] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e000020160 PMD being used: compress_qat 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:43.686 [2024-07-23 08:21:56.076674] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e00002b920 PMD being used: compress_qat 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- 
accel/accel.sh@19 -- # IFS=: 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- 
# read -r var val 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:43.686 08:21:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:45.619 08:21:57 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.619 08:21:57 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:45.619 00:08:45.619 real 0m3.147s 00:08:45.619 user 0m9.500s 00:08:45.619 sys 0m0.441s 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:45.619 08:21:57 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:45.619 ************************************ 00:08:45.619 END TEST accel_cdev_decomp_full_mcore 00:08:45.619 ************************************ 00:08:45.619 08:21:57 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:45.619 08:21:57 accel -- accel/accel.sh@132 -- # run_test accel_cdev_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:45.619 08:21:57 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:45.619 08:21:57 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:45.619 08:21:57 accel -- common/autotest_common.sh@10 -- # set +x 00:08:45.619 ************************************ 00:08:45.619 START TEST accel_cdev_decomp_mthread 00:08:45.619 ************************************ 00:08:45.619 08:21:57 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:45.619 08:21:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:45.619 08:21:57 accel.accel_cdev_decomp_mthread -- 
accel/accel.sh@17 -- # local accel_module 00:08:45.619 08:21:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:45.619 08:21:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:45.619 08:21:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:45.619 08:21:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:45.619 08:21:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:45.619 08:21:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:45.619 08:21:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:45.619 08:21:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:45.619 08:21:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:45.619 08:21:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:45.619 08:21:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:45.619 08:21:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:45.619 08:21:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:45.619 [2024-07-23 08:21:57.961408] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:08:45.619 [2024-07-23 08:21:57.961481] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1364222 ] 00:08:45.619 [2024-07-23 08:21:58.084993] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:45.878 [2024-07-23 08:21:58.294826] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:46.814 [2024-07-23 08:21:59.197999] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:46.814 [2024-07-23 08:21:59.200102] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e0000147e0 PMD being used: compress_qat 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:46.814 08:21:59 
accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:46.814 [2024-07-23 08:21:59.207391] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e0000148c0 PMD being used: compress_qat 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:46.814 [2024-07-23 08:21:59.210373] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e0000149a0 PMD being used: compress_qat 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- 
accel/accel.sh@19 -- # IFS=: 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread 
-- accel/accel.sh@19 -- # IFS=: 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:08:46.814 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:46.815 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:46.815 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:46.815 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:46.815 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:46.815 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:46.815 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:46.815 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:46.815 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:46.815 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:46.815 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:46.815 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:46.815 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:46.815 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:46.815 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:46.815 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:46.815 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:46.815 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:46.815 08:21:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:48.719 08:22:00 
accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:48.719 08:22:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:48.719 08:22:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:48.719 08:22:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:48.719 08:22:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:48.719 08:22:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:48.719 08:22:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:48.719 08:22:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:48.719 08:22:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:48.719 08:22:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:48.719 08:22:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:48.719 08:22:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:48.719 08:22:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:48.719 08:22:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:48.719 08:22:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:48.719 08:22:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:48.719 08:22:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:48.719 08:22:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:48.719 08:22:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:48.719 08:22:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:48.719 08:22:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:48.719 08:22:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:48.719 08:22:00 
accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:48.719 08:22:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:48.719 08:22:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:48.719 08:22:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:48.719 08:22:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:48.719 08:22:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:48.719 08:22:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:48.719 08:22:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:48.720 08:22:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:48.720 00:08:48.720 real 0m2.933s 00:08:48.720 user 0m2.507s 00:08:48.720 sys 0m0.409s 00:08:48.720 08:22:00 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:48.720 08:22:00 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:48.720 ************************************ 00:08:48.720 END TEST accel_cdev_decomp_mthread 00:08:48.720 ************************************ 00:08:48.720 08:22:00 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:48.720 08:22:00 accel -- accel/accel.sh@133 -- # run_test accel_cdev_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:48.720 08:22:00 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:48.720 08:22:00 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:48.720 08:22:00 accel -- common/autotest_common.sh@10 -- # set +x 00:08:48.720 ************************************ 00:08:48.720 START TEST accel_cdev_decomp_full_mthread 00:08:48.720 ************************************ 00:08:48.720 
08:22:00 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:48.720 08:22:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:48.720 08:22:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:48.720 08:22:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:48.720 08:22:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:48.720 08:22:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:48.720 08:22:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:48.720 08:22:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:48.720 08:22:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:48.720 08:22:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:48.720 08:22:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:48.720 08:22:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:48.720 08:22:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:48.720 08:22:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:48.720 08:22:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:48.720 08:22:00 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@41 -- # jq -r . 00:08:48.720 [2024-07-23 08:22:00.962560] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:08:48.720 [2024-07-23 08:22:00.962639] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1364797 ] 00:08:48.720 [2024-07-23 08:22:01.084113] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:48.979 [2024-07-23 08:22:01.292725] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:49.914 [2024-07-23 08:22:02.223385] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:49.914 [2024-07-23 08:22:02.225494] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e0000147e0 PMD being used: compress_qat 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:49.914 
08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:49.914 [2024-07-23 08:22:02.232259] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e0000148c0 PMD being used: compress_qat 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:49.914 
08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:49.914 [2024-07-23 08:22:02.238544] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e0000149a0 PMD being used: compress_qat 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:49.914 08:22:02 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 
00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:49.914 08:22:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:51.819 08:22:03 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:51.819 08:22:03 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:51.819 08:22:03 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:51.819 08:22:03 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:51.819 08:22:03 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:51.819 08:22:03 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:51.819 08:22:03 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:51.819 08:22:03 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:51.819 08:22:03 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:51.819 08:22:03 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:51.819 08:22:03 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:51.819 08:22:03 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:51.819 08:22:03 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:51.819 08:22:03 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:51.819 08:22:03 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:51.819 08:22:03 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read 
-r var val 00:08:51.819 08:22:03 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:51.819 08:22:03 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:51.819 08:22:03 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:51.819 08:22:03 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:51.819 08:22:03 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:51.819 08:22:03 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:51.819 08:22:03 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:51.819 08:22:03 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:51.819 08:22:03 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:51.819 08:22:03 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:51.819 08:22:03 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:51.819 08:22:03 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:51.819 08:22:03 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:51.819 08:22:03 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:51.819 08:22:03 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:51.819 00:08:51.819 real 0m2.986s 00:08:51.819 user 0m2.574s 00:08:51.819 sys 0m0.406s 00:08:51.819 08:22:03 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:51.819 08:22:03 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:51.819 ************************************ 00:08:51.819 END TEST accel_cdev_decomp_full_mthread 00:08:51.819 ************************************ 00:08:51.819 
08:22:03 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:51.819 08:22:03 accel -- accel/accel.sh@134 -- # unset COMPRESSDEV 00:08:51.819 08:22:03 accel -- accel/accel.sh@137 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:08:51.819 08:22:03 accel -- accel/accel.sh@137 -- # build_accel_config 00:08:51.819 08:22:03 accel -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:08:51.819 08:22:03 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:51.819 08:22:03 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:51.819 08:22:03 accel -- common/autotest_common.sh@10 -- # set +x 00:08:51.819 08:22:03 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:51.819 08:22:03 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:51.819 08:22:03 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:51.819 08:22:03 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:51.819 08:22:03 accel -- accel/accel.sh@40 -- # local IFS=, 00:08:51.819 08:22:03 accel -- accel/accel.sh@41 -- # jq -r . 00:08:51.819 ************************************ 00:08:51.819 START TEST accel_dif_functional_tests 00:08:51.819 ************************************ 00:08:51.819 08:22:03 accel.accel_dif_functional_tests -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:08:51.819 [2024-07-23 08:22:04.038773] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:08:51.819 [2024-07-23 08:22:04.038860] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1365338 ] 00:08:51.819 [2024-07-23 08:22:04.157231] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:08:52.078 [2024-07-23 08:22:04.371126] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:52.078 [2024-07-23 08:22:04.371192] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:52.078 [2024-07-23 08:22:04.371200] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:52.336 00:08:52.336 00:08:52.336 CUnit - A unit testing framework for C - Version 2.1-3 00:08:52.336 http://cunit.sourceforge.net/ 00:08:52.336 00:08:52.336 00:08:52.336 Suite: accel_dif 00:08:52.336 Test: verify: DIF generated, GUARD check ...passed 00:08:52.337 Test: verify: DIF generated, APPTAG check ...passed 00:08:52.337 Test: verify: DIF generated, REFTAG check ...passed 00:08:52.337 Test: verify: DIF not generated, GUARD check ...[2024-07-23 08:22:04.754359] dif.c: 861:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:08:52.337 passed 00:08:52.337 Test: verify: DIF not generated, APPTAG check ...[2024-07-23 08:22:04.754425] dif.c: 876:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:08:52.337 passed 00:08:52.337 Test: verify: DIF not generated, REFTAG check ...[2024-07-23 08:22:04.754471] dif.c: 811:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:08:52.337 passed 00:08:52.337 Test: verify: APPTAG correct, APPTAG check ...passed 00:08:52.337 Test: verify: APPTAG incorrect, APPTAG check ...[2024-07-23 08:22:04.754544] dif.c: 876:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:08:52.337 passed 
00:08:52.337 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:08:52.337 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:08:52.337 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:08:52.337 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-07-23 08:22:04.754700] dif.c: 811:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:08:52.337 passed 00:08:52.337 Test: verify copy: DIF generated, GUARD check ...passed 00:08:52.337 Test: verify copy: DIF generated, APPTAG check ...passed 00:08:52.337 Test: verify copy: DIF generated, REFTAG check ...passed 00:08:52.337 Test: verify copy: DIF not generated, GUARD check ...[2024-07-23 08:22:04.754888] dif.c: 861:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:08:52.337 passed 00:08:52.337 Test: verify copy: DIF not generated, APPTAG check ...[2024-07-23 08:22:04.754931] dif.c: 876:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:08:52.337 passed 00:08:52.337 Test: verify copy: DIF not generated, REFTAG check ...[2024-07-23 08:22:04.754970] dif.c: 811:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:08:52.337 passed 00:08:52.337 Test: generate copy: DIF generated, GUARD check ...passed 00:08:52.337 Test: generate copy: DIF generated, APTTAG check ...passed 00:08:52.337 Test: generate copy: DIF generated, REFTAG check ...passed 00:08:52.337 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:08:52.337 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:08:52.337 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:08:52.337 Test: generate copy: iovecs-len validate ...[2024-07-23 08:22:04.755265] dif.c:1225:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:08:52.337 passed 00:08:52.337 Test: generate copy: buffer alignment validate ...passed 00:08:52.337 00:08:52.337 Run Summary: Type Total Ran Passed Failed Inactive 00:08:52.337 suites 1 1 n/a 0 0 00:08:52.337 tests 26 26 26 0 0 00:08:52.337 asserts 115 115 115 0 n/a 00:08:52.337 00:08:52.337 Elapsed time = 0.003 seconds 00:08:53.713 00:08:53.713 real 0m2.061s 00:08:53.713 user 0m4.261s 00:08:53.713 sys 0m0.243s 00:08:53.713 08:22:06 accel.accel_dif_functional_tests -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:53.713 08:22:06 accel.accel_dif_functional_tests -- common/autotest_common.sh@10 -- # set +x 00:08:53.713 ************************************ 00:08:53.713 END TEST accel_dif_functional_tests 00:08:53.713 ************************************ 00:08:53.713 08:22:06 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:53.713 00:08:53.713 real 1m30.602s 00:08:53.713 user 1m48.263s 00:08:53.713 sys 0m10.435s 00:08:53.713 08:22:06 accel -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:53.713 08:22:06 accel -- common/autotest_common.sh@10 -- # set +x 00:08:53.713 ************************************ 00:08:53.713 END TEST accel 00:08:53.713 ************************************ 00:08:53.713 08:22:06 -- common/autotest_common.sh@1142 -- # return 0 00:08:53.713 08:22:06 -- spdk/autotest.sh@184 -- # run_test accel_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:08:53.713 08:22:06 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:53.713 08:22:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:53.713 08:22:06 -- common/autotest_common.sh@10 -- # set +x 00:08:53.713 ************************************ 00:08:53.713 START TEST accel_rpc 00:08:53.713 ************************************ 00:08:53.713 08:22:06 accel_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:08:53.713 * Looking for test storage... 
00:08:53.713 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:08:53.713 08:22:06 accel_rpc -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:53.713 08:22:06 accel_rpc -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=1365882 00:08:53.713 08:22:06 accel_rpc -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:08:53.713 08:22:06 accel_rpc -- accel/accel_rpc.sh@15 -- # waitforlisten 1365882 00:08:53.713 08:22:06 accel_rpc -- common/autotest_common.sh@829 -- # '[' -z 1365882 ']' 00:08:53.713 08:22:06 accel_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:53.713 08:22:06 accel_rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:53.713 08:22:06 accel_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:53.713 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:53.714 08:22:06 accel_rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:53.714 08:22:06 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:53.971 [2024-07-23 08:22:06.288249] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:08:53.971 [2024-07-23 08:22:06.288343] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1365882 ] 00:08:53.971 [2024-07-23 08:22:06.411258] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:54.229 [2024-07-23 08:22:06.612802] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:54.796 08:22:07 accel_rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:54.796 08:22:07 accel_rpc -- common/autotest_common.sh@862 -- # return 0 00:08:54.796 08:22:07 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:08:54.796 08:22:07 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:08:54.796 08:22:07 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:08:54.796 08:22:07 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:08:54.796 08:22:07 accel_rpc -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:08:54.796 08:22:07 accel_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:54.796 08:22:07 accel_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:54.796 08:22:07 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:54.796 ************************************ 00:08:54.796 START TEST accel_assign_opcode 00:08:54.796 ************************************ 00:08:54.796 08:22:07 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1123 -- # accel_assign_opcode_test_suite 00:08:54.796 08:22:07 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:08:54.796 08:22:07 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:54.796 08:22:07 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:54.796 [2024-07-23 08:22:07.074569] accel_rpc.c: 
167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:08:54.796 08:22:07 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:54.796 08:22:07 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:08:54.796 08:22:07 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:54.796 08:22:07 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:54.796 [2024-07-23 08:22:07.082550] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:08:54.796 08:22:07 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:54.796 08:22:07 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:08:54.796 08:22:07 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:54.796 08:22:07 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:55.729 08:22:08 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:55.729 08:22:08 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:08:55.729 08:22:08 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:08:55.729 08:22:08 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:55.729 08:22:08 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # grep software 00:08:55.729 08:22:08 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:55.729 08:22:08 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:55.729 software 00:08:55.729 00:08:55.729 real 0m1.009s 00:08:55.729 user 0m0.049s 00:08:55.729 sys 0m0.009s 00:08:55.729 08:22:08 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1124 -- # 
xtrace_disable 00:08:55.729 08:22:08 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:55.729 ************************************ 00:08:55.729 END TEST accel_assign_opcode 00:08:55.729 ************************************ 00:08:55.729 08:22:08 accel_rpc -- common/autotest_common.sh@1142 -- # return 0 00:08:55.729 08:22:08 accel_rpc -- accel/accel_rpc.sh@55 -- # killprocess 1365882 00:08:55.729 08:22:08 accel_rpc -- common/autotest_common.sh@948 -- # '[' -z 1365882 ']' 00:08:55.729 08:22:08 accel_rpc -- common/autotest_common.sh@952 -- # kill -0 1365882 00:08:55.729 08:22:08 accel_rpc -- common/autotest_common.sh@953 -- # uname 00:08:55.729 08:22:08 accel_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:55.729 08:22:08 accel_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1365882 00:08:55.729 08:22:08 accel_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:55.729 08:22:08 accel_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:55.729 08:22:08 accel_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1365882' 00:08:55.729 killing process with pid 1365882 00:08:55.729 08:22:08 accel_rpc -- common/autotest_common.sh@967 -- # kill 1365882 00:08:55.729 08:22:08 accel_rpc -- common/autotest_common.sh@972 -- # wait 1365882 00:08:58.260 00:08:58.260 real 0m4.416s 00:08:58.260 user 0m4.284s 00:08:58.260 sys 0m0.546s 00:08:58.260 08:22:10 accel_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:58.260 08:22:10 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:58.260 ************************************ 00:08:58.260 END TEST accel_rpc 00:08:58.260 ************************************ 00:08:58.260 08:22:10 -- common/autotest_common.sh@1142 -- # return 0 00:08:58.260 08:22:10 -- spdk/autotest.sh@185 -- # run_test app_cmdline /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:08:58.260 08:22:10 -- 
common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:58.260 08:22:10 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:58.260 08:22:10 -- common/autotest_common.sh@10 -- # set +x 00:08:58.260 ************************************ 00:08:58.260 START TEST app_cmdline 00:08:58.260 ************************************ 00:08:58.260 08:22:10 app_cmdline -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:08:58.260 * Looking for test storage... 00:08:58.260 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:08:58.260 08:22:10 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:08:58.260 08:22:10 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=1366724 00:08:58.260 08:22:10 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 1366724 00:08:58.260 08:22:10 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:08:58.260 08:22:10 app_cmdline -- common/autotest_common.sh@829 -- # '[' -z 1366724 ']' 00:08:58.260 08:22:10 app_cmdline -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:58.260 08:22:10 app_cmdline -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:58.260 08:22:10 app_cmdline -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:58.260 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:58.260 08:22:10 app_cmdline -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:58.260 08:22:10 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:58.260 [2024-07-23 08:22:10.770208] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:08:58.260 [2024-07-23 08:22:10.770319] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1366724 ] 00:08:58.518 [2024-07-23 08:22:10.895419] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:58.776 [2024-07-23 08:22:11.102018] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:59.712 08:22:12 app_cmdline -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:59.712 08:22:12 app_cmdline -- common/autotest_common.sh@862 -- # return 0 00:08:59.712 08:22:12 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:08:59.970 { 00:08:59.970 "version": "SPDK v24.09-pre git sha1 f7b31b2b9", 00:08:59.970 "fields": { 00:08:59.970 "major": 24, 00:08:59.970 "minor": 9, 00:08:59.970 "patch": 0, 00:08:59.970 "suffix": "-pre", 00:08:59.970 "commit": "f7b31b2b9" 00:08:59.970 } 00:08:59.970 } 00:08:59.970 08:22:12 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:08:59.970 08:22:12 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:08:59.970 08:22:12 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:08:59.970 08:22:12 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:08:59.970 08:22:12 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:08:59.970 08:22:12 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:08:59.970 08:22:12 app_cmdline -- app/cmdline.sh@26 -- # sort 00:08:59.970 08:22:12 app_cmdline -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:59.970 08:22:12 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:59.970 08:22:12 app_cmdline -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:59.970 08:22:12 app_cmdline -- 
app/cmdline.sh@27 -- # (( 2 == 2 )) 00:08:59.970 08:22:12 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:08:59.970 08:22:12 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:59.970 08:22:12 app_cmdline -- common/autotest_common.sh@648 -- # local es=0 00:08:59.970 08:22:12 app_cmdline -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:59.970 08:22:12 app_cmdline -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:59.970 08:22:12 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:59.970 08:22:12 app_cmdline -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:59.970 08:22:12 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:59.970 08:22:12 app_cmdline -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:59.970 08:22:12 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:59.970 08:22:12 app_cmdline -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:59.970 08:22:12 app_cmdline -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:08:59.970 08:22:12 app_cmdline -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:59.970 request: 00:08:59.970 { 00:08:59.970 "method": "env_dpdk_get_mem_stats", 00:08:59.970 "req_id": 1 00:08:59.970 } 00:08:59.970 Got JSON-RPC error response 00:08:59.970 response: 00:08:59.970 { 00:08:59.970 
"code": -32601, 00:08:59.970 "message": "Method not found" 00:08:59.970 } 00:08:59.970 08:22:12 app_cmdline -- common/autotest_common.sh@651 -- # es=1 00:08:59.970 08:22:12 app_cmdline -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:08:59.970 08:22:12 app_cmdline -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:08:59.970 08:22:12 app_cmdline -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:08:59.970 08:22:12 app_cmdline -- app/cmdline.sh@1 -- # killprocess 1366724 00:08:59.970 08:22:12 app_cmdline -- common/autotest_common.sh@948 -- # '[' -z 1366724 ']' 00:08:59.970 08:22:12 app_cmdline -- common/autotest_common.sh@952 -- # kill -0 1366724 00:08:59.970 08:22:12 app_cmdline -- common/autotest_common.sh@953 -- # uname 00:08:59.970 08:22:12 app_cmdline -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:59.970 08:22:12 app_cmdline -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1366724 00:08:59.970 08:22:12 app_cmdline -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:59.970 08:22:12 app_cmdline -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:00.229 08:22:12 app_cmdline -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1366724' 00:09:00.229 killing process with pid 1366724 00:09:00.229 08:22:12 app_cmdline -- common/autotest_common.sh@967 -- # kill 1366724 00:09:00.229 08:22:12 app_cmdline -- common/autotest_common.sh@972 -- # wait 1366724 00:09:02.827 00:09:02.827 real 0m4.356s 00:09:02.827 user 0m4.495s 00:09:02.827 sys 0m0.568s 00:09:02.827 08:22:14 app_cmdline -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:02.827 08:22:14 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:09:02.827 ************************************ 00:09:02.827 END TEST app_cmdline 00:09:02.827 ************************************ 00:09:02.827 08:22:14 -- common/autotest_common.sh@1142 -- # return 0 00:09:02.827 08:22:14 -- spdk/autotest.sh@186 -- # run_test version 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:09:02.827 08:22:14 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:02.827 08:22:14 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:02.827 08:22:14 -- common/autotest_common.sh@10 -- # set +x 00:09:02.827 ************************************ 00:09:02.827 START TEST version 00:09:02.827 ************************************ 00:09:02.827 08:22:15 version -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:09:02.827 * Looking for test storage... 00:09:02.827 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:09:02.827 08:22:15 version -- app/version.sh@17 -- # get_header_version major 00:09:02.827 08:22:15 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:09:02.827 08:22:15 version -- app/version.sh@14 -- # cut -f2 00:09:02.827 08:22:15 version -- app/version.sh@14 -- # tr -d '"' 00:09:02.827 08:22:15 version -- app/version.sh@17 -- # major=24 00:09:02.827 08:22:15 version -- app/version.sh@18 -- # get_header_version minor 00:09:02.827 08:22:15 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:09:02.827 08:22:15 version -- app/version.sh@14 -- # cut -f2 00:09:02.827 08:22:15 version -- app/version.sh@14 -- # tr -d '"' 00:09:02.827 08:22:15 version -- app/version.sh@18 -- # minor=9 00:09:02.827 08:22:15 version -- app/version.sh@19 -- # get_header_version patch 00:09:02.827 08:22:15 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:09:02.827 08:22:15 version -- app/version.sh@14 -- # cut -f2 00:09:02.827 08:22:15 version -- app/version.sh@14 -- # tr -d '"' 00:09:02.827 
08:22:15 version -- app/version.sh@19 -- # patch=0 00:09:02.827 08:22:15 version -- app/version.sh@20 -- # get_header_version suffix 00:09:02.827 08:22:15 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:09:02.827 08:22:15 version -- app/version.sh@14 -- # tr -d '"' 00:09:02.827 08:22:15 version -- app/version.sh@14 -- # cut -f2 00:09:02.827 08:22:15 version -- app/version.sh@20 -- # suffix=-pre 00:09:02.827 08:22:15 version -- app/version.sh@22 -- # version=24.9 00:09:02.827 08:22:15 version -- app/version.sh@25 -- # (( patch != 0 )) 00:09:02.827 08:22:15 version -- app/version.sh@28 -- # version=24.9rc0 00:09:02.827 08:22:15 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:09:02.827 08:22:15 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:09:02.827 08:22:15 version -- app/version.sh@30 -- # py_version=24.9rc0 00:09:02.827 08:22:15 version -- app/version.sh@31 -- # [[ 24.9rc0 == \2\4\.\9\r\c\0 ]] 00:09:02.827 00:09:02.827 real 0m0.145s 00:09:02.827 user 0m0.075s 00:09:02.827 sys 0m0.103s 00:09:02.827 08:22:15 version -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:02.827 08:22:15 version -- common/autotest_common.sh@10 -- # set +x 00:09:02.827 ************************************ 00:09:02.827 END TEST version 00:09:02.827 ************************************ 00:09:02.827 08:22:15 -- common/autotest_common.sh@1142 -- # return 0 00:09:02.827 08:22:15 -- spdk/autotest.sh@188 -- # '[' 1 -eq 1 ']' 00:09:02.827 08:22:15 -- spdk/autotest.sh@189 -- # run_test blockdev_general 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:09:02.827 08:22:15 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:02.827 08:22:15 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:02.827 08:22:15 -- common/autotest_common.sh@10 -- # set +x 00:09:02.827 ************************************ 00:09:02.827 START TEST blockdev_general 00:09:02.827 ************************************ 00:09:02.827 08:22:15 blockdev_general -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:09:02.827 * Looking for test storage... 00:09:02.827 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:09:02.827 08:22:15 blockdev_general -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:09:02.827 08:22:15 blockdev_general -- bdev/nbd_common.sh@6 -- # set -e 00:09:02.827 08:22:15 blockdev_general -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:09:02.827 08:22:15 blockdev_general -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:09:02.827 08:22:15 blockdev_general -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:09:02.827 08:22:15 blockdev_general -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:09:02.827 08:22:15 blockdev_general -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:09:02.827 08:22:15 blockdev_general -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:09:02.828 08:22:15 blockdev_general -- bdev/blockdev.sh@20 -- # : 00:09:02.828 08:22:15 blockdev_general -- bdev/blockdev.sh@669 -- # QOS_DEV_1=Malloc_0 00:09:02.828 08:22:15 blockdev_general -- bdev/blockdev.sh@670 -- # QOS_DEV_2=Null_1 00:09:02.828 08:22:15 blockdev_general -- bdev/blockdev.sh@671 -- # QOS_RUN_TIME=5 
00:09:02.828 08:22:15 blockdev_general -- bdev/blockdev.sh@673 -- # uname -s 00:09:02.828 08:22:15 blockdev_general -- bdev/blockdev.sh@673 -- # '[' Linux = Linux ']' 00:09:02.828 08:22:15 blockdev_general -- bdev/blockdev.sh@675 -- # PRE_RESERVED_MEM=0 00:09:02.828 08:22:15 blockdev_general -- bdev/blockdev.sh@681 -- # test_type=bdev 00:09:02.828 08:22:15 blockdev_general -- bdev/blockdev.sh@682 -- # crypto_device= 00:09:02.828 08:22:15 blockdev_general -- bdev/blockdev.sh@683 -- # dek= 00:09:02.828 08:22:15 blockdev_general -- bdev/blockdev.sh@684 -- # env_ctx= 00:09:02.828 08:22:15 blockdev_general -- bdev/blockdev.sh@685 -- # wait_for_rpc= 00:09:02.828 08:22:15 blockdev_general -- bdev/blockdev.sh@686 -- # '[' -n '' ']' 00:09:02.828 08:22:15 blockdev_general -- bdev/blockdev.sh@689 -- # [[ bdev == bdev ]] 00:09:02.828 08:22:15 blockdev_general -- bdev/blockdev.sh@690 -- # wait_for_rpc=--wait-for-rpc 00:09:02.828 08:22:15 blockdev_general -- bdev/blockdev.sh@692 -- # start_spdk_tgt 00:09:02.828 08:22:15 blockdev_general -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1367683 00:09:02.828 08:22:15 blockdev_general -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:09:02.828 08:22:15 blockdev_general -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:09:02.828 08:22:15 blockdev_general -- bdev/blockdev.sh@49 -- # waitforlisten 1367683 00:09:02.828 08:22:15 blockdev_general -- common/autotest_common.sh@829 -- # '[' -z 1367683 ']' 00:09:02.828 08:22:15 blockdev_general -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:02.828 08:22:15 blockdev_general -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:02.828 08:22:15 blockdev_general -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:09:02.828 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:02.828 08:22:15 blockdev_general -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:02.828 08:22:15 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:03.086 [2024-07-23 08:22:15.404415] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:09:03.086 [2024-07-23 08:22:15.404524] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1367683 ] 00:09:03.086 [2024-07-23 08:22:15.525923] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:03.344 [2024-07-23 08:22:15.732794] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:03.910 08:22:16 blockdev_general -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:03.910 08:22:16 blockdev_general -- common/autotest_common.sh@862 -- # return 0 00:09:03.910 08:22:16 blockdev_general -- bdev/blockdev.sh@693 -- # case "$test_type" in 00:09:03.910 08:22:16 blockdev_general -- bdev/blockdev.sh@695 -- # setup_bdev_conf 00:09:03.910 08:22:16 blockdev_general -- bdev/blockdev.sh@53 -- # rpc_cmd 00:09:03.910 08:22:16 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:03.910 08:22:16 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:04.845 [2024-07-23 08:22:17.128817] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:04.845 [2024-07-23 08:22:17.128871] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:04.845 00:09:04.845 [2024-07-23 08:22:17.136814] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:04.845 [2024-07-23 08:22:17.136846] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to 
find bdev with name: Malloc2 00:09:04.845 00:09:04.845 Malloc0 00:09:04.845 Malloc1 00:09:04.845 Malloc2 00:09:04.845 Malloc3 00:09:04.845 Malloc4 00:09:05.104 Malloc5 00:09:05.104 Malloc6 00:09:05.104 Malloc7 00:09:05.104 Malloc8 00:09:05.104 Malloc9 00:09:05.104 [2024-07-23 08:22:17.583296] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:05.104 [2024-07-23 08:22:17.583346] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:05.104 [2024-07-23 08:22:17.583365] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x61600003a280 00:09:05.104 [2024-07-23 08:22:17.583374] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:05.104 [2024-07-23 08:22:17.585304] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:05.104 [2024-07-23 08:22:17.585332] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:05.104 TestPT 00:09:05.364 08:22:17 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:05.364 08:22:17 blockdev_general -- bdev/blockdev.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile bs=2048 count=5000 00:09:05.364 5000+0 records in 00:09:05.364 5000+0 records out 00:09:05.364 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0170972 s, 599 MB/s 00:09:05.364 08:22:17 blockdev_general -- bdev/blockdev.sh@77 -- # rpc_cmd bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile AIO0 2048 00:09:05.364 08:22:17 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:05.364 08:22:17 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:05.364 AIO0 00:09:05.364 08:22:17 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:05.364 08:22:17 blockdev_general -- bdev/blockdev.sh@736 -- # rpc_cmd bdev_wait_for_examine 00:09:05.364 08:22:17 blockdev_general -- 
common/autotest_common.sh@559 -- # xtrace_disable 00:09:05.364 08:22:17 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:05.364 08:22:17 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:05.364 08:22:17 blockdev_general -- bdev/blockdev.sh@739 -- # cat 00:09:05.364 08:22:17 blockdev_general -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n accel 00:09:05.364 08:22:17 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:05.364 08:22:17 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:05.364 08:22:17 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:05.364 08:22:17 blockdev_general -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n bdev 00:09:05.364 08:22:17 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:05.364 08:22:17 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:05.364 08:22:17 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:05.364 08:22:17 blockdev_general -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n iobuf 00:09:05.364 08:22:17 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:05.364 08:22:17 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:05.364 08:22:17 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:05.364 08:22:17 blockdev_general -- bdev/blockdev.sh@747 -- # mapfile -t bdevs 00:09:05.364 08:22:17 blockdev_general -- bdev/blockdev.sh@747 -- # rpc_cmd bdev_get_bdevs 00:09:05.364 08:22:17 blockdev_general -- bdev/blockdev.sh@747 -- # jq -r '.[] | select(.claimed == false)' 00:09:05.364 08:22:17 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:05.364 08:22:17 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:05.364 08:22:17 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:05.364 08:22:17 blockdev_general 
-- bdev/blockdev.sh@748 -- # mapfile -t bdevs_name 00:09:05.364 08:22:17 blockdev_general -- bdev/blockdev.sh@748 -- # jq -r .name 00:09:05.365 08:22:17 blockdev_general -- bdev/blockdev.sh@748 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "86dea539-55d6-4d4b-9c97-7e50c910fee0"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "86dea539-55d6-4d4b-9c97-7e50c910fee0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "e39005d9-eb59-5112-990d-8bf611531d00"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "e39005d9-eb59-5112-990d-8bf611531d00",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' 
"compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "f82f2244-f75d-5d24-bb0d-ff9a01293d4a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "f82f2244-f75d-5d24-bb0d-ff9a01293d4a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "c3f3088d-637e-5e3c-97ff-3117d2d1fafd"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "c3f3088d-637e-5e3c-97ff-3117d2d1fafd",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' 
"abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "f3ffcb71-d42c-5959-a770-2d5b0bdd63fa"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f3ffcb71-d42c-5959-a770-2d5b0bdd63fa",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "15c1e107-d95e-530a-b4f1-318ccb8c0ae9"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "15c1e107-d95e-530a-b4f1-318ccb8c0ae9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": 
false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "7cc67248-6b8e-59d5-a843-1ff43c598d4e"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "7cc67248-6b8e-59d5-a843-1ff43c598d4e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "55ca48ce-b3bf-5c1b-b4da-4d5aa111fa19"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "55ca48ce-b3bf-5c1b-b4da-4d5aa111fa19",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' 
' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "16ac8d3a-badb-5b19-997d-e115cca7e58c"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "16ac8d3a-badb-5b19-997d-e115cca7e58c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "b08a8b1f-ffb3-5c60-ae5b-583e5c6a5be5"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b08a8b1f-ffb3-5c60-ae5b-583e5c6a5be5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' 
"base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "7df619ab-fe78-503a-b9c8-d86fcb95d5b3"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "7df619ab-fe78-503a-b9c8-d86fcb95d5b3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "1c79a4fa-9abf-5792-b915-b724f7c5cddb"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "1c79a4fa-9abf-5792-b915-b724f7c5cddb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' 
' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "95ec41d3-1042-40aa-bc7a-6d2f873f625c"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "95ec41d3-1042-40aa-bc7a-6d2f873f625c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "95ec41d3-1042-40aa-bc7a-6d2f873f625c",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "ae695d5c-b5ea-416c-849a-e53d01863ebc",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "4359d4c9-15a1-48fb-9536-6bb972046090",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' 
'}' '{' ' "name": "concat0",' ' "aliases": [' ' "2be01f71-a8b8-42ef-b361-62ae147bcde7"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "2be01f71-a8b8-42ef-b361-62ae147bcde7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "2be01f71-a8b8-42ef-b361-62ae147bcde7",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "e25391a7-8267-4ea0-b886-b51bf6e9b6ba",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "149117b7-8df9-47ce-b12c-f6592749ad8f",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "059d40a9-e4c7-4753-846b-967d35fe9c43"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' 
"uuid": "059d40a9-e4c7-4753-846b-967d35fe9c43",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "059d40a9-e4c7-4753-846b-967d35fe9c43",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "fce39a20-491e-4d10-8f3f-342deff5c1fe",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "9c826ed0-d461-4810-8eb5-17079072c107",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "ca0a08f1-6b85-478d-b4a6-a12fc01576b9"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "ca0a08f1-6b85-478d-b4a6-a12fc01576b9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' 
' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:09:05.623 08:22:17 blockdev_general -- bdev/blockdev.sh@749 -- # bdev_list=("${bdevs_name[@]}") 00:09:05.623 08:22:17 blockdev_general -- bdev/blockdev.sh@751 -- # hello_world_bdev=Malloc0 00:09:05.623 08:22:17 blockdev_general -- bdev/blockdev.sh@752 -- # trap - SIGINT SIGTERM EXIT 00:09:05.623 08:22:17 blockdev_general -- bdev/blockdev.sh@753 -- # killprocess 1367683 00:09:05.623 08:22:17 blockdev_general -- common/autotest_common.sh@948 -- # '[' -z 1367683 ']' 00:09:05.623 08:22:17 blockdev_general -- common/autotest_common.sh@952 -- # kill -0 1367683 00:09:05.623 08:22:17 blockdev_general -- common/autotest_common.sh@953 -- # uname 00:09:05.623 08:22:17 blockdev_general -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:05.623 08:22:17 blockdev_general -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1367683 00:09:05.623 08:22:17 blockdev_general -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:05.623 08:22:17 blockdev_general -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:05.623 08:22:17 blockdev_general -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1367683' 00:09:05.623 killing process with pid 1367683 00:09:05.623 08:22:17 blockdev_general -- 
common/autotest_common.sh@967 -- # kill 1367683 00:09:05.623 08:22:17 blockdev_general -- common/autotest_common.sh@972 -- # wait 1367683 00:09:08.905 08:22:21 blockdev_general -- bdev/blockdev.sh@757 -- # trap cleanup SIGINT SIGTERM EXIT 00:09:08.905 08:22:21 blockdev_general -- bdev/blockdev.sh@759 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:09:08.905 08:22:21 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:09:08.905 08:22:21 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:08.905 08:22:21 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:08.905 ************************************ 00:09:08.905 START TEST bdev_hello_world 00:09:08.905 ************************************ 00:09:08.905 08:22:21 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:09:09.163 [2024-07-23 08:22:21.469679] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:09:09.163 [2024-07-23 08:22:21.469760] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1368879 ] 00:09:09.163 [2024-07-23 08:22:21.589722] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:09.421 [2024-07-23 08:22:21.802060] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:09.986 [2024-07-23 08:22:22.298401] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:09.986 [2024-07-23 08:22:22.298459] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:09.986 [2024-07-23 08:22:22.298488] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:09.986 [2024-07-23 08:22:22.306392] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:09.986 [2024-07-23 08:22:22.306427] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:09.986 [2024-07-23 08:22:22.314409] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:09.986 [2024-07-23 08:22:22.314437] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:09.986 [2024-07-23 08:22:22.502645] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:09.986 [2024-07-23 08:22:22.502706] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:09.986 [2024-07-23 08:22:22.502720] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000037880 00:09:09.986 [2024-07-23 08:22:22.502729] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:09.986 [2024-07-23 08:22:22.504698] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: 
pt_bdev registered 00:09:09.986 [2024-07-23 08:22:22.504725] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:10.554 [2024-07-23 08:22:22.856162] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:09:10.554 [2024-07-23 08:22:22.856211] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev Malloc0 00:09:10.554 [2024-07-23 08:22:22.856246] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:09:10.554 [2024-07-23 08:22:22.856302] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:09:10.554 [2024-07-23 08:22:22.856355] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:09:10.554 [2024-07-23 08:22:22.856372] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:09:10.554 [2024-07-23 08:22:22.856412] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:09:10.554 00:09:10.554 [2024-07-23 08:22:22.856437] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:09:13.083 00:09:13.084 real 0m3.686s 00:09:13.084 user 0m3.234s 00:09:13.084 sys 0m0.358s 00:09:13.084 08:22:25 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:13.084 08:22:25 blockdev_general.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:09:13.084 ************************************ 00:09:13.084 END TEST bdev_hello_world 00:09:13.084 ************************************ 00:09:13.084 08:22:25 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:13.084 08:22:25 blockdev_general -- bdev/blockdev.sh@760 -- # run_test bdev_bounds bdev_bounds '' 00:09:13.084 08:22:25 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:09:13.084 08:22:25 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:13.084 08:22:25 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:13.084 ************************************ 00:09:13.084 START 
TEST bdev_bounds 00:09:13.084 ************************************ 00:09:13.084 08:22:25 blockdev_general.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:09:13.084 08:22:25 blockdev_general.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=1369648 00:09:13.084 08:22:25 blockdev_general.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:09:13.084 08:22:25 blockdev_general.bdev_bounds -- bdev/blockdev.sh@288 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:09:13.084 08:22:25 blockdev_general.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 1369648' 00:09:13.084 Process bdevio pid: 1369648 00:09:13.084 08:22:25 blockdev_general.bdev_bounds -- bdev/blockdev.sh@292 -- # waitforlisten 1369648 00:09:13.084 08:22:25 blockdev_general.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 1369648 ']' 00:09:13.084 08:22:25 blockdev_general.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:13.084 08:22:25 blockdev_general.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:13.084 08:22:25 blockdev_general.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:13.084 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:13.084 08:22:25 blockdev_general.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:13.084 08:22:25 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:09:13.084 [2024-07-23 08:22:25.224015] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:09:13.084 [2024-07-23 08:22:25.224102] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1369648 ] 00:09:13.084 [2024-07-23 08:22:25.343923] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:09:13.084 [2024-07-23 08:22:25.558607] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:13.084 [2024-07-23 08:22:25.558683] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:13.084 [2024-07-23 08:22:25.558692] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:09:13.650 [2024-07-23 08:22:26.007232] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:13.650 [2024-07-23 08:22:26.007294] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:13.650 [2024-07-23 08:22:26.007322] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:13.650 [2024-07-23 08:22:26.015245] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:13.650 [2024-07-23 08:22:26.015277] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:13.650 [2024-07-23 08:22:26.023260] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:13.650 [2024-07-23 08:22:26.023291] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:13.908 [2024-07-23 08:22:26.222886] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:13.908 [2024-07-23 08:22:26.222940] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:13.908 [2024-07-23 08:22:26.222973] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 
0x0x616000037e80 00:09:13.908 [2024-07-23 08:22:26.222983] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:13.908 [2024-07-23 08:22:26.225000] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:13.908 [2024-07-23 08:22:26.225028] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:14.166 08:22:26 blockdev_general.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:14.166 08:22:26 blockdev_general.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:09:14.166 08:22:26 blockdev_general.bdev_bounds -- bdev/blockdev.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:09:14.427 I/O targets: 00:09:14.427 Malloc0: 65536 blocks of 512 bytes (32 MiB) 00:09:14.427 Malloc1p0: 32768 blocks of 512 bytes (16 MiB) 00:09:14.427 Malloc1p1: 32768 blocks of 512 bytes (16 MiB) 00:09:14.427 Malloc2p0: 8192 blocks of 512 bytes (4 MiB) 00:09:14.427 Malloc2p1: 8192 blocks of 512 bytes (4 MiB) 00:09:14.427 Malloc2p2: 8192 blocks of 512 bytes (4 MiB) 00:09:14.427 Malloc2p3: 8192 blocks of 512 bytes (4 MiB) 00:09:14.427 Malloc2p4: 8192 blocks of 512 bytes (4 MiB) 00:09:14.427 Malloc2p5: 8192 blocks of 512 bytes (4 MiB) 00:09:14.427 Malloc2p6: 8192 blocks of 512 bytes (4 MiB) 00:09:14.427 Malloc2p7: 8192 blocks of 512 bytes (4 MiB) 00:09:14.427 TestPT: 65536 blocks of 512 bytes (32 MiB) 00:09:14.427 raid0: 131072 blocks of 512 bytes (64 MiB) 00:09:14.427 concat0: 131072 blocks of 512 bytes (64 MiB) 00:09:14.427 raid1: 65536 blocks of 512 bytes (32 MiB) 00:09:14.427 AIO0: 5000 blocks of 2048 bytes (10 MiB) 00:09:14.427 00:09:14.427 00:09:14.427 CUnit - A unit testing framework for C - Version 2.1-3 00:09:14.427 http://cunit.sourceforge.net/ 00:09:14.427 00:09:14.427 00:09:14.427 Suite: bdevio tests on: AIO0 00:09:14.427 Test: blockdev write read block ...passed 00:09:14.427 Test: blockdev write zeroes read block ...passed 
00:09:14.427 Test: blockdev write zeroes read no split ...passed 00:09:14.427 Test: blockdev write zeroes read split ...passed 00:09:14.427 Test: blockdev write zeroes read split partial ...passed 00:09:14.427 Test: blockdev reset ...passed 00:09:14.427 Test: blockdev write read 8 blocks ...passed 00:09:14.427 Test: blockdev write read size > 128k ...passed 00:09:14.427 Test: blockdev write read invalid size ...passed 00:09:14.427 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:14.427 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:14.427 Test: blockdev write read max offset ...passed 00:09:14.427 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:14.427 Test: blockdev writev readv 8 blocks ...passed 00:09:14.427 Test: blockdev writev readv 30 x 1block ...passed 00:09:14.427 Test: blockdev writev readv block ...passed 00:09:14.427 Test: blockdev writev readv size > 128k ...passed 00:09:14.427 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:14.427 Test: blockdev comparev and writev ...passed 00:09:14.427 Test: blockdev nvme passthru rw ...passed 00:09:14.427 Test: blockdev nvme passthru vendor specific ...passed 00:09:14.427 Test: blockdev nvme admin passthru ...passed 00:09:14.427 Test: blockdev copy ...passed 00:09:14.427 Suite: bdevio tests on: raid1 00:09:14.427 Test: blockdev write read block ...passed 00:09:14.427 Test: blockdev write zeroes read block ...passed 00:09:14.427 Test: blockdev write zeroes read no split ...passed 00:09:14.427 Test: blockdev write zeroes read split ...passed 00:09:14.427 Test: blockdev write zeroes read split partial ...passed 00:09:14.427 Test: blockdev reset ...passed 00:09:14.427 Test: blockdev write read 8 blocks ...passed 00:09:14.427 Test: blockdev write read size > 128k ...passed 00:09:14.427 Test: blockdev write read invalid size ...passed 00:09:14.427 Test: blockdev write read offset + nbytes == size of blockdev 
...passed 00:09:14.427 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:14.427 Test: blockdev write read max offset ...passed 00:09:14.427 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:14.427 Test: blockdev writev readv 8 blocks ...passed 00:09:14.427 Test: blockdev writev readv 30 x 1block ...passed 00:09:14.427 Test: blockdev writev readv block ...passed 00:09:14.427 Test: blockdev writev readv size > 128k ...passed 00:09:14.427 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:14.427 Test: blockdev comparev and writev ...passed 00:09:14.427 Test: blockdev nvme passthru rw ...passed 00:09:14.427 Test: blockdev nvme passthru vendor specific ...passed 00:09:14.427 Test: blockdev nvme admin passthru ...passed 00:09:14.427 Test: blockdev copy ...passed 00:09:14.427 Suite: bdevio tests on: concat0 00:09:14.427 Test: blockdev write read block ...passed 00:09:14.427 Test: blockdev write zeroes read block ...passed 00:09:14.427 Test: blockdev write zeroes read no split ...passed 00:09:14.427 Test: blockdev write zeroes read split ...passed 00:09:14.427 Test: blockdev write zeroes read split partial ...passed 00:09:14.427 Test: blockdev reset ...passed 00:09:14.428 Test: blockdev write read 8 blocks ...passed 00:09:14.428 Test: blockdev write read size > 128k ...passed 00:09:14.428 Test: blockdev write read invalid size ...passed 00:09:14.428 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:14.428 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:14.428 Test: blockdev write read max offset ...passed 00:09:14.428 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:14.428 Test: blockdev writev readv 8 blocks ...passed 00:09:14.428 Test: blockdev writev readv 30 x 1block ...passed 00:09:14.428 Test: blockdev writev readv block ...passed 00:09:14.428 Test: blockdev writev readv size > 128k ...passed 00:09:14.428 
Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:14.428 Test: blockdev comparev and writev ...passed 00:09:14.428 Test: blockdev nvme passthru rw ...passed 00:09:14.428 Test: blockdev nvme passthru vendor specific ...passed 00:09:14.428 Test: blockdev nvme admin passthru ...passed 00:09:14.428 Test: blockdev copy ...passed 00:09:14.428 Suite: bdevio tests on: raid0 00:09:14.428 Test: blockdev write read block ...passed 00:09:14.428 Test: blockdev write zeroes read block ...passed 00:09:14.428 Test: blockdev write zeroes read no split ...passed 00:09:14.686 Test: blockdev write zeroes read split ...passed 00:09:14.686 Test: blockdev write zeroes read split partial ...passed 00:09:14.686 Test: blockdev reset ...passed 00:09:14.686 Test: blockdev write read 8 blocks ...passed 00:09:14.686 Test: blockdev write read size > 128k ...passed 00:09:14.686 Test: blockdev write read invalid size ...passed 00:09:14.686 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:14.686 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:14.686 Test: blockdev write read max offset ...passed 00:09:14.686 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:14.686 Test: blockdev writev readv 8 blocks ...passed 00:09:14.686 Test: blockdev writev readv 30 x 1block ...passed 00:09:14.686 Test: blockdev writev readv block ...passed 00:09:14.686 Test: blockdev writev readv size > 128k ...passed 00:09:14.686 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:14.686 Test: blockdev comparev and writev ...passed 00:09:14.686 Test: blockdev nvme passthru rw ...passed 00:09:14.686 Test: blockdev nvme passthru vendor specific ...passed 00:09:14.686 Test: blockdev nvme admin passthru ...passed 00:09:14.686 Test: blockdev copy ...passed 00:09:14.686 Suite: bdevio tests on: TestPT 00:09:14.686 Test: blockdev write read block ...passed 00:09:14.686 Test: blockdev write zeroes read block 
...passed
00:09:14.686 Test: blockdev write zeroes read no split ...passed
00:09:14.686 Test: blockdev write zeroes read split ...passed
00:09:14.686 Test: blockdev write zeroes read split partial ...passed
00:09:14.686 Test: blockdev reset ...passed
00:09:14.686 Test: blockdev write read 8 blocks ...passed
00:09:14.686 Test: blockdev write read size > 128k ...passed
00:09:14.686 Test: blockdev write read invalid size ...passed
00:09:14.686 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:09:14.686 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:09:14.686 Test: blockdev write read max offset ...passed
00:09:14.686 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:09:14.686 Test: blockdev writev readv 8 blocks ...passed
00:09:14.686 Test: blockdev writev readv 30 x 1block ...passed
00:09:14.686 Test: blockdev writev readv block ...passed
00:09:14.686 Test: blockdev writev readv size > 128k ...passed
00:09:14.686 Test: blockdev writev readv size > 128k in two iovs ...passed
00:09:14.686 Test: blockdev comparev and writev ...passed
00:09:14.686 Test: blockdev nvme passthru rw ...passed
00:09:14.686 Test: blockdev nvme passthru vendor specific ...passed
00:09:14.686 Test: blockdev nvme admin passthru ...passed
00:09:14.686 Test: blockdev copy ...passed
00:09:14.686 Suite: bdevio tests on: Malloc2p7
00:09:14.686 Test: blockdev write read block ...passed
00:09:14.686 Test: blockdev write zeroes read block ...passed
00:09:14.686 Test: blockdev write zeroes read no split ...passed
00:09:14.686 Test: blockdev write zeroes read split ...passed
00:09:14.686 Test: blockdev write zeroes read split partial ...passed
00:09:14.686 Test: blockdev reset ...passed
00:09:14.686 Test: blockdev write read 8 blocks ...passed
00:09:14.686 Test: blockdev write read size > 128k ...passed
00:09:14.686 Test: blockdev write read invalid size ...passed
00:09:14.686 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:09:14.686 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:09:14.686 Test: blockdev write read max offset ...passed
00:09:14.686 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:09:14.686 Test: blockdev writev readv 8 blocks ...passed
00:09:14.686 Test: blockdev writev readv 30 x 1block ...passed
00:09:14.686 Test: blockdev writev readv block ...passed
00:09:14.686 Test: blockdev writev readv size > 128k ...passed
00:09:14.686 Test: blockdev writev readv size > 128k in two iovs ...passed
00:09:14.686 Test: blockdev comparev and writev ...passed
00:09:14.686 Test: blockdev nvme passthru rw ...passed
00:09:14.686 Test: blockdev nvme passthru vendor specific ...passed
00:09:14.686 Test: blockdev nvme admin passthru ...passed
00:09:14.686 Test: blockdev copy ...passed
00:09:14.686 Suite: bdevio tests on: Malloc2p6
00:09:14.686 Test: blockdev write read block ...passed
00:09:14.686 Test: blockdev write zeroes read block ...passed
00:09:14.686 Test: blockdev write zeroes read no split ...passed
00:09:14.686 Test: blockdev write zeroes read split ...passed
00:09:14.686 Test: blockdev write zeroes read split partial ...passed
00:09:14.686 Test: blockdev reset ...passed
00:09:14.686 Test: blockdev write read 8 blocks ...passed
00:09:14.686 Test: blockdev write read size > 128k ...passed
00:09:14.686 Test: blockdev write read invalid size ...passed
00:09:14.686 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:09:14.686 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:09:14.686 Test: blockdev write read max offset ...passed
00:09:14.686 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:09:14.686 Test: blockdev writev readv 8 blocks ...passed
00:09:14.686 Test: blockdev writev readv 30 x 1block ...passed
00:09:14.686 Test: blockdev writev readv block ...passed
00:09:14.686 Test: blockdev writev readv size > 128k ...passed
00:09:14.686 Test: blockdev writev readv size > 128k in two iovs ...passed
00:09:14.686 Test: blockdev comparev and writev ...passed
00:09:14.686 Test: blockdev nvme passthru rw ...passed
00:09:14.686 Test: blockdev nvme passthru vendor specific ...passed
00:09:14.686 Test: blockdev nvme admin passthru ...passed
00:09:14.686 Test: blockdev copy ...passed
00:09:14.686 Suite: bdevio tests on: Malloc2p5
00:09:14.686 Test: blockdev write read block ...passed
00:09:14.686 Test: blockdev write zeroes read block ...passed
00:09:14.945 Test: blockdev write zeroes read no split ...passed
00:09:14.945 Test: blockdev write zeroes read split ...passed
00:09:14.945 Test: blockdev write zeroes read split partial ...passed
00:09:14.945 Test: blockdev reset ...passed
00:09:14.945 Test: blockdev write read 8 blocks ...passed
00:09:14.945 Test: blockdev write read size > 128k ...passed
00:09:14.945 Test: blockdev write read invalid size ...passed
00:09:14.945 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:09:14.945 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:09:14.945 Test: blockdev write read max offset ...passed
00:09:14.945 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:09:14.945 Test: blockdev writev readv 8 blocks ...passed
00:09:14.945 Test: blockdev writev readv 30 x 1block ...passed
00:09:14.945 Test: blockdev writev readv block ...passed
00:09:14.945 Test: blockdev writev readv size > 128k ...passed
00:09:14.945 Test: blockdev writev readv size > 128k in two iovs ...passed
00:09:14.945 Test: blockdev comparev and writev ...passed
00:09:14.945 Test: blockdev nvme passthru rw ...passed
00:09:14.945 Test: blockdev nvme passthru vendor specific ...passed
00:09:14.945 Test: blockdev nvme admin passthru ...passed
00:09:14.945 Test: blockdev copy ...passed
00:09:14.945 Suite: bdevio tests on: Malloc2p4
00:09:14.945 Test: blockdev write read block ...passed
00:09:14.945 Test: blockdev write zeroes read block ...passed
00:09:14.945 Test: blockdev write zeroes read no split ...passed
00:09:14.945 Test: blockdev write zeroes read split ...passed
00:09:14.945 Test: blockdev write zeroes read split partial ...passed
00:09:14.945 Test: blockdev reset ...passed
00:09:14.945 Test: blockdev write read 8 blocks ...passed
00:09:14.945 Test: blockdev write read size > 128k ...passed
00:09:14.945 Test: blockdev write read invalid size ...passed
00:09:14.945 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:09:14.945 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:09:14.945 Test: blockdev write read max offset ...passed
00:09:14.945 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:09:14.945 Test: blockdev writev readv 8 blocks ...passed
00:09:14.945 Test: blockdev writev readv 30 x 1block ...passed
00:09:14.945 Test: blockdev writev readv block ...passed
00:09:14.945 Test: blockdev writev readv size > 128k ...passed
00:09:14.945 Test: blockdev writev readv size > 128k in two iovs ...passed
00:09:14.945 Test: blockdev comparev and writev ...passed
00:09:14.945 Test: blockdev nvme passthru rw ...passed
00:09:14.945 Test: blockdev nvme passthru vendor specific ...passed
00:09:14.945 Test: blockdev nvme admin passthru ...passed
00:09:14.945 Test: blockdev copy ...passed
00:09:14.945 Suite: bdevio tests on: Malloc2p3
00:09:14.945 Test: blockdev write read block ...passed
00:09:14.945 Test: blockdev write zeroes read block ...passed
00:09:14.945 Test: blockdev write zeroes read no split ...passed
00:09:14.945 Test: blockdev write zeroes read split ...passed
00:09:14.945 Test: blockdev write zeroes read split partial ...passed
00:09:14.945 Test: blockdev reset ...passed
00:09:14.945 Test: blockdev write read 8 blocks ...passed
00:09:14.945 Test: blockdev write read size > 128k ...passed
00:09:14.945 Test: blockdev write read invalid size ...passed
00:09:14.945 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:09:14.945 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:09:14.945 Test: blockdev write read max offset ...passed
00:09:14.945 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:09:14.945 Test: blockdev writev readv 8 blocks ...passed
00:09:14.945 Test: blockdev writev readv 30 x 1block ...passed
00:09:14.945 Test: blockdev writev readv block ...passed
00:09:14.945 Test: blockdev writev readv size > 128k ...passed
00:09:14.945 Test: blockdev writev readv size > 128k in two iovs ...passed
00:09:14.945 Test: blockdev comparev and writev ...passed
00:09:14.945 Test: blockdev nvme passthru rw ...passed
00:09:14.945 Test: blockdev nvme passthru vendor specific ...passed
00:09:14.945 Test: blockdev nvme admin passthru ...passed
00:09:14.945 Test: blockdev copy ...passed
00:09:14.945 Suite: bdevio tests on: Malloc2p2
00:09:14.945 Test: blockdev write read block ...passed
00:09:14.945 Test: blockdev write zeroes read block ...passed
00:09:14.945 Test: blockdev write zeroes read no split ...passed
00:09:14.945 Test: blockdev write zeroes read split ...passed
00:09:14.945 Test: blockdev write zeroes read split partial ...passed
00:09:14.945 Test: blockdev reset ...passed
00:09:14.945 Test: blockdev write read 8 blocks ...passed
00:09:14.945 Test: blockdev write read size > 128k ...passed
00:09:14.945 Test: blockdev write read invalid size ...passed
00:09:14.945 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:09:14.945 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:09:14.945 Test: blockdev write read max offset ...passed
00:09:14.945 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:09:14.945 Test: blockdev writev readv 8 blocks ...passed
00:09:14.945 Test: blockdev writev readv 30 x 1block ...passed
00:09:14.945 Test: blockdev writev readv block ...passed
00:09:14.945 Test: blockdev writev readv size > 128k ...passed
00:09:14.945 Test: blockdev writev readv size > 128k in two iovs ...passed
00:09:14.945 Test: blockdev comparev and writev ...passed
00:09:14.945 Test: blockdev nvme passthru rw ...passed
00:09:14.945 Test: blockdev nvme passthru vendor specific ...passed
00:09:14.945 Test: blockdev nvme admin passthru ...passed
00:09:14.945 Test: blockdev copy ...passed
00:09:14.945 Suite: bdevio tests on: Malloc2p1
00:09:14.945 Test: blockdev write read block ...passed
00:09:14.945 Test: blockdev write zeroes read block ...passed
00:09:14.945 Test: blockdev write zeroes read no split ...passed
00:09:15.204 Test: blockdev write zeroes read split ...passed
00:09:15.204 Test: blockdev write zeroes read split partial ...passed
00:09:15.204 Test: blockdev reset ...passed
00:09:15.204 Test: blockdev write read 8 blocks ...passed
00:09:15.204 Test: blockdev write read size > 128k ...passed
00:09:15.204 Test: blockdev write read invalid size ...passed
00:09:15.204 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:09:15.204 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:09:15.204 Test: blockdev write read max offset ...passed
00:09:15.204 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:09:15.204 Test: blockdev writev readv 8 blocks ...passed
00:09:15.204 Test: blockdev writev readv 30 x 1block ...passed
00:09:15.204 Test: blockdev writev readv block ...passed
00:09:15.204 Test: blockdev writev readv size > 128k ...passed
00:09:15.204 Test: blockdev writev readv size > 128k in two iovs ...passed
00:09:15.204 Test: blockdev comparev and writev ...passed
00:09:15.204 Test: blockdev nvme passthru rw ...passed
00:09:15.204 Test: blockdev nvme passthru vendor specific ...passed
00:09:15.204 Test: blockdev nvme admin passthru ...passed
00:09:15.204 Test: blockdev copy ...passed
00:09:15.204 Suite: bdevio tests on: Malloc2p0
00:09:15.204 Test: blockdev write read block ...passed
00:09:15.204 Test: blockdev write zeroes read block ...passed
00:09:15.204 Test: blockdev write zeroes read no split ...passed
00:09:15.204 Test: blockdev write zeroes read split ...passed
00:09:15.204 Test: blockdev write zeroes read split partial ...passed
00:09:15.204 Test: blockdev reset ...passed
00:09:15.204 Test: blockdev write read 8 blocks ...passed
00:09:15.204 Test: blockdev write read size > 128k ...passed
00:09:15.204 Test: blockdev write read invalid size ...passed
00:09:15.204 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:09:15.204 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:09:15.204 Test: blockdev write read max offset ...passed
00:09:15.204 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:09:15.204 Test: blockdev writev readv 8 blocks ...passed
00:09:15.204 Test: blockdev writev readv 30 x 1block ...passed
00:09:15.204 Test: blockdev writev readv block ...passed
00:09:15.204 Test: blockdev writev readv size > 128k ...passed
00:09:15.204 Test: blockdev writev readv size > 128k in two iovs ...passed
00:09:15.204 Test: blockdev comparev and writev ...passed
00:09:15.204 Test: blockdev nvme passthru rw ...passed
00:09:15.204 Test: blockdev nvme passthru vendor specific ...passed
00:09:15.204 Test: blockdev nvme admin passthru ...passed
00:09:15.204 Test: blockdev copy ...passed
00:09:15.204 Suite: bdevio tests on: Malloc1p1
00:09:15.204 Test: blockdev write read block ...passed
00:09:15.204 Test: blockdev write zeroes read block ...passed
00:09:15.204 Test: blockdev write zeroes read no split ...passed
00:09:15.204 Test: blockdev write zeroes read split ...passed
00:09:15.204 Test: blockdev write zeroes read split partial ...passed
00:09:15.204 Test: blockdev reset ...passed
00:09:15.204 Test: blockdev write read 8 blocks ...passed
00:09:15.204 Test: blockdev write read size > 128k ...passed
00:09:15.204 Test: blockdev write read invalid size ...passed
00:09:15.204 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:09:15.204 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:09:15.204 Test: blockdev write read max offset ...passed
00:09:15.204 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:09:15.204 Test: blockdev writev readv 8 blocks ...passed
00:09:15.204 Test: blockdev writev readv 30 x 1block ...passed
00:09:15.204 Test: blockdev writev readv block ...passed
00:09:15.204 Test: blockdev writev readv size > 128k ...passed
00:09:15.204 Test: blockdev writev readv size > 128k in two iovs ...passed
00:09:15.204 Test: blockdev comparev and writev ...passed
00:09:15.204 Test: blockdev nvme passthru rw ...passed
00:09:15.204 Test: blockdev nvme passthru vendor specific ...passed
00:09:15.204 Test: blockdev nvme admin passthru ...passed
00:09:15.204 Test: blockdev copy ...passed
00:09:15.204 Suite: bdevio tests on: Malloc1p0
00:09:15.204 Test: blockdev write read block ...passed
00:09:15.204 Test: blockdev write zeroes read block ...passed
00:09:15.204 Test: blockdev write zeroes read no split ...passed
00:09:15.204 Test: blockdev write zeroes read split ...passed
00:09:15.204 Test: blockdev write zeroes read split partial ...passed
00:09:15.204 Test: blockdev reset ...passed
00:09:15.204 Test: blockdev write read 8 blocks ...passed
00:09:15.204 Test: blockdev write read size > 128k ...passed
00:09:15.204 Test: blockdev write read invalid size ...passed
00:09:15.204 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:09:15.204 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:09:15.204 Test: blockdev write read max offset ...passed
00:09:15.204 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:09:15.204 Test: blockdev writev readv 8 blocks ...passed
00:09:15.204 Test: blockdev writev readv 30 x 1block ...passed
00:09:15.204 Test: blockdev writev readv block ...passed
00:09:15.204 Test: blockdev writev readv size > 128k ...passed
00:09:15.204 Test: blockdev writev readv size > 128k in two iovs ...passed
00:09:15.204 Test: blockdev comparev and writev ...passed
00:09:15.204 Test: blockdev nvme passthru rw ...passed
00:09:15.204 Test: blockdev nvme passthru vendor specific ...passed
00:09:15.204 Test: blockdev nvme admin passthru ...passed
00:09:15.204 Test: blockdev copy ...passed
00:09:15.204 Suite: bdevio tests on: Malloc0
00:09:15.205 Test: blockdev write read block ...passed
00:09:15.205 Test: blockdev write zeroes read block ...passed
00:09:15.205 Test: blockdev write zeroes read no split ...passed
00:09:15.463 Test: blockdev write zeroes read split ...passed
00:09:15.463 Test: blockdev write zeroes read split partial ...passed
00:09:15.463 Test: blockdev reset ...passed
00:09:15.463 Test: blockdev write read 8 blocks ...passed
00:09:15.463 Test: blockdev write read size > 128k ...passed
00:09:15.463 Test: blockdev write read invalid size ...passed
00:09:15.463 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:09:15.463 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:09:15.463 Test: blockdev write read max offset ...passed
00:09:15.463 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:09:15.463 Test: blockdev writev readv 8 blocks ...passed
00:09:15.463 Test: blockdev writev readv 30 x 1block ...passed
00:09:15.463 Test: blockdev writev readv block ...passed
00:09:15.463 Test: blockdev writev readv size > 128k ...passed
00:09:15.463 Test: blockdev writev readv size > 128k in two iovs ...passed
00:09:15.463 Test: blockdev comparev and writev ...passed
00:09:15.463 Test: blockdev nvme passthru rw ...passed
00:09:15.463 Test: blockdev nvme passthru vendor specific ...passed
00:09:15.463 Test: blockdev nvme admin passthru ...passed
00:09:15.463 Test: blockdev copy ...passed
00:09:15.463
00:09:15.463 Run Summary:    Type  Total    Ran Passed Failed Inactive
00:09:15.463 suites     16     16    n/a      0        0
00:09:15.463 tests     368    368    368      0        0
00:09:15.463 asserts  2224   2224   2224      0      n/a
00:09:15.463
00:09:15.463 Elapsed time = 3.017 seconds
00:09:15.463 0
00:09:15.463 08:22:27 blockdev_general.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 1369648
00:09:15.463 08:22:27 blockdev_general.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 1369648 ']'
00:09:15.463 08:22:27 blockdev_general.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 1369648
00:09:15.463 08:22:27 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # uname
00:09:15.463 08:22:27 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:09:15.463 08:22:27 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1369648
00:09:15.463 08:22:27 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:09:15.463 08:22:27 blockdev_general.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:09:15.463 08:22:27 blockdev_general.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1369648'
00:09:15.463 killing process with pid 1369648
00:09:15.463 08:22:27 blockdev_general.bdev_bounds -- common/autotest_common.sh@967 -- # kill 1369648
00:09:15.463 08:22:27 blockdev_general.bdev_bounds -- common/autotest_common.sh@972 -- # wait 1369648
00:09:17.365 08:22:29 blockdev_general.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT
00:09:17.365
00:09:17.365 real 0m4.706s
00:09:17.365 user 0m12.134s
00:09:17.365 sys 0m0.503s
00:09:17.365 08:22:29 blockdev_general.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable
00:09:17.365 08:22:29 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x
00:09:17.365 ************************************
00:09:17.365 END TEST bdev_bounds
00:09:17.365 ************************************
00:09:17.623 08:22:29 blockdev_general -- common/autotest_common.sh@1142 -- # return 0
00:09:17.623 08:22:29 blockdev_general -- bdev/blockdev.sh@761 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' ''
00:09:17.623 08:22:29 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']'
00:09:17.623 08:22:29 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:09:17.623 08:22:29 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:09:17.623 ************************************
00:09:17.623 START TEST bdev_nbd
00:09:17.623 ************************************
00:09:17.623 08:22:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' ''
00:09:17.623 08:22:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@299 -- # uname -s
00:09:17.623 08:22:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux ]]
00:09:17.623 08:22:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:09:17.623 08:22:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@302 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json
00:09:17.623 08:22:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0')
00:09:17.623 08:22:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all
00:09:17.623 08:22:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=16
00:09:17.623
08:22:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]]
00:09:17.623 08:22:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9')
00:09:17.623 08:22:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all
00:09:17.623 08:22:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=16
00:09:17.623 08:22:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9')
00:09:17.624 08:22:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list
00:09:17.624 08:22:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0')
00:09:17.624 08:22:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list
00:09:17.624 08:22:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=1370458
00:09:17.624 08:22:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT
00:09:17.624 08:22:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json ''
00:09:17.624 08:22:29 blockdev_general.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 1370458 /var/tmp/spdk-nbd.sock
00:09:17.624 08:22:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 1370458 ']'
00:09:17.624 08:22:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock
00:09:17.624 08:22:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100
00:09:17.624 08:22:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...
00:09:17.624 08:22:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable
00:09:17.624 08:22:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x
00:09:17.624 [2024-07-23 08:22:30.009797] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization...
00:09:17.624 [2024-07-23 08:22:30.009884] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ]
00:09:17.624 [2024-07-23 08:22:30.137658] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:09:17.882 [2024-07-23 08:22:30.372804] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:09:18.448 [2024-07-23 08:22:30.839220] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:09:18.448 [2024-07-23 08:22:30.839279] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:09:18.448 [2024-07-23 08:22:30.839309] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:09:18.448 [2024-07-23 08:22:30.847228] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:09:18.448 [2024-07-23 08:22:30.847263] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:09:18.448 [2024-07-23 08:22:30.855246] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:09:18.448 [2024-07-23 08:22:30.855276] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:09:18.707 [2024-07-23 08:22:31.048223] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:09:18.707 [2024-07-23 08:22:31.048273] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:09:18.707 [2024-07-23 08:22:31.048287] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000037b80
00:09:18.707 [2024-07-23 08:22:31.048296] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed
00:09:18.707 [2024-07-23 08:22:31.050174] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:09:18.707 [2024-07-23 08:22:31.050203] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT
00:09:18.965 08:22:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:09:18.965 08:22:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@862 -- # return 0
00:09:18.965 08:22:31 blockdev_general.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0'
00:09:18.965 08:22:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:09:18.965 08:22:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0')
00:09:18.965 08:22:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list
00:09:18.965 08:22:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0'
00:09:18.965 08:22:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:09:18.965 08:22:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0')
00:09:18.965 08:22:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list
00:09:18.965 08:22:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i
00:09:18.965 08:22:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device
00:09:18.965 08:22:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 ))
00:09:18.965 08:22:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 ))
00:09:18.965 08:22:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0
00:09:19.225 08:22:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0
00:09:19.225 08:22:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0
00:09:19.225 08:22:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0
00:09:19.225 08:22:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0
00:09:19.225 08:22:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:09:19.225 08:22:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:09:19.225 08:22:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:09:19.225 08:22:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions
00:09:19.225 08:22:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:09:19.225 08:22:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:09:19.225 08:22:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:09:19.225 08:22:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:09:19.225 1+0 records in
00:09:19.225 1+0 records out
00:09:19.225 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000202017 s, 20.3 MB/s
00:09:19.225 08:22:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:09:19.225 08:22:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:09:19.225 08:22:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:09:19.225 08:22:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:09:19.225 08:22:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:09:19.225 08:22:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ ))
00:09:19.225 08:22:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 ))
00:09:19.225 08:22:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0
00:09:19.485 08:22:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1
00:09:19.485 08:22:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1
00:09:19.485 08:22:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1
00:09:19.485 08:22:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1
00:09:19.485 08:22:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:09:19.485 08:22:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:09:19.485 08:22:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:09:19.485 08:22:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions
00:09:19.485 08:22:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:09:19.485 08:22:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:09:19.485 08:22:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:09:19.485 08:22:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:09:19.485 1+0 records in
00:09:19.485 1+0 records out
00:09:19.485 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000231292 s, 17.7 MB/s
00:09:19.485 08:22:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:09:19.485 08:22:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:09:19.485 08:22:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:09:19.485 08:22:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:09:19.485 08:22:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:09:19.485 08:22:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ ))
00:09:19.485 08:22:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 ))
00:09:19.485 08:22:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1
00:09:19.743 08:22:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2
00:09:19.743 08:22:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2
00:09:19.743 08:22:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2
00:09:19.743 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2
00:09:19.743 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:09:19.743 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:09:19.743 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:09:19.743 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions
00:09:19.743 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:09:19.743 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:09:19.743 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:09:19.743 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:09:19.743 1+0 records in
00:09:19.743 1+0 records out
00:09:19.743 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000257813 s, 15.9 MB/s
00:09:19.743 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:09:19.743 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:09:19.743 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:09:19.743 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:09:19.743 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:09:19.743 08:22:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ ))
00:09:19.743 08:22:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 ))
00:09:19.743 08:22:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0
00:09:20.003 08:22:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3
00:09:20.003 08:22:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3
00:09:20.003 08:22:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3
00:09:20.003 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3
00:09:20.003 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:09:20.003 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:09:20.003 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:09:20.003 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions
00:09:20.003 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:09:20.003 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:09:20.003 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:09:20.003 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:09:20.003 1+0 records in
00:09:20.003 1+0 records out
00:09:20.003 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000235515 s, 17.4 MB/s
00:09:20.003 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:09:20.003 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:09:20.003 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:09:20.003 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:09:20.003 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:09:20.003 08:22:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ ))
00:09:20.003 08:22:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 ))
00:09:20.003 08:22:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1
00:09:20.262 08:22:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd4
00:09:20.262 08:22:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd4
00:09:20.262 08:22:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd4
00:09:20.262 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4
00:09:20.262 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:09:20.262 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:09:20.262 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:09:20.262 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions
00:09:20.262 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:09:20.262 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:09:20.262 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:09:20.262 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:09:20.262 1+0 records in
00:09:20.262 1+0 records out
00:09:20.262 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000252581 s, 16.2 MB/s
00:09:20.262 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:09:20.262 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:09:20.262 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:09:20.262 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:09:20.262 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:09:20.262 08:22:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ ))
00:09:20.262 08:22:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 ))
00:09:20.262 08:22:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2
00:09:20.262 08:22:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd5
00:09:20.262 08:22:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd5
00:09:20.262 08:22:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd5
00:09:20.262 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5
00:09:20.262 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:09:20.262 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:20.262 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:09:20.262 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:20.262 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:20.262 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:20.262 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:20.262 1+0 records in 00:09:20.262 1+0 records out 00:09:20.262 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000281998 s, 14.5 MB/s 00:09:20.262 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:20.521 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:20.521 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:20.521 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:20.521 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:20.521 08:22:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:20.521 08:22:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:20.521 08:22:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 00:09:20.521 08:22:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd6 00:09:20.521 08:22:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # 
basename /dev/nbd6 00:09:20.521 08:22:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd6 00:09:20.521 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:09:20.521 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:20.521 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:20.521 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:20.521 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:09:20.521 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:20.521 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:20.521 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:20.521 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:20.521 1+0 records in 00:09:20.521 1+0 records out 00:09:20.521 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000288394 s, 14.2 MB/s 00:09:20.521 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:20.521 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:20.521 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:20.521 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:20.521 08:22:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:20.521 08:22:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:20.521 08:22:32 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:20.521 08:22:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 00:09:20.780 08:22:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd7 00:09:20.780 08:22:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd7 00:09:20.780 08:22:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd7 00:09:20.780 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:09:20.780 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:20.780 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:20.780 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:20.780 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:09:20.780 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:20.780 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:20.780 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:20.780 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:20.780 1+0 records in 00:09:20.780 1+0 records out 00:09:20.780 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000264573 s, 15.5 MB/s 00:09:20.780 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:20.780 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:20.780 08:22:33 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:20.780 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:20.780 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:20.780 08:22:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:20.780 08:22:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:20.780 08:22:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 00:09:21.038 08:22:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd8 00:09:21.038 08:22:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd8 00:09:21.038 08:22:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd8 00:09:21.038 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:09:21.038 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:21.038 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:21.038 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:21.038 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:09:21.038 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:21.038 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:21.038 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:21.038 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:21.038 1+0 
records in 00:09:21.038 1+0 records out 00:09:21.038 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000313149 s, 13.1 MB/s 00:09:21.038 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:21.038 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:21.038 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:21.038 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:21.038 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:21.038 08:22:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:21.038 08:22:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:21.038 08:22:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 00:09:21.296 08:22:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd9 00:09:21.296 08:22:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd9 00:09:21.296 08:22:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd9 00:09:21.296 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:09:21.296 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:21.296 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:21.296 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:21.296 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:09:21.296 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # 
break 00:09:21.296 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:21.296 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:21.296 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:21.296 1+0 records in 00:09:21.296 1+0 records out 00:09:21.296 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000278819 s, 14.7 MB/s 00:09:21.296 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:21.296 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:21.296 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:21.296 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:21.296 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:21.296 08:22:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:21.296 08:22:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:21.296 08:22:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 00:09:21.556 08:22:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd10 00:09:21.556 08:22:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd10 00:09:21.556 08:22:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd10 00:09:21.556 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:09:21.556 08:22:33 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@867 -- # local i 00:09:21.556 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:21.556 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:21.556 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:09:21.556 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:21.556 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:21.556 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:21.556 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:21.556 1+0 records in 00:09:21.556 1+0 records out 00:09:21.556 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000298956 s, 13.7 MB/s 00:09:21.556 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:21.556 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:21.556 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:21.556 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:21.556 08:22:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:21.556 08:22:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:21.556 08:22:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:21.556 08:22:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT 00:09:21.816 08:22:34 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd11 00:09:21.816 08:22:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd11 00:09:21.816 08:22:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd11 00:09:21.816 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:09:21.816 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:21.816 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:21.816 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:21.816 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:09:21.816 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:21.816 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:21.816 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:21.816 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:21.816 1+0 records in 00:09:21.816 1+0 records out 00:09:21.816 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000353119 s, 11.6 MB/s 00:09:21.816 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:21.816 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:21.816 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:21.816 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:21.816 08:22:34 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@887 -- # return 0 00:09:21.816 08:22:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:21.816 08:22:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:21.816 08:22:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 00:09:22.075 08:22:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd12 00:09:22.075 08:22:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd12 00:09:22.075 08:22:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd12 00:09:22.075 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:09:22.075 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:22.075 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:22.075 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:22.075 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:09:22.075 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:22.075 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:22.075 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:22.075 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:22.075 1+0 records in 00:09:22.075 1+0 records out 00:09:22.075 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000377248 s, 10.9 MB/s 00:09:22.075 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:22.075 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:22.075 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:22.075 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:22.075 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:22.075 08:22:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:22.075 08:22:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:22.075 08:22:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 00:09:22.075 08:22:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd13 00:09:22.075 08:22:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd13 00:09:22.075 08:22:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd13 00:09:22.075 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:09:22.075 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:22.075 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:22.075 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:22.075 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:09:22.075 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:22.075 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:22.075 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 
00:09:22.075 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:22.075 1+0 records in 00:09:22.075 1+0 records out 00:09:22.075 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000315117 s, 13.0 MB/s 00:09:22.075 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:22.334 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:22.334 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:22.334 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:22.334 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:22.334 08:22:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:22.334 08:22:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:22.334 08:22:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 00:09:22.334 08:22:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd14 00:09:22.334 08:22:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd14 00:09:22.334 08:22:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd14 00:09:22.334 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:09:22.334 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:22.334 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:22.334 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( 
i <= 20 )) 00:09:22.334 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd14 /proc/partitions 00:09:22.334 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:22.334 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:22.334 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:22.334 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:22.334 1+0 records in 00:09:22.334 1+0 records out 00:09:22.334 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000400338 s, 10.2 MB/s 00:09:22.334 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:22.334 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:22.334 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:22.334 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:22.334 08:22:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:22.334 08:22:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:22.334 08:22:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:22.334 08:22:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 00:09:22.593 08:22:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd15 00:09:22.593 08:22:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd15 00:09:22.593 08:22:34 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@30 -- # waitfornbd nbd15 00:09:22.593 08:22:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:09:22.593 08:22:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:22.593 08:22:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:22.593 08:22:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:22.593 08:22:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:09:22.593 08:22:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:22.593 08:22:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:22.593 08:22:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:22.593 08:22:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:22.593 1+0 records in 00:09:22.593 1+0 records out 00:09:22.593 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000381708 s, 10.7 MB/s 00:09:22.593 08:22:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:22.593 08:22:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:22.593 08:22:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:22.593 08:22:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:22.593 08:22:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:22.593 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:22.593 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:22.593 
08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:09:22.851 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:09:22.851 { 00:09:22.851 "nbd_device": "/dev/nbd0", 00:09:22.851 "bdev_name": "Malloc0" 00:09:22.851 }, 00:09:22.851 { 00:09:22.851 "nbd_device": "/dev/nbd1", 00:09:22.851 "bdev_name": "Malloc1p0" 00:09:22.851 }, 00:09:22.851 { 00:09:22.852 "nbd_device": "/dev/nbd2", 00:09:22.852 "bdev_name": "Malloc1p1" 00:09:22.852 }, 00:09:22.852 { 00:09:22.852 "nbd_device": "/dev/nbd3", 00:09:22.852 "bdev_name": "Malloc2p0" 00:09:22.852 }, 00:09:22.852 { 00:09:22.852 "nbd_device": "/dev/nbd4", 00:09:22.852 "bdev_name": "Malloc2p1" 00:09:22.852 }, 00:09:22.852 { 00:09:22.852 "nbd_device": "/dev/nbd5", 00:09:22.852 "bdev_name": "Malloc2p2" 00:09:22.852 }, 00:09:22.852 { 00:09:22.852 "nbd_device": "/dev/nbd6", 00:09:22.852 "bdev_name": "Malloc2p3" 00:09:22.852 }, 00:09:22.852 { 00:09:22.852 "nbd_device": "/dev/nbd7", 00:09:22.852 "bdev_name": "Malloc2p4" 00:09:22.852 }, 00:09:22.852 { 00:09:22.852 "nbd_device": "/dev/nbd8", 00:09:22.852 "bdev_name": "Malloc2p5" 00:09:22.852 }, 00:09:22.852 { 00:09:22.852 "nbd_device": "/dev/nbd9", 00:09:22.852 "bdev_name": "Malloc2p6" 00:09:22.852 }, 00:09:22.852 { 00:09:22.852 "nbd_device": "/dev/nbd10", 00:09:22.852 "bdev_name": "Malloc2p7" 00:09:22.852 }, 00:09:22.852 { 00:09:22.852 "nbd_device": "/dev/nbd11", 00:09:22.852 "bdev_name": "TestPT" 00:09:22.852 }, 00:09:22.852 { 00:09:22.852 "nbd_device": "/dev/nbd12", 00:09:22.852 "bdev_name": "raid0" 00:09:22.852 }, 00:09:22.852 { 00:09:22.852 "nbd_device": "/dev/nbd13", 00:09:22.852 "bdev_name": "concat0" 00:09:22.852 }, 00:09:22.852 { 00:09:22.852 "nbd_device": "/dev/nbd14", 00:09:22.852 "bdev_name": "raid1" 00:09:22.852 }, 00:09:22.852 { 00:09:22.852 "nbd_device": "/dev/nbd15", 00:09:22.852 "bdev_name": "AIO0" 00:09:22.852 } 
00:09:22.852 ]' 00:09:22.852 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:09:22.852 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:09:22.852 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:09:22.852 { 00:09:22.852 "nbd_device": "/dev/nbd0", 00:09:22.852 "bdev_name": "Malloc0" 00:09:22.852 }, 00:09:22.852 { 00:09:22.852 "nbd_device": "/dev/nbd1", 00:09:22.852 "bdev_name": "Malloc1p0" 00:09:22.852 }, 00:09:22.852 { 00:09:22.852 "nbd_device": "/dev/nbd2", 00:09:22.852 "bdev_name": "Malloc1p1" 00:09:22.852 }, 00:09:22.852 { 00:09:22.852 "nbd_device": "/dev/nbd3", 00:09:22.852 "bdev_name": "Malloc2p0" 00:09:22.852 }, 00:09:22.852 { 00:09:22.852 "nbd_device": "/dev/nbd4", 00:09:22.852 "bdev_name": "Malloc2p1" 00:09:22.852 }, 00:09:22.852 { 00:09:22.852 "nbd_device": "/dev/nbd5", 00:09:22.852 "bdev_name": "Malloc2p2" 00:09:22.852 }, 00:09:22.852 { 00:09:22.852 "nbd_device": "/dev/nbd6", 00:09:22.852 "bdev_name": "Malloc2p3" 00:09:22.852 }, 00:09:22.852 { 00:09:22.852 "nbd_device": "/dev/nbd7", 00:09:22.852 "bdev_name": "Malloc2p4" 00:09:22.852 }, 00:09:22.852 { 00:09:22.852 "nbd_device": "/dev/nbd8", 00:09:22.852 "bdev_name": "Malloc2p5" 00:09:22.852 }, 00:09:22.852 { 00:09:22.852 "nbd_device": "/dev/nbd9", 00:09:22.852 "bdev_name": "Malloc2p6" 00:09:22.852 }, 00:09:22.852 { 00:09:22.852 "nbd_device": "/dev/nbd10", 00:09:22.852 "bdev_name": "Malloc2p7" 00:09:22.852 }, 00:09:22.852 { 00:09:22.852 "nbd_device": "/dev/nbd11", 00:09:22.852 "bdev_name": "TestPT" 00:09:22.852 }, 00:09:22.852 { 00:09:22.852 "nbd_device": "/dev/nbd12", 00:09:22.852 "bdev_name": "raid0" 00:09:22.852 }, 00:09:22.852 { 00:09:22.852 "nbd_device": "/dev/nbd13", 00:09:22.852 "bdev_name": "concat0" 00:09:22.852 }, 00:09:22.852 { 00:09:22.852 "nbd_device": "/dev/nbd14", 00:09:22.852 "bdev_name": "raid1" 00:09:22.852 }, 00:09:22.852 { 
00:09:22.852 "nbd_device": "/dev/nbd15", 00:09:22.852 "bdev_name": "AIO0" 00:09:22.852 } 00:09:22.852 ]' 00:09:22.852 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15' 00:09:22.852 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:22.852 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15') 00:09:22.852 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:09:22.852 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:09:22.852 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:22.852 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:09:23.111 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:09:23.111 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:23.111 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:23.111 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:23.111 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:23.111 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:23.111 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:23.111 08:22:35 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@45 -- # return 0 00:09:23.111 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:23.111 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:09:23.111 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:09:23.111 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:09:23.111 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:09:23.111 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:23.111 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:23.111 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:09:23.111 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:23.111 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:23.111 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:23.111 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:09:23.370 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:09:23.370 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:09:23.370 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:09:23.370 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:23.370 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:23.370 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:09:23.370 08:22:35 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:23.370 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:23.370 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:23.370 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:09:23.628 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:09:23.629 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:09:23.629 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:09:23.629 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:23.629 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:23.629 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:09:23.629 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:23.629 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:23.629 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:23.629 08:22:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:09:23.887 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:09:23.887 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:09:23.887 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:09:23.887 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:23.887 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:23.887 08:22:36 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:09:23.887 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:23.887 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:23.887 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:23.887 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:09:23.887 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:09:23.887 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:09:23.887 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:09:23.887 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:23.887 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:23.887 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:09:23.887 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:23.887 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:23.887 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:23.887 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:09:24.146 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:09:24.146 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:09:24.146 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:09:24.146 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 
00:09:24.146 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:24.146 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:09:24.146 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:24.146 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:24.146 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:24.146 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:09:24.405 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:09:24.405 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:09:24.405 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:09:24.405 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:24.405 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:24.405 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:09:24.405 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:24.405 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:24.405 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:24.405 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:09:24.405 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:09:24.405 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:09:24.405 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # 
local nbd_name=nbd8 00:09:24.405 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:24.405 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:24.405 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:09:24.405 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:24.405 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:24.405 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:24.405 08:22:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:09:24.664 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:09:24.664 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:09:24.664 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:09:24.664 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:24.664 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:24.664 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:09:24.664 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:24.664 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:24.664 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:24.664 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:09:24.924 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:09:24.924 08:22:37 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:09:24.924 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:09:24.924 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:24.924 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:24.924 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:09:24.924 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:24.924 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:24.924 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:24.924 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:09:25.184 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:09:25.184 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:09:25.184 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:09:25.184 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:25.184 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:25.184 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:09:25.184 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:25.184 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:25.184 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:25.184 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:09:25.184 08:22:37 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:09:25.184 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:09:25.184 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:09:25.184 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:25.184 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:25.184 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:09:25.184 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:25.184 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:25.184 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:25.184 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:09:25.444 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:09:25.444 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:09:25.444 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:09:25.444 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:25.444 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:25.444 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:09:25.444 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:25.444 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:25.444 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:25.444 08:22:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:09:25.704 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:09:25.704 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:09:25.704 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:09:25.704 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:25.704 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:25.704 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:09:25.704 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:25.704 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:25.704 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:25.704 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:09:25.704 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:09:25.704 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:09:25.704 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:09:25.704 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:25.704 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:25.704 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:09:25.704 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:25.704 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:25.704 08:22:38 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:09:25.704 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:25.704 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:09:25.963 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:09:25.963 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:09:25.963 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:25.963 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:09:25.963 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:09:25.963 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:25.963 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:09:25.963 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:09:25.963 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:09:25.963 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:09:25.963 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:09:25.963 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:09:25.963 08:22:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:09:25.963 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@90 -- # local 
rpc_server=/var/tmp/spdk-nbd.sock 00:09:25.963 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:25.963 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:09:25.963 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:25.963 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:09:25.963 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:09:25.963 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:25.963 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:25.963 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:09:25.963 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:25.963 08:22:38 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:09:25.963 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:09:25.963 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:09:25.963 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:25.963 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:09:26.225 /dev/nbd0 00:09:26.225 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:09:26.225 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:09:26.225 08:22:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:09:26.225 08:22:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:26.225 08:22:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:26.225 08:22:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:26.225 08:22:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:09:26.226 08:22:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:26.226 08:22:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:26.226 08:22:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:26.226 08:22:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:26.226 1+0 records in 00:09:26.226 1+0 records out 00:09:26.226 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000221223 s, 18.5 MB/s 00:09:26.226 08:22:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:26.226 08:22:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:26.226 08:22:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:26.226 08:22:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:26.226 08:22:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:26.226 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:26.226 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:26.226 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 /dev/nbd1 00:09:26.484 /dev/nbd1 00:09:26.484 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:09:26.484 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:09:26.484 08:22:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:09:26.484 08:22:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:26.484 08:22:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:26.484 08:22:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:26.484 08:22:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:09:26.484 08:22:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:26.484 08:22:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:26.484 08:22:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:26.484 08:22:38 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:26.484 1+0 records in 00:09:26.484 1+0 records out 00:09:26.484 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000266669 s, 15.4 MB/s 00:09:26.485 08:22:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:26.485 08:22:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:26.485 08:22:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:26.485 08:22:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:26.485 08:22:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:26.485 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:26.485 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:26.485 08:22:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 /dev/nbd10 00:09:26.744 /dev/nbd10 00:09:26.744 08:22:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:09:26.744 08:22:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:09:26.744 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:09:26.744 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:26.744 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:26.744 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:26.744 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 
/proc/partitions 00:09:26.744 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:26.744 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:26.744 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:26.744 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:26.744 1+0 records in 00:09:26.744 1+0 records out 00:09:26.744 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000241002 s, 17.0 MB/s 00:09:26.744 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:26.744 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:26.744 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:26.744 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:26.744 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:26.744 08:22:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:26.744 08:22:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:26.744 08:22:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 /dev/nbd11 00:09:27.003 /dev/nbd11 00:09:27.003 08:22:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:09:27.003 08:22:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:09:27.003 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:09:27.003 08:22:39 
blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:27.003 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:27.003 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:27.003 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:09:27.003 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:27.003 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:27.003 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:27.003 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:27.003 1+0 records in 00:09:27.003 1+0 records out 00:09:27.003 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000277462 s, 14.8 MB/s 00:09:27.003 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:27.003 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:27.003 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:27.003 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:27.003 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:27.003 08:22:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:27.003 08:22:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:27.003 08:22:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk 
Malloc2p1 /dev/nbd12 00:09:27.003 /dev/nbd12 00:09:27.262 08:22:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd12 00:09:27.262 08:22:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd12 00:09:27.262 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:09:27.262 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:27.262 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:27.262 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:27.262 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:09:27.262 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:27.262 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:27.262 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:27.262 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:27.262 1+0 records in 00:09:27.262 1+0 records out 00:09:27.262 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000278303 s, 14.7 MB/s 00:09:27.262 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:27.262 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:27.262 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:27.262 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:27.262 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 
00:09:27.262 08:22:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:27.262 08:22:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:27.262 08:22:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 /dev/nbd13 00:09:27.262 /dev/nbd13 00:09:27.262 08:22:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd13 00:09:27.262 08:22:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd13 00:09:27.262 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:09:27.262 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:27.262 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:27.262 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:27.262 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:09:27.262 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:27.262 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:27.262 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:27.262 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:27.262 1+0 records in 00:09:27.262 1+0 records out 00:09:27.262 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000278206 s, 14.7 MB/s 00:09:27.522 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:27.522 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 
00:09:27.522 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:27.522 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:27.522 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:27.522 08:22:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:27.522 08:22:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:27.522 08:22:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 /dev/nbd14 00:09:27.522 /dev/nbd14 00:09:27.522 08:22:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd14 00:09:27.522 08:22:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd14 00:09:27.522 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:09:27.522 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:27.522 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:27.522 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:27.522 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd14 /proc/partitions 00:09:27.522 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:27.522 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:27.522 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:27.522 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:27.522 1+0 records in 
00:09:27.522 1+0 records out 00:09:27.522 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000306205 s, 13.4 MB/s 00:09:27.522 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:27.522 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:27.522 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:27.522 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:27.522 08:22:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:27.522 08:22:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:27.522 08:22:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:27.522 08:22:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 /dev/nbd15 00:09:27.781 /dev/nbd15 00:09:27.781 08:22:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd15 00:09:27.781 08:22:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd15 00:09:27.781 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:09:27.781 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:27.781 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:27.781 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:27.782 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:09:27.782 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:27.782 08:22:40 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:27.782 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:27.782 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:27.782 1+0 records in 00:09:27.782 1+0 records out 00:09:27.782 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000262653 s, 15.6 MB/s 00:09:27.782 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:27.782 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:27.782 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:27.782 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:27.782 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:27.782 08:22:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:27.782 08:22:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:27.782 08:22:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 /dev/nbd2 00:09:28.041 /dev/nbd2 00:09:28.041 08:22:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd2 00:09:28.041 08:22:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd2 00:09:28.041 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:09:28.041 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:28.041 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 
00:09:28.041 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:28.041 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:09:28.041 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:28.041 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:28.041 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:28.041 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:28.041 1+0 records in 00:09:28.041 1+0 records out 00:09:28.041 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000342767 s, 11.9 MB/s 00:09:28.041 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:28.041 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:28.041 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:28.041 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:28.041 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:28.041 08:22:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:28.042 08:22:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:28.042 08:22:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 /dev/nbd3 00:09:28.301 /dev/nbd3 00:09:28.301 08:22:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd3 00:09:28.301 08:22:40 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd3 00:09:28.301 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:09:28.301 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:28.301 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:28.301 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:28.301 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:09:28.301 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:28.301 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:28.301 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:28.301 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:28.301 1+0 records in 00:09:28.301 1+0 records out 00:09:28.301 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00031197 s, 13.1 MB/s 00:09:28.301 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:28.301 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:28.301 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:28.301 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:28.301 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:28.301 08:22:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:28.301 08:22:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i 
< 16 )) 00:09:28.301 08:22:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 /dev/nbd4 00:09:28.559 /dev/nbd4 00:09:28.559 08:22:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd4 00:09:28.559 08:22:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd4 00:09:28.559 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:09:28.559 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:28.559 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:28.559 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:28.559 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:09:28.559 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:28.559 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:28.559 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:28.559 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:28.559 1+0 records in 00:09:28.559 1+0 records out 00:09:28.559 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000513733 s, 8.0 MB/s 00:09:28.559 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:28.559 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:28.559 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:28.559 
08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:28.559 08:22:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:28.559 08:22:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:28.559 08:22:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:28.559 08:22:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT /dev/nbd5 00:09:28.559 /dev/nbd5 00:09:28.559 08:22:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd5 00:09:28.559 08:22:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd5 00:09:28.559 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:09:28.559 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:28.559 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:28.559 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:28.559 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:09:28.559 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:28.559 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:28.559 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:28.559 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:28.818 1+0 records in 00:09:28.818 1+0 records out 00:09:28.818 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000377925 s, 10.8 MB/s 00:09:28.818 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 
-- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:28.818 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:28.818 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:28.818 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:28.818 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:28.818 08:22:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:28.818 08:22:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:28.818 08:22:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 /dev/nbd6 00:09:28.818 /dev/nbd6 00:09:28.818 08:22:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd6 00:09:28.818 08:22:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd6 00:09:28.818 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:09:28.818 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:28.818 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:28.818 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:28.818 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:09:28.818 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:28.818 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:28.818 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:28.818 08:22:41 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@883 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:28.818 1+0 records in 00:09:28.818 1+0 records out 00:09:28.818 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000300864 s, 13.6 MB/s 00:09:28.818 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:28.818 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:28.818 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:28.818 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:28.818 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:28.818 08:22:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:28.818 08:22:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:28.818 08:22:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 /dev/nbd7 00:09:29.077 /dev/nbd7 00:09:29.077 08:22:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd7 00:09:29.077 08:22:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd7 00:09:29.077 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:09:29.077 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:29.077 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:29.077 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:29.077 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 
/proc/partitions 00:09:29.077 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:29.077 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:29.077 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:29.077 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:29.077 1+0 records in 00:09:29.077 1+0 records out 00:09:29.077 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000337689 s, 12.1 MB/s 00:09:29.077 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:29.077 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:29.077 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:29.077 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:29.077 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:29.077 08:22:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:29.077 08:22:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:29.077 08:22:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 /dev/nbd8 00:09:29.335 /dev/nbd8 00:09:29.335 08:22:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd8 00:09:29.335 08:22:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd8 00:09:29.335 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:09:29.335 08:22:41 
blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:29.335 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:29.335 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:29.335 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:09:29.335 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:29.335 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:29.335 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:29.335 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:29.336 1+0 records in 00:09:29.336 1+0 records out 00:09:29.336 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000295176 s, 13.9 MB/s 00:09:29.336 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:29.336 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:29.336 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:29.336 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:29.336 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:29.336 08:22:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:29.336 08:22:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:29.336 08:22:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 
/dev/nbd9 00:09:29.594 /dev/nbd9 00:09:29.594 08:22:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd9 00:09:29.594 08:22:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd9 00:09:29.594 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:09:29.594 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:29.594 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:29.594 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:29.594 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:09:29.594 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:29.594 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:29.594 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:29.594 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:29.594 1+0 records in 00:09:29.594 1+0 records out 00:09:29.594 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000367565 s, 11.1 MB/s 00:09:29.594 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:29.594 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:29.594 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:29.594 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:29.594 08:22:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:29.594 
08:22:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:29.594 08:22:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:29.594 08:22:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:09:29.594 08:22:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:29.594 08:22:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:09:29.853 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:09:29.853 { 00:09:29.853 "nbd_device": "/dev/nbd0", 00:09:29.853 "bdev_name": "Malloc0" 00:09:29.853 }, 00:09:29.853 { 00:09:29.853 "nbd_device": "/dev/nbd1", 00:09:29.853 "bdev_name": "Malloc1p0" 00:09:29.853 }, 00:09:29.853 { 00:09:29.853 "nbd_device": "/dev/nbd10", 00:09:29.853 "bdev_name": "Malloc1p1" 00:09:29.853 }, 00:09:29.853 { 00:09:29.853 "nbd_device": "/dev/nbd11", 00:09:29.853 "bdev_name": "Malloc2p0" 00:09:29.853 }, 00:09:29.853 { 00:09:29.853 "nbd_device": "/dev/nbd12", 00:09:29.853 "bdev_name": "Malloc2p1" 00:09:29.853 }, 00:09:29.853 { 00:09:29.853 "nbd_device": "/dev/nbd13", 00:09:29.853 "bdev_name": "Malloc2p2" 00:09:29.853 }, 00:09:29.853 { 00:09:29.853 "nbd_device": "/dev/nbd14", 00:09:29.853 "bdev_name": "Malloc2p3" 00:09:29.853 }, 00:09:29.853 { 00:09:29.853 "nbd_device": "/dev/nbd15", 00:09:29.853 "bdev_name": "Malloc2p4" 00:09:29.853 }, 00:09:29.853 { 00:09:29.853 "nbd_device": "/dev/nbd2", 00:09:29.853 "bdev_name": "Malloc2p5" 00:09:29.853 }, 00:09:29.853 { 00:09:29.853 "nbd_device": "/dev/nbd3", 00:09:29.853 "bdev_name": "Malloc2p6" 00:09:29.853 }, 00:09:29.853 { 00:09:29.853 "nbd_device": "/dev/nbd4", 00:09:29.853 "bdev_name": "Malloc2p7" 00:09:29.853 }, 00:09:29.853 { 00:09:29.853 "nbd_device": "/dev/nbd5", 00:09:29.853 "bdev_name": "TestPT" 00:09:29.853 }, 00:09:29.853 { 
00:09:29.853 "nbd_device": "/dev/nbd6", 00:09:29.853 "bdev_name": "raid0" 00:09:29.853 }, 00:09:29.853 { 00:09:29.853 "nbd_device": "/dev/nbd7", 00:09:29.853 "bdev_name": "concat0" 00:09:29.853 }, 00:09:29.853 { 00:09:29.853 "nbd_device": "/dev/nbd8", 00:09:29.853 "bdev_name": "raid1" 00:09:29.853 }, 00:09:29.853 { 00:09:29.853 "nbd_device": "/dev/nbd9", 00:09:29.853 "bdev_name": "AIO0" 00:09:29.853 } 00:09:29.853 ]' 00:09:29.853 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:09:29.853 { 00:09:29.853 "nbd_device": "/dev/nbd0", 00:09:29.853 "bdev_name": "Malloc0" 00:09:29.853 }, 00:09:29.853 { 00:09:29.853 "nbd_device": "/dev/nbd1", 00:09:29.853 "bdev_name": "Malloc1p0" 00:09:29.853 }, 00:09:29.853 { 00:09:29.853 "nbd_device": "/dev/nbd10", 00:09:29.853 "bdev_name": "Malloc1p1" 00:09:29.853 }, 00:09:29.853 { 00:09:29.853 "nbd_device": "/dev/nbd11", 00:09:29.853 "bdev_name": "Malloc2p0" 00:09:29.853 }, 00:09:29.853 { 00:09:29.853 "nbd_device": "/dev/nbd12", 00:09:29.853 "bdev_name": "Malloc2p1" 00:09:29.853 }, 00:09:29.853 { 00:09:29.853 "nbd_device": "/dev/nbd13", 00:09:29.853 "bdev_name": "Malloc2p2" 00:09:29.853 }, 00:09:29.853 { 00:09:29.853 "nbd_device": "/dev/nbd14", 00:09:29.853 "bdev_name": "Malloc2p3" 00:09:29.853 }, 00:09:29.853 { 00:09:29.853 "nbd_device": "/dev/nbd15", 00:09:29.853 "bdev_name": "Malloc2p4" 00:09:29.853 }, 00:09:29.853 { 00:09:29.853 "nbd_device": "/dev/nbd2", 00:09:29.853 "bdev_name": "Malloc2p5" 00:09:29.853 }, 00:09:29.853 { 00:09:29.853 "nbd_device": "/dev/nbd3", 00:09:29.853 "bdev_name": "Malloc2p6" 00:09:29.853 }, 00:09:29.853 { 00:09:29.853 "nbd_device": "/dev/nbd4", 00:09:29.853 "bdev_name": "Malloc2p7" 00:09:29.853 }, 00:09:29.853 { 00:09:29.853 "nbd_device": "/dev/nbd5", 00:09:29.853 "bdev_name": "TestPT" 00:09:29.853 }, 00:09:29.853 { 00:09:29.853 "nbd_device": "/dev/nbd6", 00:09:29.853 "bdev_name": "raid0" 00:09:29.853 }, 00:09:29.853 { 00:09:29.853 "nbd_device": "/dev/nbd7", 00:09:29.853 
"bdev_name": "concat0" 00:09:29.853 }, 00:09:29.853 { 00:09:29.853 "nbd_device": "/dev/nbd8", 00:09:29.853 "bdev_name": "raid1" 00:09:29.853 }, 00:09:29.853 { 00:09:29.853 "nbd_device": "/dev/nbd9", 00:09:29.853 "bdev_name": "AIO0" 00:09:29.853 } 00:09:29.853 ]' 00:09:29.853 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:29.853 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:09:29.853 /dev/nbd1 00:09:29.853 /dev/nbd10 00:09:29.853 /dev/nbd11 00:09:29.853 /dev/nbd12 00:09:29.853 /dev/nbd13 00:09:29.853 /dev/nbd14 00:09:29.853 /dev/nbd15 00:09:29.853 /dev/nbd2 00:09:29.853 /dev/nbd3 00:09:29.853 /dev/nbd4 00:09:29.853 /dev/nbd5 00:09:29.853 /dev/nbd6 00:09:29.853 /dev/nbd7 00:09:29.853 /dev/nbd8 00:09:29.853 /dev/nbd9' 00:09:29.853 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:09:29.853 /dev/nbd1 00:09:29.853 /dev/nbd10 00:09:29.853 /dev/nbd11 00:09:29.853 /dev/nbd12 00:09:29.853 /dev/nbd13 00:09:29.853 /dev/nbd14 00:09:29.853 /dev/nbd15 00:09:29.853 /dev/nbd2 00:09:29.853 /dev/nbd3 00:09:29.853 /dev/nbd4 00:09:29.853 /dev/nbd5 00:09:29.853 /dev/nbd6 00:09:29.853 /dev/nbd7 00:09:29.853 /dev/nbd8 00:09:29.853 /dev/nbd9' 00:09:29.853 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:29.853 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=16 00:09:29.853 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 16 00:09:29.853 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=16 00:09:29.853 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 16 -ne 16 ']' 00:09:29.853 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' write 
00:09:29.853 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:29.853 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:09:29.853 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:09:29.853 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:09:29.853 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:09:29.853 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:09:29.853 256+0 records in 00:09:29.853 256+0 records out 00:09:29.853 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00430587 s, 244 MB/s 00:09:29.853 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:29.853 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:09:29.853 256+0 records in 00:09:29.853 256+0 records out 00:09:29.853 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0690418 s, 15.2 MB/s 00:09:29.853 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:29.853 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:09:30.112 256+0 records in 00:09:30.112 256+0 records out 00:09:30.112 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0768356 s, 13.6 MB/s 00:09:30.112 08:22:42 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:30.112 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:09:30.112 256+0 records in 00:09:30.112 256+0 records out 00:09:30.112 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0720347 s, 14.6 MB/s 00:09:30.112 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:30.112 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:09:30.112 256+0 records in 00:09:30.112 256+0 records out 00:09:30.112 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0715825 s, 14.6 MB/s 00:09:30.112 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:30.112 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd12 bs=4096 count=256 oflag=direct 00:09:30.112 256+0 records in 00:09:30.112 256+0 records out 00:09:30.112 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0711534 s, 14.7 MB/s 00:09:30.112 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:30.112 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd13 bs=4096 count=256 oflag=direct 00:09:30.370 256+0 records in 00:09:30.370 256+0 records out 00:09:30.370 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0711797 s, 14.7 MB/s 00:09:30.370 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:30.370 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd14 bs=4096 count=256 
oflag=direct 00:09:30.370 256+0 records in 00:09:30.370 256+0 records out 00:09:30.370 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0707538 s, 14.8 MB/s 00:09:30.370 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:30.370 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd15 bs=4096 count=256 oflag=direct 00:09:30.370 256+0 records in 00:09:30.370 256+0 records out 00:09:30.370 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0695485 s, 15.1 MB/s 00:09:30.370 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:30.370 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd2 bs=4096 count=256 oflag=direct 00:09:30.628 256+0 records in 00:09:30.628 256+0 records out 00:09:30.628 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0706807 s, 14.8 MB/s 00:09:30.628 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:30.628 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd3 bs=4096 count=256 oflag=direct 00:09:30.628 256+0 records in 00:09:30.628 256+0 records out 00:09:30.628 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0706279 s, 14.8 MB/s 00:09:30.628 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:30.628 08:22:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd4 bs=4096 count=256 oflag=direct 00:09:30.628 256+0 records in 00:09:30.628 256+0 records out 00:09:30.628 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0704669 s, 14.9 MB/s 00:09:30.628 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in 
"${nbd_list[@]}" 00:09:30.628 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd5 bs=4096 count=256 oflag=direct 00:09:30.628 256+0 records in 00:09:30.628 256+0 records out 00:09:30.628 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0703659 s, 14.9 MB/s 00:09:30.628 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:30.629 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd6 bs=4096 count=256 oflag=direct 00:09:30.887 256+0 records in 00:09:30.887 256+0 records out 00:09:30.887 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0686408 s, 15.3 MB/s 00:09:30.887 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:30.887 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd7 bs=4096 count=256 oflag=direct 00:09:30.887 256+0 records in 00:09:30.887 256+0 records out 00:09:30.887 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0726697 s, 14.4 MB/s 00:09:30.887 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:30.887 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd8 bs=4096 count=256 oflag=direct 00:09:30.887 256+0 records in 00:09:30.887 256+0 records out 00:09:30.887 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0763582 s, 13.7 MB/s 00:09:30.887 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:30.887 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd9 bs=4096 count=256 oflag=direct 00:09:31.147 256+0 records 
in 00:09:31.147 256+0 records out 00:09:31.147 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0704798 s, 14.9 MB/s 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' verify 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd12 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd13 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd14 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd15 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd2 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd3 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd4 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd5 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd6 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd7 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd8 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd9 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock 
'/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:31.147 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:09:31.405 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:09:31.405 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:31.405 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:31.405 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:31.405 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:31.405 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:31.405 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:31.405 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:31.405 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:31.405 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:09:31.405 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:09:31.405 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:09:31.405 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:09:31.405 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:31.405 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:31.405 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:09:31.405 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:31.405 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:31.405 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:31.405 08:22:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:09:31.663 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:09:31.663 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:09:31.663 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:09:31.663 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:31.663 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:31.664 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:09:31.664 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:31.664 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:31.664 08:22:44 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:31.664 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:09:31.923 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:09:31.923 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:09:31.923 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:09:31.923 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:31.923 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:31.923 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:09:31.923 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:31.923 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:31.923 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:31.923 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:09:31.923 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:09:31.923 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:09:31.923 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:09:31.923 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:31.923 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:31.923 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:09:32.182 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:32.182 08:22:44 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:32.182 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:32.182 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:09:32.182 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:09:32.182 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:09:32.182 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:09:32.182 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:32.182 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:32.182 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:09:32.182 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:32.182 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:32.182 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:32.182 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:09:32.440 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:09:32.440 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:09:32.440 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:09:32.440 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:32.440 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:32.440 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 
/proc/partitions 00:09:32.440 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:32.440 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:32.440 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:32.440 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:09:32.698 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:09:32.698 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:09:32.698 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:09:32.698 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:32.698 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:32.698 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:09:32.698 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:32.698 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:32.698 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:32.698 08:22:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:09:32.698 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:09:32.698 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:09:32.698 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:09:32.698 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:32.698 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 
-- # (( i <= 20 )) 00:09:32.698 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:09:32.698 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:32.698 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:32.698 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:32.698 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:09:32.957 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:09:32.957 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:09:32.957 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:09:32.957 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:32.957 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:32.957 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:09:32.957 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:32.957 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:32.957 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:32.957 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:09:33.216 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:09:33.216 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:09:33.216 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:09:33.216 08:22:45 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:33.216 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:33.216 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:09:33.216 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:33.216 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:33.216 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:33.216 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:09:33.474 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:09:33.474 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:09:33.474 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:09:33.474 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:33.474 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:33.474 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:09:33.474 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:33.474 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:33.474 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:33.474 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:09:33.474 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:09:33.474 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:09:33.474 08:22:45 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:09:33.474 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:33.474 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:33.474 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:09:33.474 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:33.474 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:33.474 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:33.474 08:22:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:09:33.731 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:09:33.732 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:09:33.732 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:09:33.732 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:33.732 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:33.732 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:09:33.732 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:33.732 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:33.732 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:33.732 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:09:33.989 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:09:33.989 
08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:09:33.989 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:09:33.989 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:33.989 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:33.989 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:09:33.989 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:33.989 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:33.989 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:33.989 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:09:34.248 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:09:34.248 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:09:34.248 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:09:34.248 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:34.248 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:34.248 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:09:34.248 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:34.248 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:34.248 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:09:34.248 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:34.248 08:22:46 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:09:34.248 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:09:34.248 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:09:34.248 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:34.507 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:09:34.507 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:09:34.507 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:34.507 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:09:34.507 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:09:34.507 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:09:34.507 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:09:34.507 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:09:34.507 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:09:34.507 08:22:46 blockdev_general.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:09:34.507 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:34.507 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:34.507 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # local 
nbd_list 00:09:34.507 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:09:34.507 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:09:34.507 malloc_lvol_verify 00:09:34.507 08:22:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:09:34.788 babbfd34-758c-49a9-8b40-5ed77ec7bc08 00:09:34.788 08:22:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:09:34.788 5ce40b8a-f36c-4511-bd1c-e4594aee7887 00:09:35.047 08:22:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:09:35.047 /dev/nbd0 00:09:35.047 08:22:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:09:35.047 mke2fs 1.46.5 (30-Dec-2021) 00:09:35.047 Discarding device blocks: 0/4096 done 00:09:35.047 Creating filesystem with 4096 1k blocks and 1024 inodes 00:09:35.047 00:09:35.047 Allocating group tables: 0/1 done 00:09:35.047 Writing inode tables: 0/1 done 00:09:35.047 Creating journal (1024 blocks): done 00:09:35.047 Writing superblocks and filesystem accounting information: 0/1 done 00:09:35.047 00:09:35.047 08:22:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:09:35.047 08:22:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:09:35.047 08:22:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:35.047 08:22:47 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:09:35.047 08:22:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:09:35.047 08:22:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:09:35.047 08:22:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:35.047 08:22:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:09:35.305 08:22:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:09:35.305 08:22:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:35.306 08:22:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:35.306 08:22:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:35.306 08:22:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:35.306 08:22:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:35.306 08:22:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:35.306 08:22:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:35.306 08:22:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:09:35.306 08:22:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:09:35.306 08:22:47 blockdev_general.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 1370458 00:09:35.306 08:22:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 1370458 ']' 00:09:35.306 08:22:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 1370458 00:09:35.306 08:22:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:09:35.306 08:22:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:35.306 08:22:47 
blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1370458 00:09:35.306 08:22:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:35.306 08:22:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:35.306 08:22:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1370458' 00:09:35.306 killing process with pid 1370458 00:09:35.306 08:22:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@967 -- # kill 1370458 00:09:35.306 08:22:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@972 -- # wait 1370458 00:09:37.838 08:22:49 blockdev_general.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - SIGINT SIGTERM EXIT 00:09:37.838 00:09:37.838 real 0m20.067s 00:09:37.838 user 0m25.409s 00:09:37.838 sys 0m8.072s 00:09:37.838 08:22:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:37.838 08:22:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:09:37.838 ************************************ 00:09:37.838 END TEST bdev_nbd 00:09:37.838 ************************************ 00:09:37.838 08:22:50 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:37.838 08:22:50 blockdev_general -- bdev/blockdev.sh@762 -- # [[ y == y ]] 00:09:37.838 08:22:50 blockdev_general -- bdev/blockdev.sh@763 -- # '[' bdev = nvme ']' 00:09:37.838 08:22:50 blockdev_general -- bdev/blockdev.sh@763 -- # '[' bdev = gpt ']' 00:09:37.838 08:22:50 blockdev_general -- bdev/blockdev.sh@767 -- # run_test bdev_fio fio_test_suite '' 00:09:37.838 08:22:50 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:09:37.838 08:22:50 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:37.838 08:22:50 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:37.838 ************************************ 00:09:37.838 START 
TEST bdev_fio 00:09:37.839 ************************************ 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@330 -- # local env_context 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@334 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:09:37.839 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@335 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@338 -- # echo '' 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@338 -- # sed s/--env-context=// 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@338 -- # env_context= 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:09:37.839 
08:22:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc0]' 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc0 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc1p0]' 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc1p0 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc1p1]' 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc1p1 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in 
"${bdevs_name[@]}" 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p0]' 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p0 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p1]' 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p1 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p2]' 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p2 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p3]' 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p3 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p4]' 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p4 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p5]' 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p5 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p6]' 00:09:37.839 08:22:50 
blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p6 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p7]' 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p7 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_TestPT]' 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=TestPT 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_raid0]' 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=raid0 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_concat0]' 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=concat0 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_raid1]' 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=raid1 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_AIO0]' 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=AIO0 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@346 -- # local 'fio_params=--ioengine=spdk_bdev 
--iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@348 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:37.839 08:22:50 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:09:37.839 ************************************ 00:09:37.839 START TEST bdev_fio_rw_verify 00:09:37.839 ************************************ 00:09:37.839 08:22:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:37.839 08:22:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 
00:09:37.839 08:22:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:09:37.839 08:22:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:09:37.839 08:22:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:09:37.839 08:22:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:37.839 08:22:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:09:37.839 08:22:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:09:37.839 08:22:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:37.839 08:22:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:37.839 08:22:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:09:37.839 08:22:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:09:37.839 08:22:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib=/usr/lib64/libasan.so.8 00:09:37.839 08:22:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n /usr/lib64/libasan.so.8 ]] 00:09:37.839 08:22:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1347 -- # break 00:09:37.839 08:22:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD='/usr/lib64/libasan.so.8 /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 
00:09:37.839 08:22:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:38.097 job_Malloc0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:38.097 job_Malloc1p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:38.097 job_Malloc1p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:38.097 job_Malloc2p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:38.097 job_Malloc2p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:38.097 job_Malloc2p2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:38.097 job_Malloc2p3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:38.097 job_Malloc2p4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:38.097 job_Malloc2p5: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:38.097 job_Malloc2p6: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:38.097 job_Malloc2p7: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:38.097 job_TestPT: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, 
ioengine=spdk_bdev, iodepth=8 00:09:38.097 job_raid0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:38.097 job_concat0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:38.097 job_raid1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:38.097 job_AIO0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:38.097 fio-3.35 00:09:38.097 Starting 16 threads 00:09:50.301 00:09:50.301 job_Malloc0: (groupid=0, jobs=16): err= 0: pid=1375272: Tue Jul 23 08:23:01 2024 00:09:50.301 read: IOPS=97.0k, BW=379MiB/s (397MB/s)(3788MiB/10001msec) 00:09:50.301 slat (usec): min=2, max=371, avg=34.84, stdev=14.60 00:09:50.301 clat (usec): min=9, max=1736, avg=279.37, stdev=129.06 00:09:50.301 lat (usec): min=17, max=1802, avg=314.21, stdev=136.28 00:09:50.301 clat percentiles (usec): 00:09:50.301 | 50.000th=[ 273], 99.000th=[ 553], 99.900th=[ 611], 99.990th=[ 865], 00:09:50.301 | 99.999th=[ 1336] 00:09:50.301 write: IOPS=153k, BW=599MiB/s (628MB/s)(5898MiB/9854msec); 0 zone resets 00:09:50.301 slat (usec): min=5, max=448, avg=45.13, stdev=14.23 00:09:50.301 clat (usec): min=9, max=1282, avg=317.11, stdev=142.84 00:09:50.302 lat (usec): min=25, max=1368, avg=362.23, stdev=149.29 00:09:50.302 clat percentiles (usec): 00:09:50.302 | 50.000th=[ 306], 99.000th=[ 668], 99.900th=[ 881], 99.990th=[ 963], 00:09:50.302 | 99.999th=[ 1057] 00:09:50.302 bw ( KiB/s): min=518760, max=789459, per=98.68%, avg=604848.89, stdev=4030.55, samples=304 00:09:50.302 iops : min=129690, max=197362, avg=151212.11, stdev=1007.62, samples=304 00:09:50.302 lat (usec) : 10=0.01%, 20=0.03%, 50=0.72%, 100=5.11%, 250=34.25% 00:09:50.302 lat (usec) : 500=51.54%, 750=8.00%, 1000=0.35% 00:09:50.302 lat (msec) : 2=0.01% 00:09:50.302 cpu : usr=98.86%, sys=0.53%, ctx=682, majf=0, 
minf=121877 00:09:50.302 IO depths : 1=12.4%, 2=24.8%, 4=50.2%, 8=12.6%, 16=0.0%, 32=0.0%, >=64=0.0% 00:09:50.302 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:50.302 complete : 0=0.0%, 4=89.1%, 8=10.9%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:50.302 issued rwts: total=969713,1510014,0,0 short=0,0,0,0 dropped=0,0,0,0 00:09:50.302 latency : target=0, window=0, percentile=100.00%, depth=8 00:09:50.302 00:09:50.302 Run status group 0 (all jobs): 00:09:50.302 READ: bw=379MiB/s (397MB/s), 379MiB/s-379MiB/s (397MB/s-397MB/s), io=3788MiB (3972MB), run=10001-10001msec 00:09:50.302 WRITE: bw=599MiB/s (628MB/s), 599MiB/s-599MiB/s (628MB/s-628MB/s), io=5898MiB (6185MB), run=9854-9854msec 00:09:52.207 ----------------------------------------------------- 00:09:52.207 Suppressions used: 00:09:52.207 count bytes template 00:09:52.207 16 140 /usr/src/fio/parse.c 00:09:52.207 10391 997536 /usr/src/fio/iolog.c 00:09:52.207 1 8 libtcmalloc_minimal.so 00:09:52.207 1 904 libcrypto.so 00:09:52.207 ----------------------------------------------------- 00:09:52.207 00:09:52.207 00:09:52.207 real 0m14.503s 00:09:52.207 user 2m49.075s 00:09:52.207 sys 0m2.332s 00:09:52.207 08:23:04 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:52.207 08:23:04 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:09:52.207 ************************************ 00:09:52.207 END TEST bdev_fio_rw_verify 00:09:52.207 ************************************ 00:09:52.207 08:23:04 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:09:52.207 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@349 -- # rm -f 00:09:52.208 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:52.208 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@353 -- # fio_config_gen 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:09:52.208 08:23:04 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:52.208 08:23:04 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:09:52.208 08:23:04 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:09:52.208 08:23:04 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:09:52.208 08:23:04 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:09:52.208 08:23:04 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:09:52.208 08:23:04 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:09:52.208 08:23:04 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:09:52.208 08:23:04 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:52.208 08:23:04 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:09:52.208 08:23:04 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:09:52.208 08:23:04 blockdev_general.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:09:52.208 08:23:04 blockdev_general.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:09:52.208 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:09:52.209 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "86dea539-55d6-4d4b-9c97-7e50c910fee0"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' 
"num_blocks": 65536,' ' "uuid": "86dea539-55d6-4d4b-9c97-7e50c910fee0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "e39005d9-eb59-5112-990d-8bf611531d00"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "e39005d9-eb59-5112-990d-8bf611531d00",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "f82f2244-f75d-5d24-bb0d-ff9a01293d4a"' ' ],' ' 
"product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "f82f2244-f75d-5d24-bb0d-ff9a01293d4a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "c3f3088d-637e-5e3c-97ff-3117d2d1fafd"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "c3f3088d-637e-5e3c-97ff-3117d2d1fafd",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "f3ffcb71-d42c-5959-a770-2d5b0bdd63fa"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' 
' "num_blocks": 8192,' ' "uuid": "f3ffcb71-d42c-5959-a770-2d5b0bdd63fa",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "15c1e107-d95e-530a-b4f1-318ccb8c0ae9"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "15c1e107-d95e-530a-b4f1-318ccb8c0ae9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "7cc67248-6b8e-59d5-a843-1ff43c598d4e"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": 
"7cc67248-6b8e-59d5-a843-1ff43c598d4e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "55ca48ce-b3bf-5c1b-b4da-4d5aa111fa19"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "55ca48ce-b3bf-5c1b-b4da-4d5aa111fa19",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "16ac8d3a-badb-5b19-997d-e115cca7e58c"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "16ac8d3a-badb-5b19-997d-e115cca7e58c",' ' 
"assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "b08a8b1f-ffb3-5c60-ae5b-583e5c6a5be5"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b08a8b1f-ffb3-5c60-ae5b-583e5c6a5be5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "7df619ab-fe78-503a-b9c8-d86fcb95d5b3"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "7df619ab-fe78-503a-b9c8-d86fcb95d5b3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 
0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "1c79a4fa-9abf-5792-b915-b724f7c5cddb"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "1c79a4fa-9abf-5792-b915-b724f7c5cddb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "95ec41d3-1042-40aa-bc7a-6d2f873f625c"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' 
"num_blocks": 131072,' ' "uuid": "95ec41d3-1042-40aa-bc7a-6d2f873f625c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "95ec41d3-1042-40aa-bc7a-6d2f873f625c",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "ae695d5c-b5ea-416c-849a-e53d01863ebc",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "4359d4c9-15a1-48fb-9536-6bb972046090",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "2be01f71-a8b8-42ef-b361-62ae147bcde7"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "2be01f71-a8b8-42ef-b361-62ae147bcde7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' 
' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "2be01f71-a8b8-42ef-b361-62ae147bcde7",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "e25391a7-8267-4ea0-b886-b51bf6e9b6ba",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "149117b7-8df9-47ce-b12c-f6592749ad8f",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "059d40a9-e4c7-4753-846b-967d35fe9c43"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "059d40a9-e4c7-4753-846b-967d35fe9c43",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": 
true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "059d40a9-e4c7-4753-846b-967d35fe9c43",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "fce39a20-491e-4d10-8f3f-342deff5c1fe",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "9c826ed0-d461-4810-8eb5-17079072c107",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "ca0a08f1-6b85-478d-b4a6-a12fc01576b9"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "ca0a08f1-6b85-478d-b4a6-a12fc01576b9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' 
"zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:09:52.469 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # [[ -n Malloc0 00:09:52.469 Malloc1p0 00:09:52.469 Malloc1p1 00:09:52.469 Malloc2p0 00:09:52.469 Malloc2p1 00:09:52.469 Malloc2p2 00:09:52.469 Malloc2p3 00:09:52.469 Malloc2p4 00:09:52.469 Malloc2p5 00:09:52.469 Malloc2p6 00:09:52.469 Malloc2p7 00:09:52.469 TestPT 00:09:52.469 raid0 00:09:52.469 concat0 ]] 00:09:52.469 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "86dea539-55d6-4d4b-9c97-7e50c910fee0"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "86dea539-55d6-4d4b-9c97-7e50c910fee0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "e39005d9-eb59-5112-990d-8bf611531d00"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "e39005d9-eb59-5112-990d-8bf611531d00",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "f82f2244-f75d-5d24-bb0d-ff9a01293d4a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "f82f2244-f75d-5d24-bb0d-ff9a01293d4a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' 
"offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "c3f3088d-637e-5e3c-97ff-3117d2d1fafd"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "c3f3088d-637e-5e3c-97ff-3117d2d1fafd",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "f3ffcb71-d42c-5959-a770-2d5b0bdd63fa"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f3ffcb71-d42c-5959-a770-2d5b0bdd63fa",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": 
"Malloc2p2",' ' "aliases": [' ' "15c1e107-d95e-530a-b4f1-318ccb8c0ae9"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "15c1e107-d95e-530a-b4f1-318ccb8c0ae9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "7cc67248-6b8e-59d5-a843-1ff43c598d4e"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "7cc67248-6b8e-59d5-a843-1ff43c598d4e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' 
"55ca48ce-b3bf-5c1b-b4da-4d5aa111fa19"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "55ca48ce-b3bf-5c1b-b4da-4d5aa111fa19",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "16ac8d3a-badb-5b19-997d-e115cca7e58c"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "16ac8d3a-badb-5b19-997d-e115cca7e58c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "b08a8b1f-ffb3-5c60-ae5b-583e5c6a5be5"' ' ],' ' 
"product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b08a8b1f-ffb3-5c60-ae5b-583e5c6a5be5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "7df619ab-fe78-503a-b9c8-d86fcb95d5b3"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "7df619ab-fe78-503a-b9c8-d86fcb95d5b3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "1c79a4fa-9abf-5792-b915-b724f7c5cddb"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' 
"num_blocks": 65536,' ' "uuid": "1c79a4fa-9abf-5792-b915-b724f7c5cddb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "95ec41d3-1042-40aa-bc7a-6d2f873f625c"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "95ec41d3-1042-40aa-bc7a-6d2f873f625c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "95ec41d3-1042-40aa-bc7a-6d2f873f625c",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "ae695d5c-b5ea-416c-849a-e53d01863ebc",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "4359d4c9-15a1-48fb-9536-6bb972046090",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "2be01f71-a8b8-42ef-b361-62ae147bcde7"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "2be01f71-a8b8-42ef-b361-62ae147bcde7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' 
"dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "2be01f71-a8b8-42ef-b361-62ae147bcde7",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "e25391a7-8267-4ea0-b886-b51bf6e9b6ba",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "149117b7-8df9-47ce-b12c-f6592749ad8f",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "059d40a9-e4c7-4753-846b-967d35fe9c43"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "059d40a9-e4c7-4753-846b-967d35fe9c43",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "059d40a9-e4c7-4753-846b-967d35fe9c43",' ' "strip_size_kb": 0,' ' "state": "online",' ' 
"raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "fce39a20-491e-4d10-8f3f-342deff5c1fe",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "9c826ed0-d461-4810-8eb5-17079072c107",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "ca0a08f1-6b85-478d-b4a6-a12fc01576b9"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "ca0a08f1-6b85-478d-b4a6-a12fc01576b9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc0]' 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc0 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc1p0]' 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc1p0 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc1p1]' 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc1p1 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p0]' 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p0 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p1]' 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p1 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p2]' 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p2 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == 
true) | .name') 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p3]' 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p3 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p4]' 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p4 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p5]' 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p5 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p6]' 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p6 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p7]' 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p7 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo 
'[job_TestPT]' 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=TestPT 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_raid0]' 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=raid0 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_concat0]' 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=concat0 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@366 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:52.471 08:23:04 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:09:52.471 ************************************ 00:09:52.471 START TEST bdev_fio_trim 00:09:52.471 ************************************ 00:09:52.471 08:23:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:52.471 08:23:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:52.471 08:23:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:09:52.471 08:23:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:09:52.471 08:23:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:09:52.471 08:23:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:52.471 08:23:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:09:52.471 08:23:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:09:52.471 08:23:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:52.471 08:23:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:52.471 08:23:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:09:52.471 08:23:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk 
'{print $3}' 00:09:52.471 08:23:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib=/usr/lib64/libasan.so.8 00:09:52.471 08:23:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n /usr/lib64/libasan.so.8 ]] 00:09:52.471 08:23:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1347 -- # break 00:09:52.471 08:23:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD='/usr/lib64/libasan.so.8 /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:09:52.471 08:23:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:52.731 job_Malloc0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:52.731 job_Malloc1p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:52.731 job_Malloc1p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:52.731 job_Malloc2p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:52.731 job_Malloc2p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:52.731 job_Malloc2p2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:52.731 job_Malloc2p3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:52.731 job_Malloc2p4: (g=0): 
rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:52.731 job_Malloc2p5: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:52.731 job_Malloc2p6: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:52.731 job_Malloc2p7: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:52.731 job_TestPT: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:52.731 job_raid0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:52.731 job_concat0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:52.731 fio-3.35 00:09:52.731 Starting 14 threads 00:10:04.942 00:10:04.942 job_Malloc0: (groupid=0, jobs=14): err= 0: pid=1378128: Tue Jul 23 08:23:16 2024 00:10:04.942 write: IOPS=138k, BW=541MiB/s (567MB/s)(5408MiB/10001msec); 0 zone resets 00:10:04.942 slat (usec): min=2, max=451, avg=35.88, stdev=10.42 00:10:04.942 clat (usec): min=22, max=1111, avg=252.64, stdev=88.64 00:10:04.942 lat (usec): min=31, max=1199, avg=288.52, stdev=92.24 00:10:04.942 clat percentiles (usec): 00:10:04.942 | 50.000th=[ 245], 99.000th=[ 465], 99.900th=[ 510], 99.990th=[ 570], 00:10:04.942 | 99.999th=[ 824] 00:10:04.942 bw ( KiB/s): min=508768, max=774747, per=100.00%, avg=554995.95, stdev=4652.34, samples=266 00:10:04.942 iops : min=127192, max=193686, avg=138748.95, stdev=1163.07, samples=266 00:10:04.942 trim: IOPS=138k, BW=541MiB/s (567MB/s)(5408MiB/10001msec); 0 zone resets 00:10:04.942 slat (usec): min=4, max=390, avg=25.07, stdev= 6.93 00:10:04.942 clat (usec): min=3, max=1200, avg=283.57, stdev=96.61 00:10:04.942 lat (usec): min=11, max=1248, avg=308.64, stdev=99.44 00:10:04.942 clat percentiles 
(usec): 00:10:04.942 | 50.000th=[ 277], 99.000th=[ 506], 99.900th=[ 562], 99.990th=[ 627], 00:10:04.942 | 99.999th=[ 914] 00:10:04.942 bw ( KiB/s): min=508768, max=774747, per=100.00%, avg=554996.37, stdev=4652.45, samples=266 00:10:04.942 iops : min=127192, max=193686, avg=138749.05, stdev=1163.10, samples=266 00:10:04.943 lat (usec) : 4=0.01%, 10=0.01%, 20=0.02%, 50=0.17%, 100=1.54% 00:10:04.943 lat (usec) : 250=44.51%, 500=53.03%, 750=0.72%, 1000=0.01% 00:10:04.943 lat (msec) : 2=0.01% 00:10:04.943 cpu : usr=99.63%, sys=0.02%, ctx=474, majf=0, minf=15757 00:10:04.943 IO depths : 1=12.5%, 2=24.9%, 4=50.0%, 8=12.6%, 16=0.0%, 32=0.0%, >=64=0.0% 00:10:04.943 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:10:04.943 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:10:04.943 issued rwts: total=0,1384417,1384422,0 short=0,0,0,0 dropped=0,0,0,0 00:10:04.943 latency : target=0, window=0, percentile=100.00%, depth=8 00:10:04.943 00:10:04.943 Run status group 0 (all jobs): 00:10:04.943 WRITE: bw=541MiB/s (567MB/s), 541MiB/s-541MiB/s (567MB/s-567MB/s), io=5408MiB (5671MB), run=10001-10001msec 00:10:04.943 TRIM: bw=541MiB/s (567MB/s), 541MiB/s-541MiB/s (567MB/s-567MB/s), io=5408MiB (5671MB), run=10001-10001msec 00:10:06.846 ----------------------------------------------------- 00:10:06.846 Suppressions used: 00:10:06.846 count bytes template 00:10:06.846 14 129 /usr/src/fio/parse.c 00:10:06.846 1 8 libtcmalloc_minimal.so 00:10:06.846 1 904 libcrypto.so 00:10:06.846 ----------------------------------------------------- 00:10:06.846 00:10:06.846 00:10:06.846 real 0m14.357s 00:10:06.846 user 2m30.564s 00:10:06.846 sys 0m1.388s 00:10:06.846 08:23:19 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:06.846 08:23:19 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:10:06.846 ************************************ 00:10:06.846 END TEST 
bdev_fio_trim 00:10:06.846 ************************************ 00:10:06.846 08:23:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:10:06.846 08:23:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@367 -- # rm -f 00:10:06.846 08:23:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:10:06.846 08:23:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@369 -- # popd 00:10:06.846 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:10:06.846 08:23:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:10:06.846 00:10:06.846 real 0m29.165s 00:10:06.846 user 5m19.824s 00:10:06.846 sys 0m3.864s 00:10:06.846 08:23:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:06.846 08:23:19 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:10:06.846 ************************************ 00:10:06.846 END TEST bdev_fio 00:10:06.846 ************************************ 00:10:06.846 08:23:19 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:06.846 08:23:19 blockdev_general -- bdev/blockdev.sh@774 -- # trap cleanup SIGINT SIGTERM EXIT 00:10:06.846 08:23:19 blockdev_general -- bdev/blockdev.sh@776 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:10:06.846 08:23:19 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:10:06.846 08:23:19 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:06.846 08:23:19 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:06.846 ************************************ 00:10:06.846 START TEST bdev_verify 00:10:06.846 ************************************ 00:10:06.846 08:23:19 blockdev_general.bdev_verify -- 
common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:10:07.105 [2024-07-23 08:23:19.366137] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:10:07.105 [2024-07-23 08:23:19.366225] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1380557 ] 00:10:07.105 [2024-07-23 08:23:19.490664] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:07.363 [2024-07-23 08:23:19.713005] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:07.363 [2024-07-23 08:23:19.713014] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:07.930 [2024-07-23 08:23:20.178339] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:07.930 [2024-07-23 08:23:20.178399] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:10:07.930 [2024-07-23 08:23:20.178412] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:10:07.930 [2024-07-23 08:23:20.186340] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:07.930 [2024-07-23 08:23:20.186373] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:07.930 [2024-07-23 08:23:20.194359] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:07.930 [2024-07-23 08:23:20.194387] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:07.930 [2024-07-23 08:23:20.391635] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:07.930 
[2024-07-23 08:23:20.391678] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:07.930 [2024-07-23 08:23:20.391696] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000037b80 00:10:07.930 [2024-07-23 08:23:20.391706] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:07.930 [2024-07-23 08:23:20.393711] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:07.930 [2024-07-23 08:23:20.393739] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:10:08.496 Running I/O for 5 seconds... 00:10:13.822 00:10:13.822 Latency(us) 00:10:13.822 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:13.822 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:13.822 Verification LBA range: start 0x0 length 0x1000 00:10:13.822 Malloc0 : 5.15 1490.86 5.82 0.00 0.00 85709.69 497.37 205720.62 00:10:13.822 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:13.822 Verification LBA range: start 0x1000 length 0x1000 00:10:13.822 Malloc0 : 5.15 1466.82 5.73 0.00 0.00 87115.14 452.51 309579.58 00:10:13.822 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:13.822 Verification LBA range: start 0x0 length 0x800 00:10:13.822 Malloc1p0 : 5.20 763.67 2.98 0.00 0.00 166837.05 2871.10 191739.61 00:10:13.822 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:13.822 Verification LBA range: start 0x800 length 0x800 00:10:13.822 Malloc1p0 : 5.19 764.42 2.99 0.00 0.00 166692.27 2871.10 183750.46 00:10:13.822 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:13.822 Verification LBA range: start 0x0 length 0x800 00:10:13.822 Malloc1p1 : 5.20 763.29 2.98 0.00 0.00 166513.22 2886.70 187745.04 00:10:13.822 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 
00:10:13.822 Verification LBA range: start 0x800 length 0x800 00:10:13.822 Malloc1p1 : 5.19 764.15 2.98 0.00 0.00 166329.66 2902.31 176759.95 00:10:13.822 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:13.822 Verification LBA range: start 0x0 length 0x200 00:10:13.822 Malloc2p0 : 5.20 762.91 2.98 0.00 0.00 166187.81 2902.31 182751.82 00:10:13.822 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:13.822 Verification LBA range: start 0x200 length 0x200 00:10:13.822 Malloc2p0 : 5.19 763.88 2.98 0.00 0.00 165976.11 2917.91 173764.02 00:10:13.822 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:13.822 Verification LBA range: start 0x0 length 0x200 00:10:13.822 Malloc2p1 : 5.20 762.53 2.98 0.00 0.00 165859.48 2839.89 176759.95 00:10:13.822 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:13.822 Verification LBA range: start 0x200 length 0x200 00:10:13.822 Malloc2p1 : 5.20 763.62 2.98 0.00 0.00 165624.96 2824.29 167772.16 00:10:13.822 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:13.822 Verification LBA range: start 0x0 length 0x200 00:10:13.822 Malloc2p2 : 5.21 762.14 2.98 0.00 0.00 165538.94 2839.89 172765.38 00:10:13.822 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:13.822 Verification LBA range: start 0x200 length 0x200 00:10:13.822 Malloc2p2 : 5.20 763.23 2.98 0.00 0.00 165306.25 2871.10 162778.94 00:10:13.822 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:13.822 Verification LBA range: start 0x0 length 0x200 00:10:13.822 Malloc2p3 : 5.21 761.76 2.98 0.00 0.00 165237.69 2839.89 166773.52 00:10:13.822 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:13.822 Verification LBA range: start 0x200 length 0x200 00:10:13.822 Malloc2p3 : 5.20 762.85 2.98 0.00 0.00 164991.13 2839.89 159783.01 00:10:13.822 Job: 
Malloc2p4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:13.822 Verification LBA range: start 0x0 length 0x200 00:10:13.822 Malloc2p4 : 5.21 761.37 2.97 0.00 0.00 164933.45 2839.89 162778.94 00:10:13.822 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:13.822 Verification LBA range: start 0x200 length 0x200 00:10:13.822 Malloc2p4 : 5.20 762.47 2.98 0.00 0.00 164687.72 2839.89 153791.15 00:10:13.822 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:13.822 Verification LBA range: start 0x0 length 0x200 00:10:13.822 Malloc2p5 : 5.21 760.98 2.97 0.00 0.00 164639.79 2839.89 160781.65 00:10:13.822 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:13.822 Verification LBA range: start 0x200 length 0x200 00:10:13.822 Malloc2p5 : 5.21 762.09 2.98 0.00 0.00 164397.11 2839.89 150795.22 00:10:13.822 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:13.822 Verification LBA range: start 0x0 length 0x200 00:10:13.822 Malloc2p6 : 5.22 760.69 2.97 0.00 0.00 164305.18 2871.10 155788.43 00:10:13.822 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:13.822 Verification LBA range: start 0x200 length 0x200 00:10:13.822 Malloc2p6 : 5.21 761.71 2.98 0.00 0.00 164078.86 2917.91 146800.64 00:10:13.822 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:13.822 Verification LBA range: start 0x0 length 0x200 00:10:13.822 Malloc2p7 : 5.22 760.22 2.97 0.00 0.00 164003.04 2824.29 149796.57 00:10:13.822 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:13.822 Verification LBA range: start 0x200 length 0x200 00:10:13.822 Malloc2p7 : 5.21 761.32 2.97 0.00 0.00 163758.05 2824.29 141807.42 00:10:13.822 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:13.823 Verification LBA range: start 0x0 length 0x1000 00:10:13.823 TestPT : 5.23 
758.38 2.96 0.00 0.00 163925.02 8800.55 150795.22 00:10:13.823 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:13.823 Verification LBA range: start 0x1000 length 0x1000 00:10:13.823 TestPT : 5.23 737.00 2.88 0.00 0.00 168326.08 11796.48 202724.69 00:10:13.823 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:13.823 Verification LBA range: start 0x0 length 0x2000 00:10:13.823 raid0 : 5.22 759.54 2.97 0.00 0.00 163171.76 2855.50 129823.70 00:10:13.823 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:13.823 Verification LBA range: start 0x2000 length 0x2000 00:10:13.823 raid0 : 5.22 760.73 2.97 0.00 0.00 162932.46 2855.50 118838.61 00:10:13.823 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:13.823 Verification LBA range: start 0x0 length 0x2000 00:10:13.823 concat0 : 5.23 759.31 2.97 0.00 0.00 162850.01 2871.10 125329.80 00:10:13.823 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:13.823 Verification LBA range: start 0x2000 length 0x2000 00:10:13.823 concat0 : 5.22 760.26 2.97 0.00 0.00 162657.71 2839.89 119337.94 00:10:13.823 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:13.823 Verification LBA range: start 0x0 length 0x1000 00:10:13.823 raid1 : 5.23 759.09 2.97 0.00 0.00 162497.72 3588.88 120835.90 00:10:13.823 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:13.823 Verification LBA range: start 0x1000 length 0x1000 00:10:13.823 raid1 : 5.22 759.82 2.97 0.00 0.00 162367.28 3682.50 122333.87 00:10:13.823 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:13.823 Verification LBA range: start 0x0 length 0x4e2 00:10:13.823 AIO0 : 5.23 782.53 3.06 0.00 0.00 157246.56 674.86 125829.12 00:10:13.823 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:13.823 Verification LBA range: start 0x4e2 length 0x4e2 
00:10:13.823 AIO0 : 5.23 783.34 3.06 0.00 0.00 157117.92 819.20 127327.09 00:10:13.823 =================================================================================================================== 00:10:13.823 Total : 25826.98 100.89 0.00 0.00 155455.54 452.51 309579.58 00:10:16.355 00:10:16.355 real 0m9.274s 00:10:16.355 user 0m17.020s 00:10:16.355 sys 0m0.421s 00:10:16.355 08:23:28 blockdev_general.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:16.355 08:23:28 blockdev_general.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:10:16.355 ************************************ 00:10:16.355 END TEST bdev_verify 00:10:16.355 ************************************ 00:10:16.355 08:23:28 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:16.355 08:23:28 blockdev_general -- bdev/blockdev.sh@777 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:10:16.355 08:23:28 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:10:16.355 08:23:28 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:16.355 08:23:28 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:16.355 ************************************ 00:10:16.355 START TEST bdev_verify_big_io 00:10:16.355 ************************************ 00:10:16.355 08:23:28 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:10:16.355 [2024-07-23 08:23:28.702321] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:10:16.355 [2024-07-23 08:23:28.702417] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1382325 ] 00:10:16.355 [2024-07-23 08:23:28.825140] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:16.613 [2024-07-23 08:23:29.036117] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:16.613 [2024-07-23 08:23:29.036128] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:17.181 [2024-07-23 08:23:29.548065] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:17.181 [2024-07-23 08:23:29.548117] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:10:17.181 [2024-07-23 08:23:29.548136] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:10:17.181 [2024-07-23 08:23:29.556074] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:17.181 [2024-07-23 08:23:29.556111] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:17.181 [2024-07-23 08:23:29.564091] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:17.181 [2024-07-23 08:23:29.564125] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:17.439 [2024-07-23 08:23:29.768270] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:17.439 [2024-07-23 08:23:29.768321] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:17.439 [2024-07-23 08:23:29.768338] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000037b80 00:10:17.439 [2024-07-23 08:23:29.768347] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev 
claimed 00:10:17.439 [2024-07-23 08:23:29.770318] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:17.439 [2024-07-23 08:23:29.770348] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:10:17.697 [2024-07-23 08:23:30.192140] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:10:17.697 [2024-07-23 08:23:30.196004] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:10:17.697 [2024-07-23 08:23:30.200289] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32 00:10:17.697 [2024-07-23 08:23:30.204332] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32 00:10:17.697 [2024-07-23 08:23:30.208697] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:10:17.697 [2024-07-23 08:23:30.212420] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). 
Queue depth is limited to 32 00:10:17.955 [2024-07-23 08:23:30.216520] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:10:17.955 [2024-07-23 08:23:30.220673] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:10:17.955 [2024-07-23 08:23:30.224117] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:10:17.955 [2024-07-23 08:23:30.228042] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:10:17.955 [2024-07-23 08:23:30.231990] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:10:17.955 [2024-07-23 08:23:30.236317] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:10:17.955 [2024-07-23 08:23:30.240204] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). 
Queue depth is limited to 32 00:10:17.955 [2024-07-23 08:23:30.244549] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32 00:10:17.955 [2024-07-23 08:23:30.248485] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32 00:10:17.955 [2024-07-23 08:23:30.252790] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32 00:10:17.955 [2024-07-23 08:23:30.351128] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:10:17.955 [2024-07-23 08:23:30.359008] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:10:17.955 Running I/O for 5 seconds... 
00:10:24.513 00:10:24.513 Latency(us) 00:10:24.514 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:24.514 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:10:24.514 Verification LBA range: start 0x0 length 0x100 00:10:24.514 Malloc0 : 5.71 246.44 15.40 0.00 0.00 511730.72 639.76 1629786.70 00:10:24.514 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:10:24.514 Verification LBA range: start 0x100 length 0x100 00:10:24.514 Malloc0 : 5.73 245.83 15.36 0.00 0.00 512871.21 604.65 1837504.61 00:10:24.514 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:10:24.514 Verification LBA range: start 0x0 length 0x80 00:10:24.514 Malloc1p0 : 5.97 109.85 6.87 0.00 0.00 1096076.43 2793.08 1877450.36 00:10:24.514 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:10:24.514 Verification LBA range: start 0x80 length 0x80 00:10:24.514 Malloc1p0 : 6.11 75.92 4.75 0.00 0.00 1573943.39 2293.76 2444680.05 00:10:24.514 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:10:24.514 Verification LBA range: start 0x0 length 0x80 00:10:24.514 Malloc1p1 : 6.22 48.87 3.05 0.00 0.00 2377643.99 1895.86 3786857.33 00:10:24.514 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:10:24.514 Verification LBA range: start 0x80 length 0x80 00:10:24.514 Malloc1p1 : 6.24 51.29 3.21 0.00 0.00 2263834.83 1872.46 3483269.61 00:10:24.514 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:10:24.514 Verification LBA range: start 0x0 length 0x20 00:10:24.514 Malloc2p0 : 5.89 35.33 2.21 0.00 0.00 819657.94 643.66 1342177.28 00:10:24.514 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:24.514 Verification LBA range: start 0x20 length 0x20 00:10:24.514 Malloc2p0 : 5.89 38.05 2.38 0.00 0.00 761624.08 635.86 1174405.12 00:10:24.514 Job: Malloc2p1 (Core Mask 0x1, 
workload: verify, depth: 32, IO size: 65536) 00:10:24.514 Verification LBA range: start 0x0 length 0x20 00:10:24.514 Malloc2p1 : 5.89 35.31 2.21 0.00 0.00 813404.69 659.26 1310220.68 00:10:24.514 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:24.514 Verification LBA range: start 0x20 length 0x20 00:10:24.514 Malloc2p1 : 5.89 38.03 2.38 0.00 0.00 755995.64 651.46 1150437.67 00:10:24.514 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:10:24.514 Verification LBA range: start 0x0 length 0x20 00:10:24.514 Malloc2p2 : 5.97 37.50 2.34 0.00 0.00 767734.33 643.66 1278264.08 00:10:24.514 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:24.514 Verification LBA range: start 0x20 length 0x20 00:10:24.514 Malloc2p2 : 5.89 38.01 2.38 0.00 0.00 750104.64 620.25 1126470.22 00:10:24.514 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:10:24.514 Verification LBA range: start 0x0 length 0x20 00:10:24.514 Malloc2p3 : 5.98 37.48 2.34 0.00 0.00 762747.20 550.03 1262285.78 00:10:24.514 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:24.514 Verification LBA range: start 0x20 length 0x20 00:10:24.514 Malloc2p3 : 5.89 38.00 2.38 0.00 0.00 745064.54 542.23 1110491.92 00:10:24.514 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:10:24.514 Verification LBA range: start 0x0 length 0x20 00:10:24.514 Malloc2p4 : 5.98 37.46 2.34 0.00 0.00 758000.07 546.13 1246307.47 00:10:24.514 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:24.514 Verification LBA range: start 0x20 length 0x20 00:10:24.514 Malloc2p4 : 5.97 40.17 2.51 0.00 0.00 707049.63 534.43 1086524.46 00:10:24.514 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:10:24.514 Verification LBA range: start 0x0 length 0x20 00:10:24.514 Malloc2p5 : 5.98 37.44 2.34 0.00 0.00 753305.24 592.94 
1230329.17 00:10:24.514 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:24.514 Verification LBA range: start 0x20 length 0x20 00:10:24.514 Malloc2p5 : 5.98 40.15 2.51 0.00 0.00 702727.10 577.34 1070546.16 00:10:24.514 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:10:24.514 Verification LBA range: start 0x0 length 0x20 00:10:24.514 Malloc2p6 : 5.99 37.42 2.34 0.00 0.00 748900.96 561.74 1214350.87 00:10:24.514 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:24.514 Verification LBA range: start 0x20 length 0x20 00:10:24.514 Malloc2p6 : 5.98 40.13 2.51 0.00 0.00 698425.80 546.13 1054567.86 00:10:24.514 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:10:24.514 Verification LBA range: start 0x0 length 0x20 00:10:24.514 Malloc2p7 : 5.99 37.41 2.34 0.00 0.00 744039.07 526.63 1198372.57 00:10:24.514 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:24.514 Verification LBA range: start 0x20 length 0x20 00:10:24.514 Malloc2p7 : 5.98 40.11 2.51 0.00 0.00 693941.65 534.43 1038589.56 00:10:24.514 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:10:24.514 Verification LBA range: start 0x0 length 0x100 00:10:24.514 TestPT : 6.26 48.88 3.06 0.00 0.00 2211420.02 74398.96 3259573.39 00:10:24.514 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:10:24.514 Verification LBA range: start 0x100 length 0x100 00:10:24.514 TestPT : 6.28 48.39 3.02 0.00 0.00 2231701.19 73899.64 3115768.69 00:10:24.514 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:10:24.514 Verification LBA range: start 0x0 length 0x200 00:10:24.514 raid0 : 6.29 53.42 3.34 0.00 0.00 1982190.57 1295.12 3387399.80 00:10:24.514 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:10:24.514 Verification LBA range: start 0x200 length 0x200 00:10:24.514 raid0 : 6.22 
56.62 3.54 0.00 0.00 1872884.37 1271.71 3083812.08 00:10:24.514 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:10:24.514 Verification LBA range: start 0x0 length 0x200 00:10:24.514 concat0 : 6.22 59.12 3.70 0.00 0.00 1764517.53 1240.50 3275551.70 00:10:24.514 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:10:24.514 Verification LBA range: start 0x200 length 0x200 00:10:24.514 concat0 : 6.24 64.09 4.01 0.00 0.00 1628953.81 1240.50 2971963.98 00:10:24.514 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:10:24.514 Verification LBA range: start 0x0 length 0x100 00:10:24.514 raid1 : 6.29 66.12 4.13 0.00 0.00 1550474.70 1646.20 3147725.29 00:10:24.514 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:10:24.514 Verification LBA range: start 0x100 length 0x100 00:10:24.514 raid1 : 6.28 66.67 4.17 0.00 0.00 1532490.80 1630.60 2844137.57 00:10:24.514 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 78, IO size: 65536) 00:10:24.514 Verification LBA range: start 0x0 length 0x4e 00:10:24.514 AIO0 : 6.37 92.03 5.75 0.00 0.00 669572.40 639.76 1853482.91 00:10:24.514 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 78, IO size: 65536) 00:10:24.514 Verification LBA range: start 0x4e length 0x4e 00:10:24.514 AIO0 : 6.36 92.09 5.76 0.00 0.00 667824.27 628.05 1613808.40 00:10:24.514 =================================================================================================================== 00:10:24.514 Total : 2033.64 127.10 0.00 0.00 1069001.07 526.63 3786857.33 00:10:27.048 00:10:27.048 real 0m10.897s 00:10:27.048 user 0m20.184s 00:10:27.048 sys 0m0.483s 00:10:27.048 08:23:39 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:27.048 08:23:39 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:10:27.048 ************************************ 00:10:27.048 END TEST bdev_verify_big_io 
00:10:27.048 ************************************ 00:10:27.048 08:23:39 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:27.048 08:23:39 blockdev_general -- bdev/blockdev.sh@778 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:10:27.048 08:23:39 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:10:27.048 08:23:39 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:27.048 08:23:39 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:27.306 ************************************ 00:10:27.306 START TEST bdev_write_zeroes 00:10:27.306 ************************************ 00:10:27.306 08:23:39 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:10:27.307 [2024-07-23 08:23:39.665001] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:10:27.307 [2024-07-23 08:23:39.665098] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1384347 ] 00:10:27.307 [2024-07-23 08:23:39.787757] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:27.565 [2024-07-23 08:23:39.998200] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:28.132 [2024-07-23 08:23:40.464880] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:28.132 [2024-07-23 08:23:40.464938] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:10:28.132 [2024-07-23 08:23:40.464953] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:10:28.132 [2024-07-23 08:23:40.472864] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:28.132 [2024-07-23 08:23:40.472900] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:28.132 [2024-07-23 08:23:40.480884] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:28.132 [2024-07-23 08:23:40.480918] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:28.391 [2024-07-23 08:23:40.675809] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:28.391 [2024-07-23 08:23:40.675858] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:28.391 [2024-07-23 08:23:40.675874] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000037880 00:10:28.391 [2024-07-23 08:23:40.675884] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:28.391 [2024-07-23 08:23:40.677773] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: 
pt_bdev registered 00:10:28.391 [2024-07-23 08:23:40.677800] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:10:28.649 Running I/O for 1 seconds... 00:10:30.022 00:10:30.022 Latency(us) 00:10:30.022 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:30.022 Job: Malloc0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:30.022 Malloc0 : 1.02 6754.53 26.38 0.00 0.00 18941.97 483.72 30957.96 00:10:30.022 Job: Malloc1p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:30.022 Malloc1p0 : 1.02 6748.04 26.36 0.00 0.00 18932.20 678.77 30333.81 00:10:30.022 Job: Malloc1p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:30.022 Malloc1p1 : 1.03 6741.64 26.33 0.00 0.00 18920.24 655.36 29709.65 00:10:30.022 Job: Malloc2p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:30.022 Malloc2p0 : 1.03 6735.26 26.31 0.00 0.00 18905.91 663.16 29210.33 00:10:30.022 Job: Malloc2p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:30.022 Malloc2p1 : 1.03 6728.90 26.28 0.00 0.00 18892.79 667.06 28461.35 00:10:30.022 Job: Malloc2p2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:30.022 Malloc2p2 : 1.03 6722.55 26.26 0.00 0.00 18884.90 655.36 27837.20 00:10:30.022 Job: Malloc2p3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:30.022 Malloc2p3 : 1.03 6716.22 26.24 0.00 0.00 18864.30 663.16 27213.04 00:10:30.022 Job: Malloc2p4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:30.022 Malloc2p4 : 1.03 6709.91 26.21 0.00 0.00 18855.12 659.26 26713.72 00:10:30.022 Job: Malloc2p5 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:30.022 Malloc2p5 : 1.04 6757.83 26.40 0.00 0.00 18690.06 655.36 25964.74 00:10:30.022 Job: Malloc2p6 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:30.022 Malloc2p6 : 1.04 
6751.50 26.37 0.00 0.00 18677.78 698.27 25215.76 00:10:30.022 Job: Malloc2p7 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:30.023 Malloc2p7 : 1.04 6745.23 26.35 0.00 0.00 18664.49 690.47 24591.60 00:10:30.023 Job: TestPT (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:30.023 TestPT : 1.04 6738.96 26.32 0.00 0.00 18653.84 678.77 23842.62 00:10:30.023 Job: raid0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:30.023 raid0 : 1.05 6731.70 26.30 0.00 0.00 18636.30 1185.89 23592.96 00:10:30.023 Job: concat0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:30.023 concat0 : 1.05 6724.51 26.27 0.00 0.00 18602.25 1185.89 23592.96 00:10:30.023 Job: raid1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:30.023 raid1 : 1.05 6715.38 26.23 0.00 0.00 18570.43 1895.86 23468.13 00:10:30.023 Job: AIO0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:30.023 AIO0 : 1.05 6710.05 26.21 0.00 0.00 18522.86 702.17 22968.81 00:10:30.023 =================================================================================================================== 00:10:30.023 Total : 107732.20 420.83 0.00 0.00 18762.22 483.72 30957.96 00:10:32.602 00:10:32.602 real 0m4.911s 00:10:32.602 user 0m4.413s 00:10:32.602 sys 0m0.385s 00:10:32.602 08:23:44 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:32.602 08:23:44 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:10:32.602 ************************************ 00:10:32.602 END TEST bdev_write_zeroes 00:10:32.602 ************************************ 00:10:32.602 08:23:44 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:32.602 08:23:44 blockdev_general -- bdev/blockdev.sh@781 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:10:32.602 08:23:44 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:10:32.602 08:23:44 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:32.602 08:23:44 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:32.602 ************************************ 00:10:32.602 START TEST bdev_json_nonenclosed 00:10:32.602 ************************************ 00:10:32.602 08:23:44 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:10:32.602 [2024-07-23 08:23:44.629039] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:10:32.603 [2024-07-23 08:23:44.629133] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1385350 ] 00:10:32.603 [2024-07-23 08:23:44.750722] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:32.603 [2024-07-23 08:23:44.951623] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:32.603 [2024-07-23 08:23:44.951717] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
00:10:32.603 [2024-07-23 08:23:44.951734] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:10:32.603 [2024-07-23 08:23:44.951745] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:10:32.861 00:10:32.861 real 0m0.816s 00:10:32.861 user 0m0.644s 00:10:32.861 sys 0m0.168s 00:10:32.861 08:23:45 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:10:32.861 08:23:45 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:32.861 08:23:45 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:10:32.861 ************************************ 00:10:32.861 END TEST bdev_json_nonenclosed 00:10:32.861 ************************************ 00:10:33.119 08:23:45 blockdev_general -- common/autotest_common.sh@1142 -- # return 234 00:10:33.119 08:23:45 blockdev_general -- bdev/blockdev.sh@781 -- # true 00:10:33.119 08:23:45 blockdev_general -- bdev/blockdev.sh@784 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:10:33.119 08:23:45 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:10:33.119 08:23:45 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:33.119 08:23:45 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:33.119 ************************************ 00:10:33.119 START TEST bdev_json_nonarray 00:10:33.119 ************************************ 00:10:33.119 08:23:45 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:10:33.119 [2024-07-23 08:23:45.516888] Starting 
SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:10:33.119 [2024-07-23 08:23:45.516970] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1385402 ] 00:10:33.378 [2024-07-23 08:23:45.648825] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:33.378 [2024-07-23 08:23:45.868269] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:33.378 [2024-07-23 08:23:45.868363] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:10:33.378 [2024-07-23 08:23:45.868380] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:10:33.378 [2024-07-23 08:23:45.868390] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:10:33.945 00:10:33.945 real 0m0.844s 00:10:33.945 user 0m0.664s 00:10:33.945 sys 0m0.176s 00:10:33.945 08:23:46 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:10:33.945 08:23:46 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:33.945 08:23:46 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:10:33.945 ************************************ 00:10:33.945 END TEST bdev_json_nonarray 00:10:33.945 ************************************ 00:10:33.945 08:23:46 blockdev_general -- common/autotest_common.sh@1142 -- # return 234 00:10:33.945 08:23:46 blockdev_general -- bdev/blockdev.sh@784 -- # true 00:10:33.945 08:23:46 blockdev_general -- bdev/blockdev.sh@786 -- # [[ bdev == bdev ]] 00:10:33.945 08:23:46 blockdev_general -- bdev/blockdev.sh@787 -- # run_test bdev_qos qos_test_suite '' 00:10:33.945 08:23:46 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:33.945 08:23:46 
blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:33.945 08:23:46 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:33.945 ************************************ 00:10:33.945 START TEST bdev_qos 00:10:33.945 ************************************ 00:10:33.945 08:23:46 blockdev_general.bdev_qos -- common/autotest_common.sh@1123 -- # qos_test_suite '' 00:10:33.945 08:23:46 blockdev_general.bdev_qos -- bdev/blockdev.sh@445 -- # QOS_PID=1385663 00:10:33.945 08:23:46 blockdev_general.bdev_qos -- bdev/blockdev.sh@446 -- # echo 'Process qos testing pid: 1385663' 00:10:33.945 Process qos testing pid: 1385663 00:10:33.945 08:23:46 blockdev_general.bdev_qos -- bdev/blockdev.sh@444 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 256 -o 4096 -w randread -t 60 '' 00:10:33.945 08:23:46 blockdev_general.bdev_qos -- bdev/blockdev.sh@447 -- # trap 'cleanup; killprocess $QOS_PID; exit 1' SIGINT SIGTERM EXIT 00:10:33.945 08:23:46 blockdev_general.bdev_qos -- bdev/blockdev.sh@448 -- # waitforlisten 1385663 00:10:33.945 08:23:46 blockdev_general.bdev_qos -- common/autotest_common.sh@829 -- # '[' -z 1385663 ']' 00:10:33.945 08:23:46 blockdev_general.bdev_qos -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:33.945 08:23:46 blockdev_general.bdev_qos -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:33.945 08:23:46 blockdev_general.bdev_qos -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:33.945 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:10:33.945 08:23:46 blockdev_general.bdev_qos -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:33.945 08:23:46 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:33.945 [2024-07-23 08:23:46.416994] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:10:33.945 [2024-07-23 08:23:46.417084] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1385663 ] 00:10:34.204 [2024-07-23 08:23:46.541653] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:34.463 [2024-07-23 08:23:46.755632] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:34.722 08:23:47 blockdev_general.bdev_qos -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:34.722 08:23:47 blockdev_general.bdev_qos -- common/autotest_common.sh@862 -- # return 0 00:10:34.722 08:23:47 blockdev_general.bdev_qos -- bdev/blockdev.sh@450 -- # rpc_cmd bdev_malloc_create -b Malloc_0 128 512 00:10:34.722 08:23:47 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:34.722 08:23:47 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:34.982 Malloc_0 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- bdev/blockdev.sh@451 -- # waitforbdev Malloc_0 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_0 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:34.982 08:23:47 
blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_0 -t 2000 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:34.982 [ 00:10:34.982 { 00:10:34.982 "name": "Malloc_0", 00:10:34.982 "aliases": [ 00:10:34.982 "39ed31bd-4d95-4b49-a39e-48ae88dd8cfe" 00:10:34.982 ], 00:10:34.982 "product_name": "Malloc disk", 00:10:34.982 "block_size": 512, 00:10:34.982 "num_blocks": 262144, 00:10:34.982 "uuid": "39ed31bd-4d95-4b49-a39e-48ae88dd8cfe", 00:10:34.982 "assigned_rate_limits": { 00:10:34.982 "rw_ios_per_sec": 0, 00:10:34.982 "rw_mbytes_per_sec": 0, 00:10:34.982 "r_mbytes_per_sec": 0, 00:10:34.982 "w_mbytes_per_sec": 0 00:10:34.982 }, 00:10:34.982 "claimed": false, 00:10:34.982 "zoned": false, 00:10:34.982 "supported_io_types": { 00:10:34.982 "read": true, 00:10:34.982 "write": true, 00:10:34.982 "unmap": true, 00:10:34.982 "flush": true, 00:10:34.982 "reset": true, 00:10:34.982 "nvme_admin": false, 00:10:34.982 "nvme_io": false, 00:10:34.982 "nvme_io_md": false, 00:10:34.982 "write_zeroes": true, 00:10:34.982 "zcopy": true, 00:10:34.982 "get_zone_info": false, 00:10:34.982 "zone_management": false, 00:10:34.982 "zone_append": false, 00:10:34.982 "compare": false, 00:10:34.982 "compare_and_write": false, 00:10:34.982 "abort": true, 00:10:34.982 "seek_hole": false, 00:10:34.982 
"seek_data": false, 00:10:34.982 "copy": true, 00:10:34.982 "nvme_iov_md": false 00:10:34.982 }, 00:10:34.982 "memory_domains": [ 00:10:34.982 { 00:10:34.982 "dma_device_id": "system", 00:10:34.982 "dma_device_type": 1 00:10:34.982 }, 00:10:34.982 { 00:10:34.982 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:34.982 "dma_device_type": 2 00:10:34.982 } 00:10:34.982 ], 00:10:34.982 "driver_specific": {} 00:10:34.982 } 00:10:34.982 ] 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- bdev/blockdev.sh@452 -- # rpc_cmd bdev_null_create Null_1 128 512 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:34.982 Null_1 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- bdev/blockdev.sh@453 -- # waitforbdev Null_1 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Null_1 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:34.982 08:23:47 
blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Null_1 -t 2000 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:34.982 [ 00:10:34.982 { 00:10:34.982 "name": "Null_1", 00:10:34.982 "aliases": [ 00:10:34.982 "f07fc49a-5de8-4c97-9f3d-109f6dcf9303" 00:10:34.982 ], 00:10:34.982 "product_name": "Null disk", 00:10:34.982 "block_size": 512, 00:10:34.982 "num_blocks": 262144, 00:10:34.982 "uuid": "f07fc49a-5de8-4c97-9f3d-109f6dcf9303", 00:10:34.982 "assigned_rate_limits": { 00:10:34.982 "rw_ios_per_sec": 0, 00:10:34.982 "rw_mbytes_per_sec": 0, 00:10:34.982 "r_mbytes_per_sec": 0, 00:10:34.982 "w_mbytes_per_sec": 0 00:10:34.982 }, 00:10:34.982 "claimed": false, 00:10:34.982 "zoned": false, 00:10:34.982 "supported_io_types": { 00:10:34.982 "read": true, 00:10:34.982 "write": true, 00:10:34.982 "unmap": false, 00:10:34.982 "flush": false, 00:10:34.982 "reset": true, 00:10:34.982 "nvme_admin": false, 00:10:34.982 "nvme_io": false, 00:10:34.982 "nvme_io_md": false, 00:10:34.982 "write_zeroes": true, 00:10:34.982 "zcopy": false, 00:10:34.982 "get_zone_info": false, 00:10:34.982 "zone_management": false, 00:10:34.982 "zone_append": false, 00:10:34.982 "compare": false, 00:10:34.982 "compare_and_write": false, 00:10:34.982 "abort": true, 00:10:34.982 "seek_hole": false, 00:10:34.982 "seek_data": false, 00:10:34.982 "copy": false, 00:10:34.982 "nvme_iov_md": false 00:10:34.982 }, 00:10:34.982 "driver_specific": {} 00:10:34.982 } 00:10:34.982 ] 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- 
bdev/blockdev.sh@456 -- # qos_function_test 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- bdev/blockdev.sh@455 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- bdev/blockdev.sh@409 -- # local qos_lower_iops_limit=1000 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- bdev/blockdev.sh@410 -- # local qos_lower_bw_limit=2 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- bdev/blockdev.sh@411 -- # local io_result=0 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- bdev/blockdev.sh@412 -- # local iops_limit=0 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- bdev/blockdev.sh@413 -- # local bw_limit=0 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- bdev/blockdev.sh@415 -- # get_io_result IOPS Malloc_0 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- bdev/blockdev.sh@374 -- # local limit_type=IOPS 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local qos_dev=Malloc_0 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local iostat_result 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # grep Malloc_0 00:10:34.982 08:23:47 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # tail -1 00:10:34.982 Running I/O for 60 seconds... 
00:10:40.249 08:23:52 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # iostat_result='Malloc_0 81183.10 324732.38 0.00 0.00 327680.00 0.00 0.00 ' 00:10:40.249 08:23:52 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # '[' IOPS = IOPS ']' 00:10:40.249 08:23:52 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # awk '{print $2}' 00:10:40.249 08:23:52 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # iostat_result=81183.10 00:10:40.249 08:23:52 blockdev_general.bdev_qos -- bdev/blockdev.sh@384 -- # echo 81183 00:10:40.249 08:23:52 blockdev_general.bdev_qos -- bdev/blockdev.sh@415 -- # io_result=81183 00:10:40.249 08:23:52 blockdev_general.bdev_qos -- bdev/blockdev.sh@417 -- # iops_limit=20000 00:10:40.249 08:23:52 blockdev_general.bdev_qos -- bdev/blockdev.sh@418 -- # '[' 20000 -gt 1000 ']' 00:10:40.249 08:23:52 blockdev_general.bdev_qos -- bdev/blockdev.sh@421 -- # rpc_cmd bdev_set_qos_limit --rw_ios_per_sec 20000 Malloc_0 00:10:40.249 08:23:52 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:40.249 08:23:52 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:40.249 08:23:52 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:40.249 08:23:52 blockdev_general.bdev_qos -- bdev/blockdev.sh@422 -- # run_test bdev_qos_iops run_qos_test 20000 IOPS Malloc_0 00:10:40.249 08:23:52 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:40.249 08:23:52 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:40.249 08:23:52 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:40.249 ************************************ 00:10:40.249 START TEST bdev_qos_iops 00:10:40.249 ************************************ 00:10:40.249 08:23:52 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1123 -- # run_qos_test 20000 IOPS Malloc_0 00:10:40.249 08:23:52 
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@388 -- # local qos_limit=20000 00:10:40.249 08:23:52 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@389 -- # local qos_result=0 00:10:40.249 08:23:52 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@391 -- # get_io_result IOPS Malloc_0 00:10:40.250 08:23:52 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@374 -- # local limit_type=IOPS 00:10:40.250 08:23:52 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@375 -- # local qos_dev=Malloc_0 00:10:40.250 08:23:52 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@376 -- # local iostat_result 00:10:40.250 08:23:52 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:40.250 08:23:52 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # grep Malloc_0 00:10:40.250 08:23:52 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # tail -1 00:10:45.528 08:23:57 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # iostat_result='Malloc_0 20017.24 80068.94 0.00 0.00 80800.00 0.00 0.00 ' 00:10:45.528 08:23:57 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # '[' IOPS = IOPS ']' 00:10:45.528 08:23:57 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # awk '{print $2}' 00:10:45.528 08:23:57 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # iostat_result=20017.24 00:10:45.528 08:23:57 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@384 -- # echo 20017 00:10:45.528 08:23:57 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@391 -- # qos_result=20017 00:10:45.528 08:23:57 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # '[' IOPS = BANDWIDTH ']' 00:10:45.528 08:23:57 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@395 -- # lower_limit=18000 00:10:45.528 08:23:57 
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@396 -- # upper_limit=22000 00:10:45.528 08:23:57 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@399 -- # '[' 20017 -lt 18000 ']' 00:10:45.528 08:23:57 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@399 -- # '[' 20017 -gt 22000 ']' 00:10:45.528 00:10:45.528 real 0m5.176s 00:10:45.528 user 0m0.100s 00:10:45.528 sys 0m0.025s 00:10:45.528 08:23:57 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:45.528 08:23:57 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@10 -- # set +x 00:10:45.528 ************************************ 00:10:45.528 END TEST bdev_qos_iops 00:10:45.528 ************************************ 00:10:45.528 08:23:57 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:10:45.528 08:23:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@426 -- # get_io_result BANDWIDTH Null_1 00:10:45.528 08:23:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@374 -- # local limit_type=BANDWIDTH 00:10:45.528 08:23:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local qos_dev=Null_1 00:10:45.528 08:23:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local iostat_result 00:10:45.528 08:23:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:45.528 08:23:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # grep Null_1 00:10:45.528 08:23:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # tail -1 00:10:50.799 08:24:03 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # iostat_result='Null_1 34553.01 138212.02 0.00 0.00 139264.00 0.00 0.00 ' 00:10:50.799 08:24:03 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # '[' BANDWIDTH = IOPS ']' 00:10:50.799 08:24:03 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:50.799 08:24:03 
blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # awk '{print $6}' 00:10:50.799 08:24:03 blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # iostat_result=139264.00 00:10:50.799 08:24:03 blockdev_general.bdev_qos -- bdev/blockdev.sh@384 -- # echo 139264 00:10:50.799 08:24:03 blockdev_general.bdev_qos -- bdev/blockdev.sh@426 -- # bw_limit=139264 00:10:50.799 08:24:03 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # bw_limit=13 00:10:50.799 08:24:03 blockdev_general.bdev_qos -- bdev/blockdev.sh@428 -- # '[' 13 -lt 2 ']' 00:10:50.799 08:24:03 blockdev_general.bdev_qos -- bdev/blockdev.sh@431 -- # rpc_cmd bdev_set_qos_limit --rw_mbytes_per_sec 13 Null_1 00:10:50.799 08:24:03 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:50.799 08:24:03 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:50.799 08:24:03 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:50.799 08:24:03 blockdev_general.bdev_qos -- bdev/blockdev.sh@432 -- # run_test bdev_qos_bw run_qos_test 13 BANDWIDTH Null_1 00:10:50.799 08:24:03 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:50.799 08:24:03 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:50.799 08:24:03 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:50.799 ************************************ 00:10:50.799 START TEST bdev_qos_bw 00:10:50.799 ************************************ 00:10:50.799 08:24:03 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1123 -- # run_qos_test 13 BANDWIDTH Null_1 00:10:50.799 08:24:03 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@388 -- # local qos_limit=13 00:10:50.799 08:24:03 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@389 -- # local qos_result=0 00:10:50.799 08:24:03 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@391 -- # get_io_result BANDWIDTH Null_1 
00:10:50.799 08:24:03 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@374 -- # local limit_type=BANDWIDTH 00:10:50.799 08:24:03 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@375 -- # local qos_dev=Null_1 00:10:50.799 08:24:03 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@376 -- # local iostat_result 00:10:50.799 08:24:03 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:50.799 08:24:03 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # grep Null_1 00:10:50.799 08:24:03 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # tail -1 00:10:56.078 08:24:08 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # iostat_result='Null_1 3326.60 13306.42 0.00 0.00 13548.00 0.00 0.00 ' 00:10:56.078 08:24:08 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # '[' BANDWIDTH = IOPS ']' 00:10:56.078 08:24:08 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@380 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:56.078 08:24:08 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # awk '{print $6}' 00:10:56.078 08:24:08 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # iostat_result=13548.00 00:10:56.078 08:24:08 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@384 -- # echo 13548 00:10:56.078 08:24:08 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@391 -- # qos_result=13548 00:10:56.078 08:24:08 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:56.078 08:24:08 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@393 -- # qos_limit=13312 00:10:56.078 08:24:08 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@395 -- # lower_limit=11980 00:10:56.078 08:24:08 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@396 -- # upper_limit=14643 00:10:56.078 08:24:08 
blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@399 -- # '[' 13548 -lt 11980 ']' 00:10:56.078 08:24:08 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@399 -- # '[' 13548 -gt 14643 ']' 00:10:56.078 00:10:56.078 real 0m5.183s 00:10:56.078 user 0m0.070s 00:10:56.078 sys 0m0.028s 00:10:56.078 08:24:08 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:56.078 08:24:08 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@10 -- # set +x 00:10:56.078 ************************************ 00:10:56.078 END TEST bdev_qos_bw 00:10:56.078 ************************************ 00:10:56.078 08:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:10:56.078 08:24:08 blockdev_general.bdev_qos -- bdev/blockdev.sh@435 -- # rpc_cmd bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0 00:10:56.078 08:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:56.078 08:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:56.078 08:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:56.078 08:24:08 blockdev_general.bdev_qos -- bdev/blockdev.sh@436 -- # run_test bdev_qos_ro_bw run_qos_test 2 BANDWIDTH Malloc_0 00:10:56.078 08:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:56.078 08:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:56.078 08:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:56.078 ************************************ 00:10:56.078 START TEST bdev_qos_ro_bw 00:10:56.078 ************************************ 00:10:56.078 08:24:08 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1123 -- # run_qos_test 2 BANDWIDTH Malloc_0 00:10:56.078 08:24:08 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@388 -- # local qos_limit=2 
00:10:56.078 08:24:08 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@389 -- # local qos_result=0 00:10:56.078 08:24:08 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@391 -- # get_io_result BANDWIDTH Malloc_0 00:10:56.078 08:24:08 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@374 -- # local limit_type=BANDWIDTH 00:10:56.078 08:24:08 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@375 -- # local qos_dev=Malloc_0 00:10:56.078 08:24:08 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@376 -- # local iostat_result 00:10:56.078 08:24:08 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # grep Malloc_0 00:10:56.078 08:24:08 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # tail -1 00:10:56.078 08:24:08 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:11:01.356 08:24:13 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # iostat_result='Malloc_0 511.94 2047.77 0.00 0.00 2060.00 0.00 0.00 ' 00:11:01.356 08:24:13 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # '[' BANDWIDTH = IOPS ']' 00:11:01.356 08:24:13 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@380 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:11:01.356 08:24:13 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # awk '{print $6}' 00:11:01.356 08:24:13 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # iostat_result=2060.00 00:11:01.356 08:24:13 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@384 -- # echo 2060 00:11:01.356 08:24:13 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@391 -- # qos_result=2060 00:11:01.356 08:24:13 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:11:01.356 08:24:13 blockdev_general.bdev_qos.bdev_qos_ro_bw -- 
bdev/blockdev.sh@393 -- # qos_limit=2048 00:11:01.357 08:24:13 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@395 -- # lower_limit=1843 00:11:01.357 08:24:13 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@396 -- # upper_limit=2252 00:11:01.357 08:24:13 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@399 -- # '[' 2060 -lt 1843 ']' 00:11:01.357 08:24:13 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@399 -- # '[' 2060 -gt 2252 ']' 00:11:01.357 00:11:01.357 real 0m5.157s 00:11:01.357 user 0m0.085s 00:11:01.357 sys 0m0.036s 00:11:01.357 08:24:13 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:01.357 08:24:13 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@10 -- # set +x 00:11:01.357 ************************************ 00:11:01.357 END TEST bdev_qos_ro_bw 00:11:01.357 ************************************ 00:11:01.357 08:24:13 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:11:01.357 08:24:13 blockdev_general.bdev_qos -- bdev/blockdev.sh@458 -- # rpc_cmd bdev_malloc_delete Malloc_0 00:11:01.357 08:24:13 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:01.357 08:24:13 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:01.925 08:24:14 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:01.925 08:24:14 blockdev_general.bdev_qos -- bdev/blockdev.sh@459 -- # rpc_cmd bdev_null_delete Null_1 00:11:01.925 08:24:14 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:01.925 08:24:14 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:01.925 00:11:01.925 Latency(us) 00:11:01.925 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:01.925 Job: Malloc_0 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:11:01.925 Malloc_0 : 26.50 27742.46 
108.37 0.00 0.00 9139.26 1669.61 503316.48 00:11:01.925 Job: Null_1 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:11:01.925 Null_1 : 26.73 29954.72 117.01 0.00 0.00 8530.44 585.14 213709.78 00:11:01.925 =================================================================================================================== 00:11:01.925 Total : 57697.18 225.38 0.00 0.00 8821.88 585.14 503316.48 00:11:01.925 0 00:11:01.925 08:24:14 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:01.925 08:24:14 blockdev_general.bdev_qos -- bdev/blockdev.sh@460 -- # killprocess 1385663 00:11:01.925 08:24:14 blockdev_general.bdev_qos -- common/autotest_common.sh@948 -- # '[' -z 1385663 ']' 00:11:01.925 08:24:14 blockdev_general.bdev_qos -- common/autotest_common.sh@952 -- # kill -0 1385663 00:11:01.925 08:24:14 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # uname 00:11:01.925 08:24:14 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:01.925 08:24:14 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1385663 00:11:01.925 08:24:14 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:11:01.925 08:24:14 blockdev_general.bdev_qos -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:11:01.925 08:24:14 blockdev_general.bdev_qos -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1385663' 00:11:01.925 killing process with pid 1385663 00:11:01.925 08:24:14 blockdev_general.bdev_qos -- common/autotest_common.sh@967 -- # kill 1385663 00:11:01.925 Received shutdown signal, test time was about 26.776892 seconds 00:11:01.925 00:11:01.925 Latency(us) 00:11:01.925 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:01.925 =================================================================================================================== 00:11:01.925 Total : 0.00 0.00 
0.00 0.00 0.00 0.00 0.00 00:11:01.925 08:24:14 blockdev_general.bdev_qos -- common/autotest_common.sh@972 -- # wait 1385663 00:11:03.305 08:24:15 blockdev_general.bdev_qos -- bdev/blockdev.sh@461 -- # trap - SIGINT SIGTERM EXIT 00:11:03.305 00:11:03.305 real 0m29.275s 00:11:03.305 user 0m29.730s 00:11:03.305 sys 0m0.707s 00:11:03.305 08:24:15 blockdev_general.bdev_qos -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:03.305 08:24:15 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:03.305 ************************************ 00:11:03.305 END TEST bdev_qos 00:11:03.305 ************************************ 00:11:03.305 08:24:15 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:11:03.305 08:24:15 blockdev_general -- bdev/blockdev.sh@788 -- # run_test bdev_qd_sampling qd_sampling_test_suite '' 00:11:03.305 08:24:15 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:11:03.305 08:24:15 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:03.305 08:24:15 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:03.305 ************************************ 00:11:03.305 START TEST bdev_qd_sampling 00:11:03.305 ************************************ 00:11:03.305 08:24:15 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1123 -- # qd_sampling_test_suite '' 00:11:03.305 08:24:15 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@537 -- # QD_DEV=Malloc_QD 00:11:03.305 08:24:15 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@540 -- # QD_PID=1391266 00:11:03.305 08:24:15 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@541 -- # echo 'Process bdev QD sampling period testing pid: 1391266' 00:11:03.305 Process bdev QD sampling period testing pid: 1391266 00:11:03.305 08:24:15 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 5 
-C '' 00:11:03.305 08:24:15 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@542 -- # trap 'cleanup; killprocess $QD_PID; exit 1' SIGINT SIGTERM EXIT 00:11:03.305 08:24:15 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@543 -- # waitforlisten 1391266 00:11:03.305 08:24:15 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@829 -- # '[' -z 1391266 ']' 00:11:03.305 08:24:15 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:03.305 08:24:15 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:03.305 08:24:15 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:03.305 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:03.305 08:24:15 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:03.305 08:24:15 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:03.305 [2024-07-23 08:24:15.754992] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:11:03.305 [2024-07-23 08:24:15.755099] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1391266 ] 00:11:03.565 [2024-07-23 08:24:15.877130] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:11:03.824 [2024-07-23 08:24:16.091478] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:03.824 [2024-07-23 08:24:16.091488] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:11:04.083 08:24:16 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:04.083 08:24:16 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@862 -- # return 0 00:11:04.083 08:24:16 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@545 -- # rpc_cmd bdev_malloc_create -b Malloc_QD 128 512 00:11:04.083 08:24:16 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:04.083 08:24:16 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:04.342 Malloc_QD 00:11:04.342 08:24:16 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:04.342 08:24:16 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@546 -- # waitforbdev Malloc_QD 00:11:04.342 08:24:16 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_QD 00:11:04.342 08:24:16 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:04.342 08:24:16 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@899 -- # local i 00:11:04.342 08:24:16 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:04.342 08:24:16 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:04.342 08:24:16 
blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:11:04.342 08:24:16 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:04.342 08:24:16 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:04.342 08:24:16 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:04.342 08:24:16 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_QD -t 2000 00:11:04.342 08:24:16 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:04.342 08:24:16 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:04.342 [ 00:11:04.342 { 00:11:04.342 "name": "Malloc_QD", 00:11:04.342 "aliases": [ 00:11:04.342 "c4da0122-9265-4a9f-8a8e-3177f4719053" 00:11:04.342 ], 00:11:04.342 "product_name": "Malloc disk", 00:11:04.342 "block_size": 512, 00:11:04.342 "num_blocks": 262144, 00:11:04.342 "uuid": "c4da0122-9265-4a9f-8a8e-3177f4719053", 00:11:04.342 "assigned_rate_limits": { 00:11:04.342 "rw_ios_per_sec": 0, 00:11:04.342 "rw_mbytes_per_sec": 0, 00:11:04.342 "r_mbytes_per_sec": 0, 00:11:04.342 "w_mbytes_per_sec": 0 00:11:04.342 }, 00:11:04.342 "claimed": false, 00:11:04.342 "zoned": false, 00:11:04.342 "supported_io_types": { 00:11:04.342 "read": true, 00:11:04.342 "write": true, 00:11:04.342 "unmap": true, 00:11:04.342 "flush": true, 00:11:04.342 "reset": true, 00:11:04.342 "nvme_admin": false, 00:11:04.342 "nvme_io": false, 00:11:04.342 "nvme_io_md": false, 00:11:04.342 "write_zeroes": true, 00:11:04.342 "zcopy": true, 00:11:04.342 "get_zone_info": false, 00:11:04.342 "zone_management": false, 00:11:04.342 "zone_append": false, 00:11:04.342 "compare": false, 00:11:04.342 "compare_and_write": false, 00:11:04.342 "abort": true, 00:11:04.342 "seek_hole": false, 00:11:04.342 "seek_data": false, 00:11:04.342 "copy": true, 
00:11:04.342 "nvme_iov_md": false 00:11:04.342 }, 00:11:04.342 "memory_domains": [ 00:11:04.342 { 00:11:04.342 "dma_device_id": "system", 00:11:04.342 "dma_device_type": 1 00:11:04.342 }, 00:11:04.342 { 00:11:04.342 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:04.342 "dma_device_type": 2 00:11:04.342 } 00:11:04.342 ], 00:11:04.342 "driver_specific": {} 00:11:04.342 } 00:11:04.342 ] 00:11:04.342 08:24:16 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:04.342 08:24:16 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@905 -- # return 0 00:11:04.342 08:24:16 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@549 -- # sleep 2 00:11:04.342 08:24:16 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:11:04.343 Running I/O for 5 seconds... 00:11:06.250 08:24:18 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@550 -- # qd_sampling_function_test Malloc_QD 00:11:06.250 08:24:18 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@518 -- # local bdev_name=Malloc_QD 00:11:06.250 08:24:18 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@519 -- # local sampling_period=10 00:11:06.250 08:24:18 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@520 -- # local iostats 00:11:06.250 08:24:18 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@522 -- # rpc_cmd bdev_set_qd_sampling_period Malloc_QD 10 00:11:06.250 08:24:18 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:06.250 08:24:18 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:06.250 08:24:18 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:06.250 08:24:18 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@524 -- # rpc_cmd bdev_get_iostat -b Malloc_QD 00:11:06.250 08:24:18 blockdev_general.bdev_qd_sampling -- 
common/autotest_common.sh@559 -- # xtrace_disable 00:11:06.250 08:24:18 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:06.250 08:24:18 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:06.250 08:24:18 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@524 -- # iostats='{ 00:11:06.250 "tick_rate": 2100000000, 00:11:06.250 "ticks": 3954383442028730, 00:11:06.250 "bdevs": [ 00:11:06.250 { 00:11:06.250 "name": "Malloc_QD", 00:11:06.250 "bytes_read": 933278208, 00:11:06.250 "num_read_ops": 227844, 00:11:06.250 "bytes_written": 0, 00:11:06.250 "num_write_ops": 0, 00:11:06.250 "bytes_unmapped": 0, 00:11:06.250 "num_unmap_ops": 0, 00:11:06.250 "bytes_copied": 0, 00:11:06.250 "num_copy_ops": 0, 00:11:06.250 "read_latency_ticks": 2070769623688, 00:11:06.250 "max_read_latency_ticks": 9651620, 00:11:06.250 "min_read_latency_ticks": 320178, 00:11:06.250 "write_latency_ticks": 0, 00:11:06.250 "max_write_latency_ticks": 0, 00:11:06.250 "min_write_latency_ticks": 0, 00:11:06.250 "unmap_latency_ticks": 0, 00:11:06.250 "max_unmap_latency_ticks": 0, 00:11:06.250 "min_unmap_latency_ticks": 0, 00:11:06.250 "copy_latency_ticks": 0, 00:11:06.250 "max_copy_latency_ticks": 0, 00:11:06.250 "min_copy_latency_ticks": 0, 00:11:06.250 "io_error": {}, 00:11:06.250 "queue_depth_polling_period": 10, 00:11:06.250 "queue_depth": 512, 00:11:06.250 "io_time": 30, 00:11:06.250 "weighted_io_time": 15360 00:11:06.250 } 00:11:06.250 ] 00:11:06.250 }' 00:11:06.250 08:24:18 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@526 -- # jq -r '.bdevs[0].queue_depth_polling_period' 00:11:06.510 08:24:18 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@526 -- # qd_sampling_period=10 00:11:06.510 08:24:18 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@528 -- # '[' 10 == null ']' 00:11:06.510 08:24:18 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@528 -- # '[' 10 -ne 10 ']' 00:11:06.510 08:24:18 
blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@552 -- # rpc_cmd bdev_malloc_delete Malloc_QD 00:11:06.510 08:24:18 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:06.510 08:24:18 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:06.510 00:11:06.510 Latency(us) 00:11:06.510 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:06.510 Job: Malloc_QD (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:11:06.510 Malloc_QD : 1.99 58779.35 229.61 0.00 0.00 4344.40 1178.09 4649.94 00:11:06.510 Job: Malloc_QD (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:11:06.510 Malloc_QD : 2.00 59278.74 231.56 0.00 0.00 4308.39 916.72 4493.90 00:11:06.510 =================================================================================================================== 00:11:06.510 Total : 118058.10 461.16 0.00 0.00 4326.32 916.72 4649.94 00:11:06.510 0 00:11:06.510 08:24:18 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:06.510 08:24:18 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@553 -- # killprocess 1391266 00:11:06.510 08:24:18 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@948 -- # '[' -z 1391266 ']' 00:11:06.510 08:24:18 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@952 -- # kill -0 1391266 00:11:06.510 08:24:18 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # uname 00:11:06.510 08:24:18 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:06.510 08:24:18 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1391266 00:11:06.510 08:24:18 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:06.510 08:24:18 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo 
']' 00:11:06.510 08:24:18 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1391266' 00:11:06.510 killing process with pid 1391266 00:11:06.510 08:24:18 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@967 -- # kill 1391266 00:11:06.510 Received shutdown signal, test time was about 2.151124 seconds 00:11:06.510 00:11:06.510 Latency(us) 00:11:06.510 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:06.510 =================================================================================================================== 00:11:06.510 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:11:06.510 08:24:18 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@972 -- # wait 1391266 00:11:07.892 08:24:20 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@554 -- # trap - SIGINT SIGTERM EXIT 00:11:07.892 00:11:07.892 real 0m4.563s 00:11:07.892 user 0m8.353s 00:11:07.892 sys 0m0.395s 00:11:07.892 08:24:20 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:07.892 08:24:20 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:07.892 ************************************ 00:11:07.892 END TEST bdev_qd_sampling 00:11:07.892 ************************************ 00:11:07.892 08:24:20 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:11:07.892 08:24:20 blockdev_general -- bdev/blockdev.sh@789 -- # run_test bdev_error error_test_suite '' 00:11:07.892 08:24:20 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:11:07.892 08:24:20 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:07.892 08:24:20 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:07.892 ************************************ 00:11:07.892 START TEST bdev_error 00:11:07.892 ************************************ 00:11:07.892 08:24:20 blockdev_general.bdev_error -- 
common/autotest_common.sh@1123 -- # error_test_suite '' 00:11:07.892 08:24:20 blockdev_general.bdev_error -- bdev/blockdev.sh@465 -- # DEV_1=Dev_1 00:11:07.892 08:24:20 blockdev_general.bdev_error -- bdev/blockdev.sh@466 -- # DEV_2=Dev_2 00:11:07.892 08:24:20 blockdev_general.bdev_error -- bdev/blockdev.sh@467 -- # ERR_DEV=EE_Dev_1 00:11:07.892 08:24:20 blockdev_general.bdev_error -- bdev/blockdev.sh@470 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 -f '' 00:11:07.892 08:24:20 blockdev_general.bdev_error -- bdev/blockdev.sh@471 -- # ERR_PID=1392064 00:11:07.892 08:24:20 blockdev_general.bdev_error -- bdev/blockdev.sh@472 -- # echo 'Process error testing pid: 1392064' 00:11:07.892 Process error testing pid: 1392064 00:11:07.892 08:24:20 blockdev_general.bdev_error -- bdev/blockdev.sh@473 -- # waitforlisten 1392064 00:11:07.892 08:24:20 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 1392064 ']' 00:11:07.892 08:24:20 blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:07.892 08:24:20 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:07.892 08:24:20 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:07.892 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:07.892 08:24:20 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:07.892 08:24:20 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:07.892 [2024-07-23 08:24:20.370942] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:11:07.892 [2024-07-23 08:24:20.371023] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1392064 ] 00:11:08.152 [2024-07-23 08:24:20.493154] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:08.411 [2024-07-23 08:24:20.704305] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:11:08.670 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:08.670 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:11:08.670 08:24:21 blockdev_general.bdev_error -- bdev/blockdev.sh@475 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:11:08.670 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:08.670 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:08.929 Dev_1 00:11:08.929 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:08.929 08:24:21 blockdev_general.bdev_error -- bdev/blockdev.sh@476 -- # waitforbdev Dev_1 00:11:08.929 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:11:08.929 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:08.929 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:11:08.929 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:08.929 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:08.929 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:11:08.929 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:08.929 
08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:08.929 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:08.929 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:11:08.929 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:08.929 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:08.929 [ 00:11:08.929 { 00:11:08.929 "name": "Dev_1", 00:11:08.929 "aliases": [ 00:11:08.929 "628708e8-eb45-4c6e-b008-610cca37081a" 00:11:08.929 ], 00:11:08.929 "product_name": "Malloc disk", 00:11:08.929 "block_size": 512, 00:11:08.929 "num_blocks": 262144, 00:11:08.929 "uuid": "628708e8-eb45-4c6e-b008-610cca37081a", 00:11:08.929 "assigned_rate_limits": { 00:11:08.929 "rw_ios_per_sec": 0, 00:11:08.929 "rw_mbytes_per_sec": 0, 00:11:08.929 "r_mbytes_per_sec": 0, 00:11:08.929 "w_mbytes_per_sec": 0 00:11:08.929 }, 00:11:08.929 "claimed": false, 00:11:08.929 "zoned": false, 00:11:08.929 "supported_io_types": { 00:11:08.929 "read": true, 00:11:08.929 "write": true, 00:11:08.929 "unmap": true, 00:11:08.929 "flush": true, 00:11:08.929 "reset": true, 00:11:08.929 "nvme_admin": false, 00:11:08.929 "nvme_io": false, 00:11:08.929 "nvme_io_md": false, 00:11:08.929 "write_zeroes": true, 00:11:08.929 "zcopy": true, 00:11:08.929 "get_zone_info": false, 00:11:08.929 "zone_management": false, 00:11:08.929 "zone_append": false, 00:11:08.929 "compare": false, 00:11:08.929 "compare_and_write": false, 00:11:08.929 "abort": true, 00:11:08.929 "seek_hole": false, 00:11:08.929 "seek_data": false, 00:11:08.929 "copy": true, 00:11:08.929 "nvme_iov_md": false 00:11:08.929 }, 00:11:08.929 "memory_domains": [ 00:11:08.929 { 00:11:08.929 "dma_device_id": "system", 00:11:08.929 "dma_device_type": 1 00:11:08.929 }, 00:11:08.929 { 00:11:08.929 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:11:08.929 "dma_device_type": 2 00:11:08.929 } 00:11:08.929 ], 00:11:08.929 "driver_specific": {} 00:11:08.929 } 00:11:08.929 ] 00:11:08.929 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:08.929 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:11:08.929 08:24:21 blockdev_general.bdev_error -- bdev/blockdev.sh@477 -- # rpc_cmd bdev_error_create Dev_1 00:11:08.929 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:08.929 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:08.929 true 00:11:08.929 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:08.929 08:24:21 blockdev_general.bdev_error -- bdev/blockdev.sh@478 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:11:08.929 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:08.929 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:09.188 Dev_2 00:11:09.188 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:09.188 08:24:21 blockdev_general.bdev_error -- bdev/blockdev.sh@479 -- # waitforbdev Dev_2 00:11:09.188 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:11:09.188 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:09.188 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:11:09.188 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:09.188 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:09.188 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:11:09.188 08:24:21 blockdev_general.bdev_error -- 
common/autotest_common.sh@559 -- # xtrace_disable 00:11:09.188 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:09.188 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:09.188 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:11:09.188 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:09.188 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:09.188 [ 00:11:09.188 { 00:11:09.188 "name": "Dev_2", 00:11:09.188 "aliases": [ 00:11:09.189 "271038fc-fadb-4432-a88c-c2a0f0211973" 00:11:09.189 ], 00:11:09.189 "product_name": "Malloc disk", 00:11:09.189 "block_size": 512, 00:11:09.189 "num_blocks": 262144, 00:11:09.189 "uuid": "271038fc-fadb-4432-a88c-c2a0f0211973", 00:11:09.189 "assigned_rate_limits": { 00:11:09.189 "rw_ios_per_sec": 0, 00:11:09.189 "rw_mbytes_per_sec": 0, 00:11:09.189 "r_mbytes_per_sec": 0, 00:11:09.189 "w_mbytes_per_sec": 0 00:11:09.189 }, 00:11:09.189 "claimed": false, 00:11:09.189 "zoned": false, 00:11:09.189 "supported_io_types": { 00:11:09.189 "read": true, 00:11:09.189 "write": true, 00:11:09.189 "unmap": true, 00:11:09.189 "flush": true, 00:11:09.189 "reset": true, 00:11:09.189 "nvme_admin": false, 00:11:09.189 "nvme_io": false, 00:11:09.189 "nvme_io_md": false, 00:11:09.189 "write_zeroes": true, 00:11:09.189 "zcopy": true, 00:11:09.189 "get_zone_info": false, 00:11:09.189 "zone_management": false, 00:11:09.189 "zone_append": false, 00:11:09.189 "compare": false, 00:11:09.189 "compare_and_write": false, 00:11:09.189 "abort": true, 00:11:09.189 "seek_hole": false, 00:11:09.189 "seek_data": false, 00:11:09.189 "copy": true, 00:11:09.189 "nvme_iov_md": false 00:11:09.189 }, 00:11:09.189 "memory_domains": [ 00:11:09.189 { 00:11:09.189 "dma_device_id": "system", 00:11:09.189 "dma_device_type": 1 00:11:09.189 }, 00:11:09.189 { 
00:11:09.189 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:09.189 "dma_device_type": 2 00:11:09.189 } 00:11:09.189 ], 00:11:09.189 "driver_specific": {} 00:11:09.189 } 00:11:09.189 ] 00:11:09.189 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:09.189 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:11:09.189 08:24:21 blockdev_general.bdev_error -- bdev/blockdev.sh@480 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:11:09.189 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:09.189 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:09.189 08:24:21 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:09.189 08:24:21 blockdev_general.bdev_error -- bdev/blockdev.sh@483 -- # sleep 1 00:11:09.189 08:24:21 blockdev_general.bdev_error -- bdev/blockdev.sh@482 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:11:09.189 Running I/O for 5 seconds... 00:11:10.127 08:24:22 blockdev_general.bdev_error -- bdev/blockdev.sh@486 -- # kill -0 1392064 00:11:10.127 08:24:22 blockdev_general.bdev_error -- bdev/blockdev.sh@487 -- # echo 'Process is existed as continue on error is set. Pid: 1392064' 00:11:10.127 Process is existed as continue on error is set. 
Pid: 1392064 00:11:10.127 08:24:22 blockdev_general.bdev_error -- bdev/blockdev.sh@494 -- # rpc_cmd bdev_error_delete EE_Dev_1 00:11:10.127 08:24:22 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:10.127 08:24:22 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:10.127 08:24:22 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:10.127 08:24:22 blockdev_general.bdev_error -- bdev/blockdev.sh@495 -- # rpc_cmd bdev_malloc_delete Dev_1 00:11:10.127 08:24:22 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:10.127 08:24:22 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:10.127 Timeout while waiting for response: 00:11:10.127 00:11:10.127 00:11:10.386 08:24:22 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:10.386 08:24:22 blockdev_general.bdev_error -- bdev/blockdev.sh@496 -- # sleep 5 00:11:14.633 00:11:14.633 Latency(us) 00:11:14.633 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:14.633 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:11:14.633 EE_Dev_1 : 0.92 47641.41 186.10 5.41 0.00 333.19 108.74 581.24 00:11:14.633 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:11:14.633 Dev_2 : 5.00 101799.27 397.65 0.00 0.00 154.67 57.78 116841.33 00:11:14.633 =================================================================================================================== 00:11:14.633 Total : 149440.68 583.75 5.41 0.00 168.88 57.78 116841.33 00:11:15.201 08:24:27 blockdev_general.bdev_error -- bdev/blockdev.sh@498 -- # killprocess 1392064 00:11:15.201 08:24:27 blockdev_general.bdev_error -- common/autotest_common.sh@948 -- # '[' -z 1392064 ']' 00:11:15.201 08:24:27 blockdev_general.bdev_error -- common/autotest_common.sh@952 -- # kill -0 1392064 00:11:15.201 08:24:27 
blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # uname 00:11:15.201 08:24:27 blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:15.201 08:24:27 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1392064 00:11:15.201 08:24:27 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:11:15.201 08:24:27 blockdev_general.bdev_error -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:11:15.201 08:24:27 blockdev_general.bdev_error -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1392064' 00:11:15.201 killing process with pid 1392064 00:11:15.201 08:24:27 blockdev_general.bdev_error -- common/autotest_common.sh@967 -- # kill 1392064 00:11:15.201 Received shutdown signal, test time was about 5.000000 seconds 00:11:15.201 00:11:15.201 Latency(us) 00:11:15.201 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:15.201 =================================================================================================================== 00:11:15.201 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:11:15.201 08:24:27 blockdev_general.bdev_error -- common/autotest_common.sh@972 -- # wait 1392064 00:11:17.107 08:24:29 blockdev_general.bdev_error -- bdev/blockdev.sh@502 -- # ERR_PID=1393813 00:11:17.107 08:24:29 blockdev_general.bdev_error -- bdev/blockdev.sh@503 -- # echo 'Process error testing pid: 1393813' 00:11:17.107 Process error testing pid: 1393813 00:11:17.107 08:24:29 blockdev_general.bdev_error -- bdev/blockdev.sh@501 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 '' 00:11:17.107 08:24:29 blockdev_general.bdev_error -- bdev/blockdev.sh@504 -- # waitforlisten 1393813 00:11:17.107 08:24:29 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 1393813 ']' 00:11:17.107 08:24:29 
blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:17.107 08:24:29 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:17.107 08:24:29 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:17.107 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:17.107 08:24:29 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:17.107 08:24:29 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:17.107 [2024-07-23 08:24:29.374495] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:11:17.107 [2024-07-23 08:24:29.374590] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1393813 ] 00:11:17.107 [2024-07-23 08:24:29.496255] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:17.367 [2024-07-23 08:24:29.708805] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:11:17.626 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:17.626 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:11:17.626 08:24:30 blockdev_general.bdev_error -- bdev/blockdev.sh@506 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:11:17.626 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:17.626 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:17.885 Dev_1 00:11:17.885 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:17.885 08:24:30 
blockdev_general.bdev_error -- bdev/blockdev.sh@507 -- # waitforbdev Dev_1 00:11:17.885 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:11:17.885 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:17.885 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:11:17.885 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:17.885 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:17.885 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:11:17.885 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:17.885 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:17.885 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:17.885 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:11:17.885 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:17.885 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:17.885 [ 00:11:17.885 { 00:11:17.885 "name": "Dev_1", 00:11:17.885 "aliases": [ 00:11:17.885 "dfa8f595-d4b5-42ee-8dd0-cecbba195ae3" 00:11:17.885 ], 00:11:17.885 "product_name": "Malloc disk", 00:11:17.885 "block_size": 512, 00:11:17.885 "num_blocks": 262144, 00:11:17.885 "uuid": "dfa8f595-d4b5-42ee-8dd0-cecbba195ae3", 00:11:17.885 "assigned_rate_limits": { 00:11:17.885 "rw_ios_per_sec": 0, 00:11:17.885 "rw_mbytes_per_sec": 0, 00:11:17.885 "r_mbytes_per_sec": 0, 00:11:17.885 "w_mbytes_per_sec": 0 00:11:17.885 }, 00:11:17.885 "claimed": false, 00:11:17.885 "zoned": false, 00:11:17.885 "supported_io_types": { 00:11:17.885 "read": true, 00:11:17.885 
"write": true, 00:11:17.885 "unmap": true, 00:11:17.885 "flush": true, 00:11:17.885 "reset": true, 00:11:17.885 "nvme_admin": false, 00:11:17.885 "nvme_io": false, 00:11:17.885 "nvme_io_md": false, 00:11:17.885 "write_zeroes": true, 00:11:17.885 "zcopy": true, 00:11:17.885 "get_zone_info": false, 00:11:17.885 "zone_management": false, 00:11:17.885 "zone_append": false, 00:11:17.885 "compare": false, 00:11:17.885 "compare_and_write": false, 00:11:17.885 "abort": true, 00:11:17.885 "seek_hole": false, 00:11:17.885 "seek_data": false, 00:11:17.885 "copy": true, 00:11:17.885 "nvme_iov_md": false 00:11:17.885 }, 00:11:17.885 "memory_domains": [ 00:11:17.885 { 00:11:17.885 "dma_device_id": "system", 00:11:17.885 "dma_device_type": 1 00:11:17.885 }, 00:11:17.885 { 00:11:17.885 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:17.885 "dma_device_type": 2 00:11:17.885 } 00:11:17.885 ], 00:11:17.885 "driver_specific": {} 00:11:17.885 } 00:11:17.885 ] 00:11:17.885 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:17.885 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:11:17.885 08:24:30 blockdev_general.bdev_error -- bdev/blockdev.sh@508 -- # rpc_cmd bdev_error_create Dev_1 00:11:17.885 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:17.885 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:17.885 true 00:11:17.885 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:17.885 08:24:30 blockdev_general.bdev_error -- bdev/blockdev.sh@509 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:11:17.885 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:17.885 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:18.145 Dev_2 00:11:18.145 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@587 
-- # [[ 0 == 0 ]] 00:11:18.145 08:24:30 blockdev_general.bdev_error -- bdev/blockdev.sh@510 -- # waitforbdev Dev_2 00:11:18.145 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:11:18.145 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:18.145 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:11:18.145 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:18.145 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:18.145 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:11:18.145 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:18.145 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:18.145 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:18.145 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:11:18.145 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:18.145 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:18.145 [ 00:11:18.145 { 00:11:18.145 "name": "Dev_2", 00:11:18.145 "aliases": [ 00:11:18.145 "1fd63f68-32ab-4413-8dcd-58a41cd3ffd5" 00:11:18.145 ], 00:11:18.145 "product_name": "Malloc disk", 00:11:18.145 "block_size": 512, 00:11:18.145 "num_blocks": 262144, 00:11:18.145 "uuid": "1fd63f68-32ab-4413-8dcd-58a41cd3ffd5", 00:11:18.145 "assigned_rate_limits": { 00:11:18.145 "rw_ios_per_sec": 0, 00:11:18.145 "rw_mbytes_per_sec": 0, 00:11:18.145 "r_mbytes_per_sec": 0, 00:11:18.145 "w_mbytes_per_sec": 0 00:11:18.145 }, 00:11:18.145 "claimed": false, 00:11:18.145 "zoned": false, 00:11:18.145 "supported_io_types": { 
00:11:18.145 "read": true, 00:11:18.145 "write": true, 00:11:18.145 "unmap": true, 00:11:18.145 "flush": true, 00:11:18.145 "reset": true, 00:11:18.145 "nvme_admin": false, 00:11:18.145 "nvme_io": false, 00:11:18.145 "nvme_io_md": false, 00:11:18.145 "write_zeroes": true, 00:11:18.145 "zcopy": true, 00:11:18.145 "get_zone_info": false, 00:11:18.145 "zone_management": false, 00:11:18.145 "zone_append": false, 00:11:18.145 "compare": false, 00:11:18.145 "compare_and_write": false, 00:11:18.145 "abort": true, 00:11:18.145 "seek_hole": false, 00:11:18.145 "seek_data": false, 00:11:18.145 "copy": true, 00:11:18.145 "nvme_iov_md": false 00:11:18.145 }, 00:11:18.145 "memory_domains": [ 00:11:18.145 { 00:11:18.145 "dma_device_id": "system", 00:11:18.145 "dma_device_type": 1 00:11:18.145 }, 00:11:18.145 { 00:11:18.145 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:18.145 "dma_device_type": 2 00:11:18.145 } 00:11:18.145 ], 00:11:18.145 "driver_specific": {} 00:11:18.145 } 00:11:18.145 ] 00:11:18.145 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:18.145 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:11:18.145 08:24:30 blockdev_general.bdev_error -- bdev/blockdev.sh@511 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:11:18.145 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:18.145 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:18.145 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:18.145 08:24:30 blockdev_general.bdev_error -- bdev/blockdev.sh@514 -- # NOT wait 1393813 00:11:18.145 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@648 -- # local es=0 00:11:18.145 08:24:30 blockdev_general.bdev_error -- bdev/blockdev.sh@513 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 
perform_tests 00:11:18.145 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@650 -- # valid_exec_arg wait 1393813 00:11:18.145 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@636 -- # local arg=wait 00:11:18.145 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:18.145 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # type -t wait 00:11:18.145 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:18.145 08:24:30 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # wait 1393813 00:11:18.145 Running I/O for 5 seconds... 00:11:18.145 task offset: 135536 on job bdev=EE_Dev_1 fails 00:11:18.145 00:11:18.145 Latency(us) 00:11:18.145 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:18.145 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:11:18.145 Job: EE_Dev_1 ended in about 0.00 seconds with error 00:11:18.145 EE_Dev_1 : 0.00 36184.21 141.34 8223.68 0.00 294.69 118.00 530.53 00:11:18.145 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:11:18.145 Dev_2 : 0.00 24558.71 95.93 0.00 0.00 462.63 104.84 854.31 00:11:18.145 =================================================================================================================== 00:11:18.146 Total : 60742.92 237.28 8223.68 0.00 385.78 104.84 854.31 00:11:18.146 [2024-07-23 08:24:30.622294] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:11:18.146 request: 00:11:18.146 { 00:11:18.146 "method": "perform_tests", 00:11:18.146 "req_id": 1 00:11:18.146 } 00:11:18.146 Got JSON-RPC error response 00:11:18.146 response: 00:11:18.146 { 00:11:18.146 "code": -32603, 00:11:18.146 "message": "bdevperf failed with error Operation not permitted" 00:11:18.146 } 00:11:20.050 08:24:32 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # 
es=255 00:11:20.050 08:24:32 blockdev_general.bdev_error -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:11:20.050 08:24:32 blockdev_general.bdev_error -- common/autotest_common.sh@660 -- # es=127 00:11:20.050 08:24:32 blockdev_general.bdev_error -- common/autotest_common.sh@661 -- # case "$es" in 00:11:20.050 08:24:32 blockdev_general.bdev_error -- common/autotest_common.sh@668 -- # es=1 00:11:20.050 08:24:32 blockdev_general.bdev_error -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:11:20.050 00:11:20.050 real 0m12.107s 00:11:20.050 user 0m12.138s 00:11:20.050 sys 0m0.848s 00:11:20.050 08:24:32 blockdev_general.bdev_error -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:20.050 08:24:32 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:20.050 ************************************ 00:11:20.050 END TEST bdev_error 00:11:20.050 ************************************ 00:11:20.050 08:24:32 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:11:20.050 08:24:32 blockdev_general -- bdev/blockdev.sh@790 -- # run_test bdev_stat stat_test_suite '' 00:11:20.050 08:24:32 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:11:20.050 08:24:32 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:20.050 08:24:32 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:20.050 ************************************ 00:11:20.050 START TEST bdev_stat 00:11:20.050 ************************************ 00:11:20.050 08:24:32 blockdev_general.bdev_stat -- common/autotest_common.sh@1123 -- # stat_test_suite '' 00:11:20.050 08:24:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@591 -- # STAT_DEV=Malloc_STAT 00:11:20.050 08:24:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@595 -- # STAT_PID=1394359 00:11:20.050 08:24:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@596 -- # echo 'Process Bdev IO statistics testing pid: 1394359' 00:11:20.050 Process Bdev IO statistics 
testing pid: 1394359 00:11:20.050 08:24:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@594 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 10 -C '' 00:11:20.050 08:24:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@597 -- # trap 'cleanup; killprocess $STAT_PID; exit 1' SIGINT SIGTERM EXIT 00:11:20.050 08:24:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@598 -- # waitforlisten 1394359 00:11:20.050 08:24:32 blockdev_general.bdev_stat -- common/autotest_common.sh@829 -- # '[' -z 1394359 ']' 00:11:20.050 08:24:32 blockdev_general.bdev_stat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:20.050 08:24:32 blockdev_general.bdev_stat -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:20.050 08:24:32 blockdev_general.bdev_stat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:20.050 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:20.050 08:24:32 blockdev_general.bdev_stat -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:20.050 08:24:32 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:20.050 [2024-07-23 08:24:32.547284] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:11:20.050 [2024-07-23 08:24:32.547372] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1394359 ] 00:11:20.310 [2024-07-23 08:24:32.669762] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:11:20.569 [2024-07-23 08:24:32.892933] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:20.569 [2024-07-23 08:24:32.892935] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:11:20.828 08:24:33 blockdev_general.bdev_stat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:20.828 08:24:33 blockdev_general.bdev_stat -- common/autotest_common.sh@862 -- # return 0 00:11:20.828 08:24:33 blockdev_general.bdev_stat -- bdev/blockdev.sh@600 -- # rpc_cmd bdev_malloc_create -b Malloc_STAT 128 512 00:11:20.828 08:24:33 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:20.828 08:24:33 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:21.087 Malloc_STAT 00:11:21.087 08:24:33 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:21.087 08:24:33 blockdev_general.bdev_stat -- bdev/blockdev.sh@601 -- # waitforbdev Malloc_STAT 00:11:21.087 08:24:33 blockdev_general.bdev_stat -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_STAT 00:11:21.087 08:24:33 blockdev_general.bdev_stat -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:21.087 08:24:33 blockdev_general.bdev_stat -- common/autotest_common.sh@899 -- # local i 00:11:21.087 08:24:33 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:21.087 08:24:33 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:21.087 08:24:33 blockdev_general.bdev_stat -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 
00:11:21.087 08:24:33 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:21.087 08:24:33 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:21.087 08:24:33 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:21.087 08:24:33 blockdev_general.bdev_stat -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_STAT -t 2000 00:11:21.087 08:24:33 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:21.087 08:24:33 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:21.087 [ 00:11:21.087 { 00:11:21.087 "name": "Malloc_STAT", 00:11:21.087 "aliases": [ 00:11:21.087 "539847aa-e63d-47d5-9937-4db8b8fb3b12" 00:11:21.087 ], 00:11:21.087 "product_name": "Malloc disk", 00:11:21.087 "block_size": 512, 00:11:21.087 "num_blocks": 262144, 00:11:21.087 "uuid": "539847aa-e63d-47d5-9937-4db8b8fb3b12", 00:11:21.087 "assigned_rate_limits": { 00:11:21.087 "rw_ios_per_sec": 0, 00:11:21.087 "rw_mbytes_per_sec": 0, 00:11:21.087 "r_mbytes_per_sec": 0, 00:11:21.087 "w_mbytes_per_sec": 0 00:11:21.087 }, 00:11:21.087 "claimed": false, 00:11:21.087 "zoned": false, 00:11:21.087 "supported_io_types": { 00:11:21.087 "read": true, 00:11:21.087 "write": true, 00:11:21.087 "unmap": true, 00:11:21.087 "flush": true, 00:11:21.087 "reset": true, 00:11:21.087 "nvme_admin": false, 00:11:21.087 "nvme_io": false, 00:11:21.087 "nvme_io_md": false, 00:11:21.087 "write_zeroes": true, 00:11:21.087 "zcopy": true, 00:11:21.087 "get_zone_info": false, 00:11:21.087 "zone_management": false, 00:11:21.087 "zone_append": false, 00:11:21.087 "compare": false, 00:11:21.087 "compare_and_write": false, 00:11:21.087 "abort": true, 00:11:21.087 "seek_hole": false, 00:11:21.087 "seek_data": false, 00:11:21.087 "copy": true, 00:11:21.087 "nvme_iov_md": false 00:11:21.087 }, 00:11:21.087 "memory_domains": [ 00:11:21.087 { 00:11:21.087 "dma_device_id": "system", 
00:11:21.087 "dma_device_type": 1 00:11:21.087 }, 00:11:21.087 { 00:11:21.087 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:21.087 "dma_device_type": 2 00:11:21.087 } 00:11:21.087 ], 00:11:21.087 "driver_specific": {} 00:11:21.087 } 00:11:21.087 ] 00:11:21.087 08:24:33 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:21.087 08:24:33 blockdev_general.bdev_stat -- common/autotest_common.sh@905 -- # return 0 00:11:21.087 08:24:33 blockdev_general.bdev_stat -- bdev/blockdev.sh@604 -- # sleep 2 00:11:21.087 08:24:33 blockdev_general.bdev_stat -- bdev/blockdev.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:11:21.087 Running I/O for 10 seconds... 00:11:22.991 08:24:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@605 -- # stat_function_test Malloc_STAT 00:11:22.991 08:24:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@558 -- # local bdev_name=Malloc_STAT 00:11:22.991 08:24:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@559 -- # local iostats 00:11:22.991 08:24:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@560 -- # local io_count1 00:11:22.991 08:24:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@561 -- # local io_count2 00:11:22.991 08:24:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@562 -- # local iostats_per_channel 00:11:22.991 08:24:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@563 -- # local io_count_per_channel1 00:11:22.991 08:24:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@564 -- # local io_count_per_channel2 00:11:22.991 08:24:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@565 -- # local io_count_per_channel_all=0 00:11:22.991 08:24:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@567 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:11:22.991 08:24:35 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:22.991 08:24:35 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:23.250 
08:24:35 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:23.250 08:24:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@567 -- # iostats='{ 00:11:23.250 "tick_rate": 2100000000, 00:11:23.250 "ticks": 3954418609276770, 00:11:23.250 "bdevs": [ 00:11:23.250 { 00:11:23.250 "name": "Malloc_STAT", 00:11:23.250 "bytes_read": 915452416, 00:11:23.250 "num_read_ops": 223492, 00:11:23.250 "bytes_written": 0, 00:11:23.250 "num_write_ops": 0, 00:11:23.250 "bytes_unmapped": 0, 00:11:23.250 "num_unmap_ops": 0, 00:11:23.250 "bytes_copied": 0, 00:11:23.250 "num_copy_ops": 0, 00:11:23.250 "read_latency_ticks": 2050046108956, 00:11:23.250 "max_read_latency_ticks": 9787308, 00:11:23.250 "min_read_latency_ticks": 325806, 00:11:23.250 "write_latency_ticks": 0, 00:11:23.250 "max_write_latency_ticks": 0, 00:11:23.250 "min_write_latency_ticks": 0, 00:11:23.250 "unmap_latency_ticks": 0, 00:11:23.250 "max_unmap_latency_ticks": 0, 00:11:23.250 "min_unmap_latency_ticks": 0, 00:11:23.250 "copy_latency_ticks": 0, 00:11:23.250 "max_copy_latency_ticks": 0, 00:11:23.250 "min_copy_latency_ticks": 0, 00:11:23.250 "io_error": {} 00:11:23.250 } 00:11:23.250 ] 00:11:23.250 }' 00:11:23.250 08:24:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # jq -r '.bdevs[0].num_read_ops' 00:11:23.250 08:24:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # io_count1=223492 00:11:23.250 08:24:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@570 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT -c 00:11:23.250 08:24:35 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:23.250 08:24:35 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:23.250 08:24:35 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:23.250 08:24:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@570 -- # iostats_per_channel='{ 00:11:23.250 "tick_rate": 2100000000, 00:11:23.250 "ticks": 3954418694345128, 
00:11:23.250 "name": "Malloc_STAT", 00:11:23.250 "channels": [ 00:11:23.250 { 00:11:23.250 "thread_id": 2, 00:11:23.250 "bytes_read": 465567744, 00:11:23.250 "num_read_ops": 113664, 00:11:23.250 "bytes_written": 0, 00:11:23.250 "num_write_ops": 0, 00:11:23.250 "bytes_unmapped": 0, 00:11:23.250 "num_unmap_ops": 0, 00:11:23.250 "bytes_copied": 0, 00:11:23.250 "num_copy_ops": 0, 00:11:23.250 "read_latency_ticks": 1046155181020, 00:11:23.250 "max_read_latency_ticks": 9800870, 00:11:23.250 "min_read_latency_ticks": 6900198, 00:11:23.250 "write_latency_ticks": 0, 00:11:23.250 "max_write_latency_ticks": 0, 00:11:23.250 "min_write_latency_ticks": 0, 00:11:23.250 "unmap_latency_ticks": 0, 00:11:23.250 "max_unmap_latency_ticks": 0, 00:11:23.250 "min_unmap_latency_ticks": 0, 00:11:23.250 "copy_latency_ticks": 0, 00:11:23.250 "max_copy_latency_ticks": 0, 00:11:23.250 "min_copy_latency_ticks": 0 00:11:23.250 }, 00:11:23.250 { 00:11:23.250 "thread_id": 3, 00:11:23.250 "bytes_read": 468713472, 00:11:23.250 "num_read_ops": 114432, 00:11:23.250 "bytes_written": 0, 00:11:23.250 "num_write_ops": 0, 00:11:23.250 "bytes_unmapped": 0, 00:11:23.250 "num_unmap_ops": 0, 00:11:23.250 "bytes_copied": 0, 00:11:23.250 "num_copy_ops": 0, 00:11:23.250 "read_latency_ticks": 1046940478520, 00:11:23.250 "max_read_latency_ticks": 9678796, 00:11:23.250 "min_read_latency_ticks": 6937516, 00:11:23.250 "write_latency_ticks": 0, 00:11:23.250 "max_write_latency_ticks": 0, 00:11:23.250 "min_write_latency_ticks": 0, 00:11:23.250 "unmap_latency_ticks": 0, 00:11:23.250 "max_unmap_latency_ticks": 0, 00:11:23.250 "min_unmap_latency_ticks": 0, 00:11:23.250 "copy_latency_ticks": 0, 00:11:23.250 "max_copy_latency_ticks": 0, 00:11:23.250 "min_copy_latency_ticks": 0 00:11:23.250 } 00:11:23.250 ] 00:11:23.250 }' 00:11:23.250 08:24:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # jq -r '.channels[0].num_read_ops' 00:11:23.250 08:24:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # 
io_count_per_channel1=113664 00:11:23.250 08:24:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # io_count_per_channel_all=113664 00:11:23.250 08:24:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # jq -r '.channels[1].num_read_ops' 00:11:23.250 08:24:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # io_count_per_channel2=114432 00:11:23.251 08:24:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # io_count_per_channel_all=228096 00:11:23.251 08:24:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@576 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:11:23.251 08:24:35 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:23.251 08:24:35 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:23.251 08:24:35 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:23.251 08:24:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@576 -- # iostats='{ 00:11:23.251 "tick_rate": 2100000000, 00:11:23.251 "ticks": 3954418909490782, 00:11:23.251 "bdevs": [ 00:11:23.251 { 00:11:23.251 "name": "Malloc_STAT", 00:11:23.251 "bytes_read": 982561280, 00:11:23.251 "num_read_ops": 239876, 00:11:23.251 "bytes_written": 0, 00:11:23.251 "num_write_ops": 0, 00:11:23.251 "bytes_unmapped": 0, 00:11:23.251 "num_unmap_ops": 0, 00:11:23.251 "bytes_copied": 0, 00:11:23.251 "num_copy_ops": 0, 00:11:23.251 "read_latency_ticks": 2202582723676, 00:11:23.251 "max_read_latency_ticks": 9965648, 00:11:23.251 "min_read_latency_ticks": 325806, 00:11:23.251 "write_latency_ticks": 0, 00:11:23.251 "max_write_latency_ticks": 0, 00:11:23.251 "min_write_latency_ticks": 0, 00:11:23.251 "unmap_latency_ticks": 0, 00:11:23.251 "max_unmap_latency_ticks": 0, 00:11:23.251 "min_unmap_latency_ticks": 0, 00:11:23.251 "copy_latency_ticks": 0, 00:11:23.251 "max_copy_latency_ticks": 0, 00:11:23.251 "min_copy_latency_ticks": 0, 00:11:23.251 "io_error": {} 00:11:23.251 } 00:11:23.251 ] 00:11:23.251 }' 00:11:23.251 
08:24:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # jq -r '.bdevs[0].num_read_ops' 00:11:23.251 08:24:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # io_count2=239876 00:11:23.251 08:24:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@582 -- # '[' 228096 -lt 223492 ']' 00:11:23.251 08:24:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@582 -- # '[' 228096 -gt 239876 ']' 00:11:23.251 08:24:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@607 -- # rpc_cmd bdev_malloc_delete Malloc_STAT 00:11:23.251 08:24:35 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:23.251 08:24:35 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:23.251 00:11:23.251 Latency(us) 00:11:23.251 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:23.251 Job: Malloc_STAT (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:11:23.251 Malloc_STAT : 2.12 58246.91 227.53 0.00 0.00 4385.03 1022.05 4774.77 00:11:23.251 Job: Malloc_STAT (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:11:23.251 Malloc_STAT : 2.12 58673.92 229.20 0.00 0.00 4353.19 725.58 4618.73 00:11:23.251 =================================================================================================================== 00:11:23.251 Total : 116920.83 456.72 0.00 0.00 4369.04 725.58 4774.77 00:11:23.510 0 00:11:23.510 08:24:35 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:23.510 08:24:35 blockdev_general.bdev_stat -- bdev/blockdev.sh@608 -- # killprocess 1394359 00:11:23.510 08:24:35 blockdev_general.bdev_stat -- common/autotest_common.sh@948 -- # '[' -z 1394359 ']' 00:11:23.510 08:24:35 blockdev_general.bdev_stat -- common/autotest_common.sh@952 -- # kill -0 1394359 00:11:23.510 08:24:35 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # uname 00:11:23.510 08:24:35 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # '[' Linux = 
Linux ']' 00:11:23.510 08:24:35 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1394359 00:11:23.510 08:24:35 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:23.510 08:24:35 blockdev_general.bdev_stat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:23.510 08:24:35 blockdev_general.bdev_stat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1394359' 00:11:23.510 killing process with pid 1394359 00:11:23.510 08:24:35 blockdev_general.bdev_stat -- common/autotest_common.sh@967 -- # kill 1394359 00:11:23.510 Received shutdown signal, test time was about 2.286355 seconds 00:11:23.510 00:11:23.510 Latency(us) 00:11:23.510 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:23.510 =================================================================================================================== 00:11:23.510 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:11:23.510 08:24:35 blockdev_general.bdev_stat -- common/autotest_common.sh@972 -- # wait 1394359 00:11:24.888 08:24:37 blockdev_general.bdev_stat -- bdev/blockdev.sh@609 -- # trap - SIGINT SIGTERM EXIT 00:11:24.888 00:11:24.888 real 0m4.745s 00:11:24.888 user 0m8.768s 00:11:24.888 sys 0m0.433s 00:11:24.888 08:24:37 blockdev_general.bdev_stat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:24.888 08:24:37 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:24.888 ************************************ 00:11:24.888 END TEST bdev_stat 00:11:24.888 ************************************ 00:11:24.888 08:24:37 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:11:24.888 08:24:37 blockdev_general -- bdev/blockdev.sh@793 -- # [[ bdev == gpt ]] 00:11:24.888 08:24:37 blockdev_general -- bdev/blockdev.sh@797 -- # [[ bdev == crypto_sw ]] 00:11:24.888 08:24:37 blockdev_general -- bdev/blockdev.sh@809 -- # trap - SIGINT SIGTERM EXIT 
00:11:24.888 08:24:37 blockdev_general -- bdev/blockdev.sh@810 -- # cleanup 00:11:24.888 08:24:37 blockdev_general -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:11:24.888 08:24:37 blockdev_general -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:11:24.888 08:24:37 blockdev_general -- bdev/blockdev.sh@26 -- # [[ bdev == rbd ]] 00:11:24.888 08:24:37 blockdev_general -- bdev/blockdev.sh@30 -- # [[ bdev == daos ]] 00:11:24.888 08:24:37 blockdev_general -- bdev/blockdev.sh@34 -- # [[ bdev = \g\p\t ]] 00:11:24.888 08:24:37 blockdev_general -- bdev/blockdev.sh@40 -- # [[ bdev == xnvme ]] 00:11:24.888 00:11:24.888 real 2m22.037s 00:11:24.888 user 7m48.773s 00:11:24.888 sys 0m18.128s 00:11:24.888 08:24:37 blockdev_general -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:24.888 08:24:37 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:24.888 ************************************ 00:11:24.888 END TEST blockdev_general 00:11:24.888 ************************************ 00:11:24.888 08:24:37 -- common/autotest_common.sh@1142 -- # return 0 00:11:24.888 08:24:37 -- spdk/autotest.sh@190 -- # run_test bdev_raid /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:11:24.888 08:24:37 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:11:24.888 08:24:37 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:24.888 08:24:37 -- common/autotest_common.sh@10 -- # set +x 00:11:24.888 ************************************ 00:11:24.888 START TEST bdev_raid 00:11:24.888 ************************************ 00:11:24.888 08:24:37 bdev_raid -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:11:24.888 * Looking for test storage... 
00:11:24.888 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:11:24.888 08:24:37 bdev_raid -- bdev/bdev_raid.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:11:24.888 08:24:37 bdev_raid -- bdev/nbd_common.sh@6 -- # set -e 00:11:24.888 08:24:37 bdev_raid -- bdev/bdev_raid.sh@15 -- # rpc_py='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock' 00:11:24.888 08:24:37 bdev_raid -- bdev/bdev_raid.sh@851 -- # mkdir -p /raidtest 00:11:25.147 08:24:37 bdev_raid -- bdev/bdev_raid.sh@852 -- # trap 'cleanup; exit 1' EXIT 00:11:25.147 08:24:37 bdev_raid -- bdev/bdev_raid.sh@854 -- # base_blocklen=512 00:11:25.147 08:24:37 bdev_raid -- bdev/bdev_raid.sh@856 -- # uname -s 00:11:25.147 08:24:37 bdev_raid -- bdev/bdev_raid.sh@856 -- # '[' Linux = Linux ']' 00:11:25.147 08:24:37 bdev_raid -- bdev/bdev_raid.sh@856 -- # modprobe -n nbd 00:11:25.147 08:24:37 bdev_raid -- bdev/bdev_raid.sh@857 -- # has_nbd=true 00:11:25.147 08:24:37 bdev_raid -- bdev/bdev_raid.sh@858 -- # modprobe nbd 00:11:25.147 08:24:37 bdev_raid -- bdev/bdev_raid.sh@859 -- # run_test raid_function_test_raid0 raid_function_test raid0 00:11:25.147 08:24:37 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:11:25.147 08:24:37 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:25.147 08:24:37 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:25.147 ************************************ 00:11:25.147 START TEST raid_function_test_raid0 00:11:25.147 ************************************ 00:11:25.147 08:24:37 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1123 -- # raid_function_test raid0 00:11:25.147 08:24:37 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@80 -- # local raid_level=raid0 00:11:25.147 08:24:37 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:11:25.147 08:24:37 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:11:25.147 08:24:37 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@85 -- # raid_pid=1395439 00:11:25.147 08:24:37 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 1395439' 00:11:25.147 Process raid pid: 1395439 00:11:25.147 08:24:37 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:25.147 08:24:37 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@87 -- # waitforlisten 1395439 /var/tmp/spdk-raid.sock 00:11:25.147 08:24:37 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@829 -- # '[' -z 1395439 ']' 00:11:25.147 08:24:37 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:25.147 08:24:37 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:25.147 08:24:37 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:25.147 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:25.147 08:24:37 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:25.147 08:24:37 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:11:25.147 [2024-07-23 08:24:37.536849] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:11:25.147 [2024-07-23 08:24:37.536938] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:25.147 [2024-07-23 08:24:37.662126] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:25.406 [2024-07-23 08:24:37.870119] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:25.665 [2024-07-23 08:24:38.101306] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:25.665 [2024-07-23 08:24:38.101340] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:25.925 08:24:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:25.925 08:24:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@862 -- # return 0 00:11:25.925 08:24:38 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev raid0 00:11:25.925 08:24:38 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@66 -- # local raid_level=raid0 00:11:25.925 08:24:38 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:11:25.925 08:24:38 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@69 -- # cat 00:11:25.925 08:24:38 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:11:26.186 [2024-07-23 08:24:38.569718] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:11:26.186 [2024-07-23 08:24:38.571456] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:11:26.186 [2024-07-23 08:24:38.571520] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000035180 00:11:26.186 [2024-07-23 08:24:38.571535] 
bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:11:26.186 [2024-07-23 08:24:38.571805] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bdf0 00:11:26.186 [2024-07-23 08:24:38.572006] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000035180 00:11:26.186 [2024-07-23 08:24:38.572018] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x616000035180 00:11:26.186 [2024-07-23 08:24:38.572195] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:26.186 Base_1 00:11:26.186 Base_2 00:11:26.186 08:24:38 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:11:26.186 08:24:38 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:11:26.186 08:24:38 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:11:26.445 08:24:38 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:11:26.445 08:24:38 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:11:26.445 08:24:38 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:11:26.445 08:24:38 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:26.445 08:24:38 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:11:26.445 08:24:38 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:11:26.445 08:24:38 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:11:26.445 08:24:38 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- 
# local nbd_list 00:11:26.445 08:24:38 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@12 -- # local i 00:11:26.445 08:24:38 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:11:26.445 08:24:38 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:11:26.445 08:24:38 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:11:26.445 [2024-07-23 08:24:38.926696] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bf90 00:11:26.445 /dev/nbd0 00:11:26.445 08:24:38 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:11:26.704 08:24:38 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:11:26.704 08:24:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:11:26.704 08:24:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@867 -- # local i 00:11:26.704 08:24:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:26.704 08:24:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:11:26.704 08:24:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:11:26.704 08:24:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@871 -- # break 00:11:26.704 08:24:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:26.704 08:24:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:26.704 08:24:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:26.704 1+0 records in 
00:11:26.704 1+0 records out 00:11:26.704 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000232225 s, 17.6 MB/s 00:11:26.704 08:24:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:26.704 08:24:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # size=4096 00:11:26.704 08:24:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:26.704 08:24:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:11:26.704 08:24:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@887 -- # return 0 00:11:26.704 08:24:38 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:26.704 08:24:38 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:11:26.704 08:24:38 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:11:26.704 08:24:38 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:26.704 08:24:38 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:11:26.704 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:11:26.704 { 00:11:26.704 "nbd_device": "/dev/nbd0", 00:11:26.704 "bdev_name": "raid" 00:11:26.704 } 00:11:26.704 ]' 00:11:26.704 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[ 00:11:26.704 { 00:11:26.704 "nbd_device": "/dev/nbd0", 00:11:26.704 "bdev_name": "raid" 00:11:26.704 } 00:11:26.704 ]' 00:11:26.704 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:26.704 08:24:39 
bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:11:26.704 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:26.704 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:11:26.704 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=1 00:11:26.704 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 1 00:11:26.704 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # count=1 00:11:26.704 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:11:26.704 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:11:26.704 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:11:26.704 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:11:26.704 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:26.704 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # local blksize 00:11:26.704 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:11:26.704 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:11:26.704 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:11:26.704 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # blksize=512 00:11:26.704 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:11:26.704 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:11:26.704 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' 
'1028' '321') 00:11:26.704 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:11:26.705 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:11:26.705 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:11:26.705 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:11:26.705 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:11:26.705 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:11:26.977 4096+0 records in 00:11:26.977 4096+0 records out 00:11:26.977 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0250513 s, 83.7 MB/s 00:11:26.977 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:11:26.977 4096+0 records in 00:11:26.977 4096+0 records out 00:11:26.977 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.163871 s, 12.8 MB/s 00:11:26.977 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:11:26.977 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:26.977 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:11:26.977 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:26.977 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:11:26.977 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:11:26.977 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:11:26.977 128+0 records in 00:11:26.977 
128+0 records out 00:11:26.977 65536 bytes (66 kB, 64 KiB) copied, 0.000360028 s, 182 MB/s 00:11:26.977 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:11:26.977 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:26.977 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:26.977 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:26.977 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:26.977 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:11:26.977 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:11:26.977 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:11:26.977 2035+0 records in 00:11:26.977 2035+0 records out 00:11:26.977 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.00478637 s, 218 MB/s 00:11:26.977 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:11:26.977 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:26.977 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:26.977 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:26.977 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:26.977 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:11:26.977 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:11:26.977 08:24:39 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:11:26.977 456+0 records in 00:11:26.977 456+0 records out 00:11:26.977 233472 bytes (233 kB, 228 KiB) copied, 0.00126432 s, 185 MB/s 00:11:26.977 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:11:26.977 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:26.977 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:26.977 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:26.977 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:26.977 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@54 -- # return 0 00:11:26.977 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:11:26.977 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:26.977 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:11:26.977 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:11:26.977 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@51 -- # local i 00:11:26.977 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:26.977 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:11:27.258 [2024-07-23 08:24:39.669223] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:27.258 08:24:39 
bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:11:27.258 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:11:27.258 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:11:27.258 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:27.258 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:27.258 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:11:27.258 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@41 -- # break 00:11:27.258 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@45 -- # return 0 00:11:27.258 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:11:27.258 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:27.258 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:11:27.518 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:11:27.518 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:11:27.518 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:27.518 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:11:27.518 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo '' 00:11:27.518 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:27.518 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # true 00:11:27.518 08:24:39 
bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=0 00:11:27.518 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 0 00:11:27.518 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # count=0 00:11:27.518 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:11:27.518 08:24:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@110 -- # killprocess 1395439 00:11:27.518 08:24:39 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@948 -- # '[' -z 1395439 ']' 00:11:27.518 08:24:39 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@952 -- # kill -0 1395439 00:11:27.518 08:24:39 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # uname 00:11:27.518 08:24:39 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:27.518 08:24:39 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1395439 00:11:27.518 08:24:39 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:27.518 08:24:39 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:27.518 08:24:39 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1395439' 00:11:27.518 killing process with pid 1395439 00:11:27.518 08:24:39 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@967 -- # kill 1395439 00:11:27.518 [2024-07-23 08:24:39.944059] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:27.518 08:24:39 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@972 -- # wait 1395439 00:11:27.518 [2024-07-23 08:24:39.944156] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:27.518 [2024-07-23 08:24:39.944202] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: 
raid bdev base bdevs is 0, going to free all in destruct 00:11:27.518 [2024-07-23 08:24:39.944215] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000035180 name raid, state offline 00:11:27.777 [2024-07-23 08:24:40.096387] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:29.157 08:24:41 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@112 -- # return 0 00:11:29.157 00:11:29.157 real 0m3.888s 00:11:29.157 user 0m4.696s 00:11:29.157 sys 0m0.875s 00:11:29.157 08:24:41 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:29.157 08:24:41 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:11:29.157 ************************************ 00:11:29.157 END TEST raid_function_test_raid0 00:11:29.157 ************************************ 00:11:29.157 08:24:41 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:29.157 08:24:41 bdev_raid -- bdev/bdev_raid.sh@860 -- # run_test raid_function_test_concat raid_function_test concat 00:11:29.157 08:24:41 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:11:29.157 08:24:41 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:29.157 08:24:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:29.157 ************************************ 00:11:29.157 START TEST raid_function_test_concat 00:11:29.157 ************************************ 00:11:29.157 08:24:41 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1123 -- # raid_function_test concat 00:11:29.157 08:24:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@80 -- # local raid_level=concat 00:11:29.157 08:24:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:11:29.157 08:24:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:11:29.157 08:24:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@85 -- # raid_pid=1396277 
00:11:29.157 08:24:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 1396277' 00:11:29.157 Process raid pid: 1396277 00:11:29.158 08:24:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@87 -- # waitforlisten 1396277 /var/tmp/spdk-raid.sock 00:11:29.158 08:24:41 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@829 -- # '[' -z 1396277 ']' 00:11:29.158 08:24:41 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:29.158 08:24:41 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:29.158 08:24:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:29.158 08:24:41 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:29.158 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:29.158 08:24:41 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:29.158 08:24:41 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:11:29.158 [2024-07-23 08:24:41.484265] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:11:29.158 [2024-07-23 08:24:41.484350] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:29.158 [2024-07-23 08:24:41.608877] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:29.428 [2024-07-23 08:24:41.814102] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:29.687 [2024-07-23 08:24:42.079298] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:29.687 [2024-07-23 08:24:42.079331] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:29.946 08:24:42 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:29.946 08:24:42 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@862 -- # return 0 00:11:29.946 08:24:42 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev concat 00:11:29.946 08:24:42 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@66 -- # local raid_level=concat 00:11:29.946 08:24:42 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:11:29.946 08:24:42 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@69 -- # cat 00:11:29.946 08:24:42 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:11:30.206 [2024-07-23 08:24:42.488255] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:11:30.206 [2024-07-23 08:24:42.489993] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:11:30.206 [2024-07-23 08:24:42.490060] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000035180 00:11:30.206 [2024-07-23 
08:24:42.490075] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:11:30.206 [2024-07-23 08:24:42.490350] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bdf0 00:11:30.206 [2024-07-23 08:24:42.490531] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000035180 00:11:30.206 [2024-07-23 08:24:42.490542] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x616000035180 00:11:30.206 [2024-07-23 08:24:42.490712] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:30.206 Base_1 00:11:30.206 Base_2 00:11:30.206 08:24:42 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:11:30.206 08:24:42 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:11:30.206 08:24:42 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:11:30.206 08:24:42 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:11:30.206 08:24:42 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:11:30.206 08:24:42 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:11:30.206 08:24:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:30.206 08:24:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:11:30.206 08:24:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:11:30.206 08:24:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:11:30.206 08:24:42 bdev_raid.raid_function_test_concat 
-- bdev/nbd_common.sh@11 -- # local nbd_list 00:11:30.206 08:24:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@12 -- # local i 00:11:30.206 08:24:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:11:30.206 08:24:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:11:30.206 08:24:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:11:30.465 [2024-07-23 08:24:42.833212] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bf90 00:11:30.465 /dev/nbd0 00:11:30.465 08:24:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:11:30.465 08:24:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:11:30.465 08:24:42 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:11:30.465 08:24:42 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@867 -- # local i 00:11:30.465 08:24:42 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:30.465 08:24:42 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:11:30.465 08:24:42 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:11:30.465 08:24:42 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@871 -- # break 00:11:30.465 08:24:42 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:30.466 08:24:42 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:30.466 08:24:42 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:11:30.466 1+0 records in 00:11:30.466 1+0 records out 00:11:30.466 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000231731 s, 17.7 MB/s 00:11:30.466 08:24:42 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:30.466 08:24:42 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # size=4096 00:11:30.466 08:24:42 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:30.466 08:24:42 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:11:30.466 08:24:42 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@887 -- # return 0 00:11:30.466 08:24:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:30.466 08:24:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:11:30.466 08:24:42 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:11:30.466 08:24:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:30.466 08:24:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:11:30.726 08:24:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:11:30.726 { 00:11:30.726 "nbd_device": "/dev/nbd0", 00:11:30.726 "bdev_name": "raid" 00:11:30.726 } 00:11:30.726 ]' 00:11:30.726 08:24:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[ 00:11:30.726 { 00:11:30.726 "nbd_device": "/dev/nbd0", 00:11:30.726 "bdev_name": "raid" 00:11:30.726 } 00:11:30.726 ]' 00:11:30.726 08:24:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | 
.nbd_device' 00:11:30.726 08:24:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:11:30.726 08:24:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:30.726 08:24:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:11:30.726 08:24:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=1 00:11:30.726 08:24:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 1 00:11:30.726 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # count=1 00:11:30.726 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:11:30.726 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:11:30.726 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:11:30.726 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:11:30.726 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:30.726 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # local blksize 00:11:30.726 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:11:30.726 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:11:30.726 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:11:30.726 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # blksize=512 00:11:30.726 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:11:30.726 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:11:30.726 08:24:43 
bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:11:30.726 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:11:30.726 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:11:30.726 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:11:30.726 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:11:30.726 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:11:30.726 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:11:30.726 4096+0 records in 00:11:30.726 4096+0 records out 00:11:30.726 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.024863 s, 84.3 MB/s 00:11:30.726 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:11:30.987 4096+0 records in 00:11:30.987 4096+0 records out 00:11:30.987 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.161146 s, 13.0 MB/s 00:11:30.987 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:11:30.987 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:30.987 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:11:30.987 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:30.987 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:11:30.987 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:11:30.987 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero 
of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:11:30.987 128+0 records in 00:11:30.987 128+0 records out 00:11:30.987 65536 bytes (66 kB, 64 KiB) copied, 0.000364736 s, 180 MB/s 00:11:30.987 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:11:30.987 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:30.987 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:30.987 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:30.987 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:30.987 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:11:30.987 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:11:30.987 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:11:30.987 2035+0 records in 00:11:30.987 2035+0 records out 00:11:30.987 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.00483823 s, 215 MB/s 00:11:30.987 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:11:30.987 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:30.987 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:30.987 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:30.987 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:30.987 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:11:30.987 08:24:43 
bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:11:30.987 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:11:30.987 456+0 records in 00:11:30.987 456+0 records out 00:11:30.987 233472 bytes (233 kB, 228 KiB) copied, 0.00112892 s, 207 MB/s 00:11:30.987 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:11:30.987 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:30.987 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:30.987 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:30.987 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:30.987 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@54 -- # return 0 00:11:30.987 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:11:30.987 08:24:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:30.987 08:24:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:11:30.987 08:24:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:11:30.987 08:24:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@51 -- # local i 00:11:30.987 08:24:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:30.987 08:24:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:11:31.247 08:24:43 
bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:11:31.247 08:24:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:11:31.247 08:24:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:11:31.247 08:24:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:31.247 08:24:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:31.247 08:24:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:11:31.247 [2024-07-23 08:24:43.614189] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:31.247 08:24:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@41 -- # break 00:11:31.247 08:24:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@45 -- # return 0 00:11:31.247 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:11:31.247 08:24:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:31.247 08:24:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:11:31.506 08:24:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:11:31.506 08:24:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:11:31.506 08:24:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:31.506 08:24:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:11:31.506 08:24:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo '' 00:11:31.506 08:24:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 
00:11:31.506 08:24:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # true 00:11:31.506 08:24:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=0 00:11:31.506 08:24:43 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 0 00:11:31.506 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # count=0 00:11:31.506 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:11:31.506 08:24:43 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@110 -- # killprocess 1396277 00:11:31.507 08:24:43 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@948 -- # '[' -z 1396277 ']' 00:11:31.507 08:24:43 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@952 -- # kill -0 1396277 00:11:31.507 08:24:43 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # uname 00:11:31.507 08:24:43 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:31.507 08:24:43 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1396277 00:11:31.507 08:24:43 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:31.507 08:24:43 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:31.507 08:24:43 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1396277' 00:11:31.507 killing process with pid 1396277 00:11:31.507 08:24:43 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@967 -- # kill 1396277 00:11:31.507 [2024-07-23 08:24:43.885433] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:31.507 08:24:43 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@972 -- # wait 1396277 00:11:31.507 [2024-07-23 08:24:43.885536] bdev_raid.c: 486:_raid_bdev_destruct: 
*DEBUG*: raid_bdev_destruct 00:11:31.507 [2024-07-23 08:24:43.885589] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:31.507 [2024-07-23 08:24:43.885602] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000035180 name raid, state offline 00:11:31.766 [2024-07-23 08:24:44.044396] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:33.144 08:24:45 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@112 -- # return 0 00:11:33.144 00:11:33.144 real 0m3.933s 00:11:33.144 user 0m4.643s 00:11:33.144 sys 0m0.859s 00:11:33.144 08:24:45 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:33.144 08:24:45 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:11:33.144 ************************************ 00:11:33.144 END TEST raid_function_test_concat 00:11:33.144 ************************************ 00:11:33.144 08:24:45 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:33.144 08:24:45 bdev_raid -- bdev/bdev_raid.sh@863 -- # run_test raid0_resize_test raid0_resize_test 00:11:33.144 08:24:45 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:11:33.144 08:24:45 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:33.144 08:24:45 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:33.144 ************************************ 00:11:33.144 START TEST raid0_resize_test 00:11:33.144 ************************************ 00:11:33.144 08:24:45 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1123 -- # raid0_resize_test 00:11:33.144 08:24:45 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@347 -- # local blksize=512 00:11:33.144 08:24:45 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@348 -- # local bdev_size_mb=32 00:11:33.144 08:24:45 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@349 -- # local new_bdev_size_mb=64 00:11:33.144 08:24:45 
bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@350 -- # local blkcnt 00:11:33.144 08:24:45 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@351 -- # local raid_size_mb 00:11:33.144 08:24:45 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@352 -- # local new_raid_size_mb 00:11:33.144 08:24:45 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@355 -- # raid_pid=1397120 00:11:33.144 08:24:45 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@356 -- # echo 'Process raid pid: 1397120' 00:11:33.144 Process raid pid: 1397120 00:11:33.144 08:24:45 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@357 -- # waitforlisten 1397120 /var/tmp/spdk-raid.sock 00:11:33.144 08:24:45 bdev_raid.raid0_resize_test -- common/autotest_common.sh@829 -- # '[' -z 1397120 ']' 00:11:33.144 08:24:45 bdev_raid.raid0_resize_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:33.144 08:24:45 bdev_raid.raid0_resize_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:33.144 08:24:45 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@354 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:33.144 08:24:45 bdev_raid.raid0_resize_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:33.144 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:33.144 08:24:45 bdev_raid.raid0_resize_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:33.144 08:24:45 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:11:33.144 [2024-07-23 08:24:45.479281] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:11:33.144 [2024-07-23 08:24:45.479368] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:33.144 [2024-07-23 08:24:45.605624] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:33.403 [2024-07-23 08:24:45.814193] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:33.663 [2024-07-23 08:24:46.075545] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:33.663 [2024-07-23 08:24:46.075577] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:33.922 08:24:46 bdev_raid.raid0_resize_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:33.922 08:24:46 bdev_raid.raid0_resize_test -- common/autotest_common.sh@862 -- # return 0 00:11:33.922 08:24:46 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@359 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512 00:11:33.922 Base_1 00:11:33.922 08:24:46 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@360 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_2 32 512 00:11:34.182 Base_2 00:11:34.182 08:24:46 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@362 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid 00:11:34.441 [2024-07-23 08:24:46.758931] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:11:34.441 [2024-07-23 08:24:46.760546] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:11:34.441 [2024-07-23 08:24:46.760605] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000035180 00:11:34.441 [2024-07-23 08:24:46.760624] 
bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:11:34.441 [2024-07-23 08:24:46.760892] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bc50 00:11:34.441 [2024-07-23 08:24:46.761039] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000035180 00:11:34.441 [2024-07-23 08:24:46.761049] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x616000035180 00:11:34.441 [2024-07-23 08:24:46.761229] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:34.441 08:24:46 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@365 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64 00:11:34.441 [2024-07-23 08:24:46.927374] bdev_raid.c:2288:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:11:34.441 [2024-07-23 08:24:46.927403] bdev_raid.c:2301:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_1' was resized: old size 65536, new size 131072 00:11:34.441 true 00:11:34.441 08:24:46 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:11:34.441 08:24:46 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # jq '.[].num_blocks' 00:11:34.700 [2024-07-23 08:24:47.095962] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:34.700 08:24:47 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # blkcnt=131072 00:11:34.700 08:24:47 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@369 -- # raid_size_mb=64 00:11:34.701 08:24:47 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@370 -- # '[' 64 '!=' 64 ']' 00:11:34.701 08:24:47 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@376 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64 
00:11:34.960 [2024-07-23 08:24:47.264250] bdev_raid.c:2288:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:11:34.960 [2024-07-23 08:24:47.264276] bdev_raid.c:2301:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_2' was resized: old size 65536, new size 131072 00:11:34.960 [2024-07-23 08:24:47.264301] bdev_raid.c:2315:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 131072 to 262144 00:11:34.960 true 00:11:34.960 08:24:47 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # jq '.[].num_blocks' 00:11:34.960 08:24:47 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:11:34.960 [2024-07-23 08:24:47.432882] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:34.960 08:24:47 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # blkcnt=262144 00:11:34.960 08:24:47 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@380 -- # raid_size_mb=128 00:11:34.960 08:24:47 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@381 -- # '[' 128 '!=' 128 ']' 00:11:34.960 08:24:47 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@386 -- # killprocess 1397120 00:11:34.960 08:24:47 bdev_raid.raid0_resize_test -- common/autotest_common.sh@948 -- # '[' -z 1397120 ']' 00:11:34.960 08:24:47 bdev_raid.raid0_resize_test -- common/autotest_common.sh@952 -- # kill -0 1397120 00:11:34.960 08:24:47 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # uname 00:11:34.960 08:24:47 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:34.960 08:24:47 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1397120 00:11:35.219 08:24:47 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:35.219 08:24:47 bdev_raid.raid0_resize_test -- common/autotest_common.sh@958 
-- # '[' reactor_0 = sudo ']' 00:11:35.219 08:24:47 bdev_raid.raid0_resize_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1397120' 00:11:35.219 killing process with pid 1397120 00:11:35.219 08:24:47 bdev_raid.raid0_resize_test -- common/autotest_common.sh@967 -- # kill 1397120 00:11:35.219 [2024-07-23 08:24:47.486251] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:35.219 08:24:47 bdev_raid.raid0_resize_test -- common/autotest_common.sh@972 -- # wait 1397120 00:11:35.219 [2024-07-23 08:24:47.486331] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:35.219 [2024-07-23 08:24:47.486380] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:35.219 [2024-07-23 08:24:47.486390] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000035180 name Raid, state offline 00:11:35.219 [2024-07-23 08:24:47.496641] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:36.598 08:24:48 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@388 -- # return 0 00:11:36.598 00:11:36.598 real 0m3.353s 00:11:36.598 user 0m4.379s 00:11:36.598 sys 0m0.542s 00:11:36.598 08:24:48 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:36.598 08:24:48 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:11:36.598 ************************************ 00:11:36.598 END TEST raid0_resize_test 00:11:36.598 ************************************ 00:11:36.598 08:24:48 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:36.598 08:24:48 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:11:36.598 08:24:48 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:11:36.598 08:24:48 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 2 false 00:11:36.598 08:24:48 bdev_raid -- 
common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:36.598 08:24:48 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:36.598 08:24:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:36.598 ************************************ 00:11:36.598 START TEST raid_state_function_test 00:11:36.598 ************************************ 00:11:36.598 08:24:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 false 00:11:36.598 08:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:11:36.598 08:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:36.598 08:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:11:36.598 08:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:36.598 08:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:36.598 08:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:36.598 08:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:36.598 08:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:36.598 08:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:36.598 08:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:36.598 08:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:36.598 08:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:36.598 08:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:36.598 08:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 
00:11:36.598 08:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:36.598 08:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:36.598 08:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:36.598 08:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:36.598 08:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:11:36.598 08:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:36.598 08:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:36.598 08:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:11:36.598 08:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:11:36.598 08:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1397682 00:11:36.598 08:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1397682' 00:11:36.598 Process raid pid: 1397682 00:11:36.598 08:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1397682 /var/tmp/spdk-raid.sock 00:11:36.598 08:24:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1397682 ']' 00:11:36.598 08:24:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:36.598 08:24:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:36.598 08:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:36.598 08:24:48 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:36.598 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:36.598 08:24:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:36.598 08:24:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:36.598 [2024-07-23 08:24:48.895701] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:11:36.598 [2024-07-23 08:24:48.895785] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:36.598 [2024-07-23 08:24:49.020182] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:36.857 [2024-07-23 08:24:49.249626] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:37.116 [2024-07-23 08:24:49.519914] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:37.116 [2024-07-23 08:24:49.519941] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:37.375 08:24:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:37.375 08:24:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:11:37.375 08:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:37.375 [2024-07-23 08:24:49.813410] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:37.375 [2024-07-23 08:24:49.813455] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:37.375 [2024-07-23 08:24:49.813464] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:37.375 [2024-07-23 08:24:49.813490] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:37.375 08:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:37.375 08:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:37.375 08:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:37.375 08:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:37.375 08:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:37.375 08:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:37.375 08:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:37.375 08:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:37.375 08:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:37.375 08:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:37.375 08:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:37.375 08:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:37.634 08:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:37.634 "name": "Existed_Raid", 00:11:37.634 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:11:37.634 "strip_size_kb": 64, 00:11:37.634 "state": "configuring", 00:11:37.634 "raid_level": "raid0", 00:11:37.634 "superblock": false, 00:11:37.634 "num_base_bdevs": 2, 00:11:37.634 "num_base_bdevs_discovered": 0, 00:11:37.634 "num_base_bdevs_operational": 2, 00:11:37.634 "base_bdevs_list": [ 00:11:37.634 { 00:11:37.634 "name": "BaseBdev1", 00:11:37.634 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:37.634 "is_configured": false, 00:11:37.634 "data_offset": 0, 00:11:37.634 "data_size": 0 00:11:37.634 }, 00:11:37.634 { 00:11:37.634 "name": "BaseBdev2", 00:11:37.634 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:37.634 "is_configured": false, 00:11:37.634 "data_offset": 0, 00:11:37.634 "data_size": 0 00:11:37.634 } 00:11:37.634 ] 00:11:37.634 }' 00:11:37.634 08:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:37.634 08:24:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:38.202 08:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:38.202 [2024-07-23 08:24:50.631475] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:38.202 [2024-07-23 08:24:50.631510] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034580 name Existed_Raid, state configuring 00:11:38.202 08:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:38.461 [2024-07-23 08:24:50.807944] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:38.461 [2024-07-23 08:24:50.807990] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 
doesn't exist now 00:11:38.461 [2024-07-23 08:24:50.807998] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:38.461 [2024-07-23 08:24:50.808007] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:38.461 08:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:38.720 [2024-07-23 08:24:51.020619] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:38.720 BaseBdev1 00:11:38.720 08:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:38.720 08:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:38.720 08:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:38.720 08:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:38.720 08:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:38.720 08:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:38.720 08:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:38.720 08:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:38.979 [ 00:11:38.979 { 00:11:38.979 "name": "BaseBdev1", 00:11:38.979 "aliases": [ 00:11:38.979 "c22e8755-d738-4891-ae25-385e04b137c4" 00:11:38.979 ], 00:11:38.979 "product_name": "Malloc disk", 00:11:38.979 "block_size": 512, 00:11:38.979 
"num_blocks": 65536, 00:11:38.979 "uuid": "c22e8755-d738-4891-ae25-385e04b137c4", 00:11:38.979 "assigned_rate_limits": { 00:11:38.979 "rw_ios_per_sec": 0, 00:11:38.979 "rw_mbytes_per_sec": 0, 00:11:38.979 "r_mbytes_per_sec": 0, 00:11:38.979 "w_mbytes_per_sec": 0 00:11:38.979 }, 00:11:38.979 "claimed": true, 00:11:38.979 "claim_type": "exclusive_write", 00:11:38.979 "zoned": false, 00:11:38.979 "supported_io_types": { 00:11:38.979 "read": true, 00:11:38.979 "write": true, 00:11:38.979 "unmap": true, 00:11:38.979 "flush": true, 00:11:38.979 "reset": true, 00:11:38.979 "nvme_admin": false, 00:11:38.979 "nvme_io": false, 00:11:38.979 "nvme_io_md": false, 00:11:38.979 "write_zeroes": true, 00:11:38.979 "zcopy": true, 00:11:38.979 "get_zone_info": false, 00:11:38.979 "zone_management": false, 00:11:38.979 "zone_append": false, 00:11:38.979 "compare": false, 00:11:38.979 "compare_and_write": false, 00:11:38.979 "abort": true, 00:11:38.979 "seek_hole": false, 00:11:38.979 "seek_data": false, 00:11:38.979 "copy": true, 00:11:38.979 "nvme_iov_md": false 00:11:38.979 }, 00:11:38.979 "memory_domains": [ 00:11:38.979 { 00:11:38.979 "dma_device_id": "system", 00:11:38.979 "dma_device_type": 1 00:11:38.979 }, 00:11:38.979 { 00:11:38.979 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:38.979 "dma_device_type": 2 00:11:38.979 } 00:11:38.979 ], 00:11:38.979 "driver_specific": {} 00:11:38.979 } 00:11:38.979 ] 00:11:38.979 08:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:38.979 08:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:38.979 08:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:38.979 08:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:38.979 08:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid0 00:11:38.979 08:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:38.979 08:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:38.979 08:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:38.979 08:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:38.979 08:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:38.979 08:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:38.979 08:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:38.979 08:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:39.237 08:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:39.237 "name": "Existed_Raid", 00:11:39.237 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:39.237 "strip_size_kb": 64, 00:11:39.237 "state": "configuring", 00:11:39.237 "raid_level": "raid0", 00:11:39.237 "superblock": false, 00:11:39.237 "num_base_bdevs": 2, 00:11:39.237 "num_base_bdevs_discovered": 1, 00:11:39.237 "num_base_bdevs_operational": 2, 00:11:39.237 "base_bdevs_list": [ 00:11:39.237 { 00:11:39.237 "name": "BaseBdev1", 00:11:39.237 "uuid": "c22e8755-d738-4891-ae25-385e04b137c4", 00:11:39.237 "is_configured": true, 00:11:39.237 "data_offset": 0, 00:11:39.237 "data_size": 65536 00:11:39.237 }, 00:11:39.237 { 00:11:39.237 "name": "BaseBdev2", 00:11:39.237 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:39.237 "is_configured": false, 00:11:39.237 "data_offset": 0, 00:11:39.237 "data_size": 0 00:11:39.237 } 00:11:39.237 ] 00:11:39.237 }' 
00:11:39.237 08:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:39.237 08:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:39.531 08:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:39.790 [2024-07-23 08:24:52.167733] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:39.790 [2024-07-23 08:24:52.167780] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034880 name Existed_Raid, state configuring 00:11:39.790 08:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:40.049 [2024-07-23 08:24:52.340203] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:40.049 [2024-07-23 08:24:52.341780] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:40.049 [2024-07-23 08:24:52.341814] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:40.049 08:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:40.049 08:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:40.049 08:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:40.049 08:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:40.049 08:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:40.049 08:24:52 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:40.049 08:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:40.049 08:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:40.049 08:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:40.049 08:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:40.049 08:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:40.049 08:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:40.049 08:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:40.049 08:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:40.049 08:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:40.049 "name": "Existed_Raid", 00:11:40.049 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:40.049 "strip_size_kb": 64, 00:11:40.049 "state": "configuring", 00:11:40.049 "raid_level": "raid0", 00:11:40.049 "superblock": false, 00:11:40.049 "num_base_bdevs": 2, 00:11:40.049 "num_base_bdevs_discovered": 1, 00:11:40.049 "num_base_bdevs_operational": 2, 00:11:40.049 "base_bdevs_list": [ 00:11:40.049 { 00:11:40.049 "name": "BaseBdev1", 00:11:40.049 "uuid": "c22e8755-d738-4891-ae25-385e04b137c4", 00:11:40.049 "is_configured": true, 00:11:40.049 "data_offset": 0, 00:11:40.049 "data_size": 65536 00:11:40.049 }, 00:11:40.049 { 00:11:40.049 "name": "BaseBdev2", 00:11:40.049 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:40.049 "is_configured": false, 00:11:40.049 "data_offset": 0, 00:11:40.049 "data_size": 0 00:11:40.049 } 
00:11:40.049 ] 00:11:40.049 }' 00:11:40.049 08:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:40.049 08:24:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:40.617 08:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:40.876 [2024-07-23 08:24:53.215798] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:40.876 [2024-07-23 08:24:53.215841] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000035180 00:11:40.876 [2024-07-23 08:24:53.215849] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:11:40.876 [2024-07-23 08:24:53.216074] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bdf0 00:11:40.876 [2024-07-23 08:24:53.216246] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000035180 00:11:40.876 [2024-07-23 08:24:53.216260] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000035180 00:11:40.876 [2024-07-23 08:24:53.216530] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:40.876 BaseBdev2 00:11:40.876 08:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:40.876 08:24:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:40.876 08:24:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:40.876 08:24:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:40.876 08:24:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:40.876 08:24:53 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:40.876 08:24:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:41.135 08:24:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:41.135 [ 00:11:41.135 { 00:11:41.135 "name": "BaseBdev2", 00:11:41.135 "aliases": [ 00:11:41.135 "fcedeb2e-ef42-4aca-ace1-d18cab422ed0" 00:11:41.135 ], 00:11:41.135 "product_name": "Malloc disk", 00:11:41.135 "block_size": 512, 00:11:41.135 "num_blocks": 65536, 00:11:41.135 "uuid": "fcedeb2e-ef42-4aca-ace1-d18cab422ed0", 00:11:41.135 "assigned_rate_limits": { 00:11:41.135 "rw_ios_per_sec": 0, 00:11:41.135 "rw_mbytes_per_sec": 0, 00:11:41.135 "r_mbytes_per_sec": 0, 00:11:41.135 "w_mbytes_per_sec": 0 00:11:41.135 }, 00:11:41.135 "claimed": true, 00:11:41.135 "claim_type": "exclusive_write", 00:11:41.135 "zoned": false, 00:11:41.135 "supported_io_types": { 00:11:41.135 "read": true, 00:11:41.135 "write": true, 00:11:41.135 "unmap": true, 00:11:41.135 "flush": true, 00:11:41.135 "reset": true, 00:11:41.135 "nvme_admin": false, 00:11:41.135 "nvme_io": false, 00:11:41.135 "nvme_io_md": false, 00:11:41.135 "write_zeroes": true, 00:11:41.135 "zcopy": true, 00:11:41.135 "get_zone_info": false, 00:11:41.135 "zone_management": false, 00:11:41.135 "zone_append": false, 00:11:41.135 "compare": false, 00:11:41.135 "compare_and_write": false, 00:11:41.135 "abort": true, 00:11:41.135 "seek_hole": false, 00:11:41.135 "seek_data": false, 00:11:41.135 "copy": true, 00:11:41.135 "nvme_iov_md": false 00:11:41.135 }, 00:11:41.135 "memory_domains": [ 00:11:41.135 { 00:11:41.135 "dma_device_id": "system", 00:11:41.135 "dma_device_type": 1 00:11:41.135 }, 00:11:41.135 { 00:11:41.135 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:11:41.135 "dma_device_type": 2 00:11:41.135 } 00:11:41.135 ], 00:11:41.135 "driver_specific": {} 00:11:41.135 } 00:11:41.135 ] 00:11:41.135 08:24:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:41.135 08:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:41.135 08:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:41.135 08:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:11:41.135 08:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:41.135 08:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:41.135 08:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:41.135 08:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:41.136 08:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:41.136 08:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:41.136 08:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:41.136 08:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:41.136 08:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:41.136 08:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:41.136 08:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:41.395 08:24:53 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:41.395 "name": "Existed_Raid", 00:11:41.395 "uuid": "93ab7a8a-a8cd-4822-8998-150b077a4604", 00:11:41.395 "strip_size_kb": 64, 00:11:41.395 "state": "online", 00:11:41.395 "raid_level": "raid0", 00:11:41.395 "superblock": false, 00:11:41.395 "num_base_bdevs": 2, 00:11:41.395 "num_base_bdevs_discovered": 2, 00:11:41.395 "num_base_bdevs_operational": 2, 00:11:41.395 "base_bdevs_list": [ 00:11:41.395 { 00:11:41.395 "name": "BaseBdev1", 00:11:41.395 "uuid": "c22e8755-d738-4891-ae25-385e04b137c4", 00:11:41.395 "is_configured": true, 00:11:41.395 "data_offset": 0, 00:11:41.395 "data_size": 65536 00:11:41.395 }, 00:11:41.395 { 00:11:41.395 "name": "BaseBdev2", 00:11:41.395 "uuid": "fcedeb2e-ef42-4aca-ace1-d18cab422ed0", 00:11:41.395 "is_configured": true, 00:11:41.395 "data_offset": 0, 00:11:41.395 "data_size": 65536 00:11:41.395 } 00:11:41.395 ] 00:11:41.395 }' 00:11:41.395 08:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:41.395 08:24:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:41.963 08:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:41.963 08:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:41.963 08:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:41.963 08:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:41.963 08:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:41.963 08:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:41.963 08:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:41.963 08:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:41.963 [2024-07-23 08:24:54.379155] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:41.963 08:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:41.963 "name": "Existed_Raid", 00:11:41.963 "aliases": [ 00:11:41.964 "93ab7a8a-a8cd-4822-8998-150b077a4604" 00:11:41.964 ], 00:11:41.964 "product_name": "Raid Volume", 00:11:41.964 "block_size": 512, 00:11:41.964 "num_blocks": 131072, 00:11:41.964 "uuid": "93ab7a8a-a8cd-4822-8998-150b077a4604", 00:11:41.964 "assigned_rate_limits": { 00:11:41.964 "rw_ios_per_sec": 0, 00:11:41.964 "rw_mbytes_per_sec": 0, 00:11:41.964 "r_mbytes_per_sec": 0, 00:11:41.964 "w_mbytes_per_sec": 0 00:11:41.964 }, 00:11:41.964 "claimed": false, 00:11:41.964 "zoned": false, 00:11:41.964 "supported_io_types": { 00:11:41.964 "read": true, 00:11:41.964 "write": true, 00:11:41.964 "unmap": true, 00:11:41.964 "flush": true, 00:11:41.964 "reset": true, 00:11:41.964 "nvme_admin": false, 00:11:41.964 "nvme_io": false, 00:11:41.964 "nvme_io_md": false, 00:11:41.964 "write_zeroes": true, 00:11:41.964 "zcopy": false, 00:11:41.964 "get_zone_info": false, 00:11:41.964 "zone_management": false, 00:11:41.964 "zone_append": false, 00:11:41.964 "compare": false, 00:11:41.964 "compare_and_write": false, 00:11:41.964 "abort": false, 00:11:41.964 "seek_hole": false, 00:11:41.964 "seek_data": false, 00:11:41.964 "copy": false, 00:11:41.964 "nvme_iov_md": false 00:11:41.964 }, 00:11:41.964 "memory_domains": [ 00:11:41.964 { 00:11:41.964 "dma_device_id": "system", 00:11:41.964 "dma_device_type": 1 00:11:41.964 }, 00:11:41.964 { 00:11:41.964 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:41.964 "dma_device_type": 2 00:11:41.964 }, 00:11:41.964 { 00:11:41.964 "dma_device_id": "system", 00:11:41.964 "dma_device_type": 1 00:11:41.964 }, 
00:11:41.964 { 00:11:41.964 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:41.964 "dma_device_type": 2 00:11:41.964 } 00:11:41.964 ], 00:11:41.964 "driver_specific": { 00:11:41.964 "raid": { 00:11:41.964 "uuid": "93ab7a8a-a8cd-4822-8998-150b077a4604", 00:11:41.964 "strip_size_kb": 64, 00:11:41.964 "state": "online", 00:11:41.964 "raid_level": "raid0", 00:11:41.964 "superblock": false, 00:11:41.964 "num_base_bdevs": 2, 00:11:41.964 "num_base_bdevs_discovered": 2, 00:11:41.964 "num_base_bdevs_operational": 2, 00:11:41.964 "base_bdevs_list": [ 00:11:41.964 { 00:11:41.964 "name": "BaseBdev1", 00:11:41.964 "uuid": "c22e8755-d738-4891-ae25-385e04b137c4", 00:11:41.964 "is_configured": true, 00:11:41.964 "data_offset": 0, 00:11:41.964 "data_size": 65536 00:11:41.964 }, 00:11:41.964 { 00:11:41.964 "name": "BaseBdev2", 00:11:41.964 "uuid": "fcedeb2e-ef42-4aca-ace1-d18cab422ed0", 00:11:41.964 "is_configured": true, 00:11:41.964 "data_offset": 0, 00:11:41.964 "data_size": 65536 00:11:41.964 } 00:11:41.964 ] 00:11:41.964 } 00:11:41.964 } 00:11:41.964 }' 00:11:41.964 08:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:41.964 08:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:41.964 BaseBdev2' 00:11:41.964 08:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:41.964 08:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:41.964 08:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:42.223 08:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:42.223 "name": "BaseBdev1", 00:11:42.223 "aliases": [ 00:11:42.223 
"c22e8755-d738-4891-ae25-385e04b137c4" 00:11:42.223 ], 00:11:42.223 "product_name": "Malloc disk", 00:11:42.223 "block_size": 512, 00:11:42.223 "num_blocks": 65536, 00:11:42.223 "uuid": "c22e8755-d738-4891-ae25-385e04b137c4", 00:11:42.223 "assigned_rate_limits": { 00:11:42.223 "rw_ios_per_sec": 0, 00:11:42.223 "rw_mbytes_per_sec": 0, 00:11:42.223 "r_mbytes_per_sec": 0, 00:11:42.223 "w_mbytes_per_sec": 0 00:11:42.223 }, 00:11:42.223 "claimed": true, 00:11:42.223 "claim_type": "exclusive_write", 00:11:42.223 "zoned": false, 00:11:42.223 "supported_io_types": { 00:11:42.223 "read": true, 00:11:42.223 "write": true, 00:11:42.223 "unmap": true, 00:11:42.223 "flush": true, 00:11:42.223 "reset": true, 00:11:42.223 "nvme_admin": false, 00:11:42.223 "nvme_io": false, 00:11:42.223 "nvme_io_md": false, 00:11:42.223 "write_zeroes": true, 00:11:42.223 "zcopy": true, 00:11:42.223 "get_zone_info": false, 00:11:42.223 "zone_management": false, 00:11:42.223 "zone_append": false, 00:11:42.223 "compare": false, 00:11:42.223 "compare_and_write": false, 00:11:42.223 "abort": true, 00:11:42.223 "seek_hole": false, 00:11:42.223 "seek_data": false, 00:11:42.223 "copy": true, 00:11:42.223 "nvme_iov_md": false 00:11:42.223 }, 00:11:42.223 "memory_domains": [ 00:11:42.223 { 00:11:42.223 "dma_device_id": "system", 00:11:42.223 "dma_device_type": 1 00:11:42.223 }, 00:11:42.223 { 00:11:42.223 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:42.223 "dma_device_type": 2 00:11:42.223 } 00:11:42.223 ], 00:11:42.223 "driver_specific": {} 00:11:42.223 }' 00:11:42.223 08:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:42.223 08:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:42.223 08:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:42.223 08:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:42.482 08:24:54 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:42.482 08:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:42.482 08:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:42.482 08:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:42.482 08:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:42.482 08:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:42.482 08:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:42.482 08:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:42.482 08:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:42.482 08:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:42.482 08:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:42.741 08:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:42.741 "name": "BaseBdev2", 00:11:42.741 "aliases": [ 00:11:42.741 "fcedeb2e-ef42-4aca-ace1-d18cab422ed0" 00:11:42.741 ], 00:11:42.741 "product_name": "Malloc disk", 00:11:42.741 "block_size": 512, 00:11:42.741 "num_blocks": 65536, 00:11:42.741 "uuid": "fcedeb2e-ef42-4aca-ace1-d18cab422ed0", 00:11:42.741 "assigned_rate_limits": { 00:11:42.741 "rw_ios_per_sec": 0, 00:11:42.741 "rw_mbytes_per_sec": 0, 00:11:42.741 "r_mbytes_per_sec": 0, 00:11:42.741 "w_mbytes_per_sec": 0 00:11:42.741 }, 00:11:42.741 "claimed": true, 00:11:42.741 "claim_type": "exclusive_write", 00:11:42.741 "zoned": false, 00:11:42.741 "supported_io_types": { 00:11:42.741 "read": true, 
00:11:42.741 "write": true, 00:11:42.741 "unmap": true, 00:11:42.741 "flush": true, 00:11:42.741 "reset": true, 00:11:42.741 "nvme_admin": false, 00:11:42.741 "nvme_io": false, 00:11:42.741 "nvme_io_md": false, 00:11:42.741 "write_zeroes": true, 00:11:42.741 "zcopy": true, 00:11:42.741 "get_zone_info": false, 00:11:42.741 "zone_management": false, 00:11:42.741 "zone_append": false, 00:11:42.741 "compare": false, 00:11:42.741 "compare_and_write": false, 00:11:42.741 "abort": true, 00:11:42.741 "seek_hole": false, 00:11:42.741 "seek_data": false, 00:11:42.741 "copy": true, 00:11:42.741 "nvme_iov_md": false 00:11:42.741 }, 00:11:42.741 "memory_domains": [ 00:11:42.741 { 00:11:42.741 "dma_device_id": "system", 00:11:42.741 "dma_device_type": 1 00:11:42.741 }, 00:11:42.741 { 00:11:42.741 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:42.741 "dma_device_type": 2 00:11:42.741 } 00:11:42.741 ], 00:11:42.741 "driver_specific": {} 00:11:42.741 }' 00:11:42.741 08:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:42.741 08:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:42.741 08:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:42.741 08:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:42.741 08:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:43.001 08:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:43.001 08:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:43.001 08:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:43.001 08:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:43.001 08:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:43.001 
08:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:43.001 08:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:43.001 08:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:43.260 [2024-07-23 08:24:55.566106] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:43.260 [2024-07-23 08:24:55.566137] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:43.260 [2024-07-23 08:24:55.566183] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:43.260 08:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:43.260 08:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:11:43.260 08:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:43.260 08:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:43.260 08:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:43.260 08:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:11:43.260 08:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:43.260 08:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:43.260 08:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:43.260 08:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:43.260 08:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 
00:11:43.260 08:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:43.260 08:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:43.260 08:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:43.260 08:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:43.260 08:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:43.260 08:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:43.520 08:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:43.520 "name": "Existed_Raid", 00:11:43.520 "uuid": "93ab7a8a-a8cd-4822-8998-150b077a4604", 00:11:43.520 "strip_size_kb": 64, 00:11:43.520 "state": "offline", 00:11:43.520 "raid_level": "raid0", 00:11:43.520 "superblock": false, 00:11:43.520 "num_base_bdevs": 2, 00:11:43.520 "num_base_bdevs_discovered": 1, 00:11:43.520 "num_base_bdevs_operational": 1, 00:11:43.520 "base_bdevs_list": [ 00:11:43.520 { 00:11:43.520 "name": null, 00:11:43.520 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:43.520 "is_configured": false, 00:11:43.520 "data_offset": 0, 00:11:43.520 "data_size": 65536 00:11:43.520 }, 00:11:43.520 { 00:11:43.520 "name": "BaseBdev2", 00:11:43.520 "uuid": "fcedeb2e-ef42-4aca-ace1-d18cab422ed0", 00:11:43.520 "is_configured": true, 00:11:43.520 "data_offset": 0, 00:11:43.520 "data_size": 65536 00:11:43.520 } 00:11:43.520 ] 00:11:43.520 }' 00:11:43.520 08:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:43.520 08:24:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:43.778 08:24:56 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:43.778 08:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:43.778 08:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:43.778 08:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:44.037 08:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:44.037 08:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:44.037 08:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:44.297 [2024-07-23 08:24:56.609286] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:44.297 [2024-07-23 08:24:56.609347] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000035180 name Existed_Raid, state offline 00:11:44.297 08:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:44.297 08:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:44.297 08:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:44.297 08:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:44.556 08:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:44.556 08:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:44.556 08:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 
00:11:44.556 08:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1397682 00:11:44.556 08:24:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1397682 ']' 00:11:44.556 08:24:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1397682 00:11:44.556 08:24:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:11:44.556 08:24:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:44.556 08:24:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1397682 00:11:44.556 08:24:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:44.556 08:24:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:44.556 08:24:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1397682' 00:11:44.556 killing process with pid 1397682 00:11:44.556 08:24:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1397682 00:11:44.556 [2024-07-23 08:24:56.937062] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:44.556 08:24:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1397682 00:11:44.556 [2024-07-23 08:24:56.954746] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:45.935 08:24:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:11:45.935 00:11:45.935 real 0m9.414s 00:11:45.935 user 0m15.753s 00:11:45.935 sys 0m1.397s 00:11:45.935 08:24:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:45.935 08:24:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:45.935 ************************************ 00:11:45.935 END TEST 
raid_state_function_test 00:11:45.935 ************************************ 00:11:45.935 08:24:58 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:45.935 08:24:58 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 2 true 00:11:45.935 08:24:58 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:45.935 08:24:58 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:45.935 08:24:58 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:45.935 ************************************ 00:11:45.935 START TEST raid_state_function_test_sb 00:11:45.935 ************************************ 00:11:45.935 08:24:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 true 00:11:45.935 08:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:11:45.935 08:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:45.935 08:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:11:45.935 08:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:45.935 08:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:45.935 08:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:45.935 08:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:45.935 08:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:45.935 08:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:45.935 08:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:45.935 08:24:58 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:45.935 08:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:45.935 08:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:45.935 08:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:45.935 08:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:45.935 08:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:45.935 08:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:45.935 08:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:45.935 08:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:11:45.935 08:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:45.935 08:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:45.935 08:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:11:45.935 08:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:11:45.935 08:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1399672 00:11:45.935 08:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1399672' 00:11:45.935 Process raid pid: 1399672 00:11:45.935 08:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:45.935 08:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # 
waitforlisten 1399672 /var/tmp/spdk-raid.sock 00:11:45.935 08:24:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1399672 ']' 00:11:45.935 08:24:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:45.935 08:24:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:45.935 08:24:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:45.935 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:45.935 08:24:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:45.935 08:24:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:45.935 [2024-07-23 08:24:58.382191] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:11:45.935 [2024-07-23 08:24:58.382279] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:46.194 [2024-07-23 08:24:58.510704] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:46.453 [2024-07-23 08:24:58.737689] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:46.712 [2024-07-23 08:24:59.023477] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:46.712 [2024-07-23 08:24:59.023504] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:46.712 08:24:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:46.712 08:24:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:11:46.712 08:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:46.972 [2024-07-23 08:24:59.325360] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:46.972 [2024-07-23 08:24:59.325403] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:46.972 [2024-07-23 08:24:59.325413] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:46.972 [2024-07-23 08:24:59.325424] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:46.972 08:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:46.972 08:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:11:46.972 08:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:46.972 08:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:46.972 08:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:46.972 08:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:46.972 08:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:46.972 08:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:46.972 08:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:46.972 08:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:46.972 08:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:46.972 08:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:47.231 08:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:47.231 "name": "Existed_Raid", 00:11:47.231 "uuid": "9e0f9d4d-aebc-4e7f-b882-c42324625d26", 00:11:47.231 "strip_size_kb": 64, 00:11:47.231 "state": "configuring", 00:11:47.231 "raid_level": "raid0", 00:11:47.231 "superblock": true, 00:11:47.231 "num_base_bdevs": 2, 00:11:47.231 "num_base_bdevs_discovered": 0, 00:11:47.231 "num_base_bdevs_operational": 2, 00:11:47.231 "base_bdevs_list": [ 00:11:47.231 { 00:11:47.231 "name": "BaseBdev1", 00:11:47.231 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:47.231 "is_configured": false, 00:11:47.231 "data_offset": 0, 00:11:47.231 "data_size": 0 00:11:47.231 }, 00:11:47.231 { 
00:11:47.231 "name": "BaseBdev2", 00:11:47.231 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:47.231 "is_configured": false, 00:11:47.231 "data_offset": 0, 00:11:47.231 "data_size": 0 00:11:47.231 } 00:11:47.231 ] 00:11:47.231 }' 00:11:47.231 08:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:47.231 08:24:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:47.490 08:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:47.750 [2024-07-23 08:25:00.147410] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:47.750 [2024-07-23 08:25:00.147445] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034580 name Existed_Raid, state configuring 00:11:47.750 08:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:48.009 [2024-07-23 08:25:00.315860] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:48.009 [2024-07-23 08:25:00.315898] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:48.009 [2024-07-23 08:25:00.315907] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:48.009 [2024-07-23 08:25:00.315916] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:48.009 08:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:48.009 [2024-07-23 08:25:00.526738] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:48.268 BaseBdev1 00:11:48.268 08:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:48.268 08:25:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:48.268 08:25:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:48.268 08:25:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:48.268 08:25:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:48.268 08:25:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:48.269 08:25:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:48.269 08:25:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:48.528 [ 00:11:48.528 { 00:11:48.528 "name": "BaseBdev1", 00:11:48.528 "aliases": [ 00:11:48.528 "648fda51-45fe-4e23-9fd1-7af0f01e0eb0" 00:11:48.528 ], 00:11:48.528 "product_name": "Malloc disk", 00:11:48.528 "block_size": 512, 00:11:48.528 "num_blocks": 65536, 00:11:48.528 "uuid": "648fda51-45fe-4e23-9fd1-7af0f01e0eb0", 00:11:48.528 "assigned_rate_limits": { 00:11:48.528 "rw_ios_per_sec": 0, 00:11:48.528 "rw_mbytes_per_sec": 0, 00:11:48.528 "r_mbytes_per_sec": 0, 00:11:48.528 "w_mbytes_per_sec": 0 00:11:48.528 }, 00:11:48.528 "claimed": true, 00:11:48.528 "claim_type": "exclusive_write", 00:11:48.528 "zoned": false, 00:11:48.528 "supported_io_types": { 00:11:48.528 "read": true, 00:11:48.528 "write": true, 00:11:48.528 "unmap": true, 00:11:48.528 "flush": 
true, 00:11:48.528 "reset": true, 00:11:48.528 "nvme_admin": false, 00:11:48.528 "nvme_io": false, 00:11:48.528 "nvme_io_md": false, 00:11:48.528 "write_zeroes": true, 00:11:48.528 "zcopy": true, 00:11:48.528 "get_zone_info": false, 00:11:48.528 "zone_management": false, 00:11:48.528 "zone_append": false, 00:11:48.528 "compare": false, 00:11:48.528 "compare_and_write": false, 00:11:48.528 "abort": true, 00:11:48.528 "seek_hole": false, 00:11:48.528 "seek_data": false, 00:11:48.528 "copy": true, 00:11:48.528 "nvme_iov_md": false 00:11:48.528 }, 00:11:48.528 "memory_domains": [ 00:11:48.528 { 00:11:48.528 "dma_device_id": "system", 00:11:48.528 "dma_device_type": 1 00:11:48.528 }, 00:11:48.528 { 00:11:48.528 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:48.528 "dma_device_type": 2 00:11:48.528 } 00:11:48.528 ], 00:11:48.528 "driver_specific": {} 00:11:48.528 } 00:11:48.528 ] 00:11:48.528 08:25:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:48.528 08:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:48.528 08:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:48.528 08:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:48.528 08:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:48.528 08:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:48.528 08:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:48.528 08:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:48.528 08:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:48.528 08:25:00 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:48.528 08:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:48.528 08:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:48.528 08:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:48.528 08:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:48.528 "name": "Existed_Raid", 00:11:48.528 "uuid": "476f7cda-a092-4d4d-a614-95e3f3c6bd17", 00:11:48.528 "strip_size_kb": 64, 00:11:48.528 "state": "configuring", 00:11:48.528 "raid_level": "raid0", 00:11:48.528 "superblock": true, 00:11:48.528 "num_base_bdevs": 2, 00:11:48.528 "num_base_bdevs_discovered": 1, 00:11:48.528 "num_base_bdevs_operational": 2, 00:11:48.528 "base_bdevs_list": [ 00:11:48.528 { 00:11:48.528 "name": "BaseBdev1", 00:11:48.528 "uuid": "648fda51-45fe-4e23-9fd1-7af0f01e0eb0", 00:11:48.528 "is_configured": true, 00:11:48.528 "data_offset": 2048, 00:11:48.528 "data_size": 63488 00:11:48.528 }, 00:11:48.528 { 00:11:48.528 "name": "BaseBdev2", 00:11:48.528 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:48.528 "is_configured": false, 00:11:48.528 "data_offset": 0, 00:11:48.528 "data_size": 0 00:11:48.528 } 00:11:48.528 ] 00:11:48.528 }' 00:11:48.528 08:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:48.528 08:25:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:49.097 08:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:49.377 [2024-07-23 08:25:01.653735] 
bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:49.377 [2024-07-23 08:25:01.653782] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034880 name Existed_Raid, state configuring 00:11:49.377 08:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:49.377 [2024-07-23 08:25:01.818199] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:49.377 [2024-07-23 08:25:01.819761] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:49.377 [2024-07-23 08:25:01.819796] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:49.377 08:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:49.377 08:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:49.377 08:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:49.377 08:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:49.377 08:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:49.377 08:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:49.377 08:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:49.377 08:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:49.377 08:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:49.377 08:25:01 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:49.377 08:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:49.377 08:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:49.377 08:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:49.377 08:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:49.637 08:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:49.637 "name": "Existed_Raid", 00:11:49.637 "uuid": "80cd6a79-0aab-48a6-9e04-30f46ecfd63c", 00:11:49.637 "strip_size_kb": 64, 00:11:49.637 "state": "configuring", 00:11:49.637 "raid_level": "raid0", 00:11:49.637 "superblock": true, 00:11:49.637 "num_base_bdevs": 2, 00:11:49.637 "num_base_bdevs_discovered": 1, 00:11:49.637 "num_base_bdevs_operational": 2, 00:11:49.637 "base_bdevs_list": [ 00:11:49.637 { 00:11:49.637 "name": "BaseBdev1", 00:11:49.637 "uuid": "648fda51-45fe-4e23-9fd1-7af0f01e0eb0", 00:11:49.637 "is_configured": true, 00:11:49.637 "data_offset": 2048, 00:11:49.637 "data_size": 63488 00:11:49.637 }, 00:11:49.637 { 00:11:49.637 "name": "BaseBdev2", 00:11:49.637 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:49.637 "is_configured": false, 00:11:49.637 "data_offset": 0, 00:11:49.637 "data_size": 0 00:11:49.637 } 00:11:49.637 ] 00:11:49.637 }' 00:11:49.637 08:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:49.637 08:25:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:50.227 08:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:50.228 [2024-07-23 08:25:02.716251] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:50.228 [2024-07-23 08:25:02.716460] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000035180 00:11:50.228 [2024-07-23 08:25:02.716476] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:50.228 [2024-07-23 08:25:02.716701] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bdf0 00:11:50.228 [2024-07-23 08:25:02.716871] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000035180 00:11:50.228 [2024-07-23 08:25:02.716882] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000035180 00:11:50.228 [2024-07-23 08:25:02.717016] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:50.228 BaseBdev2 00:11:50.228 08:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:50.228 08:25:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:50.228 08:25:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:50.228 08:25:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:50.228 08:25:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:50.228 08:25:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:50.228 08:25:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:50.486 08:25:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:50.745 [ 00:11:50.745 { 00:11:50.745 "name": "BaseBdev2", 00:11:50.745 "aliases": [ 00:11:50.745 "0a133959-4a32-48d8-97cb-14a47717d019" 00:11:50.745 ], 00:11:50.745 "product_name": "Malloc disk", 00:11:50.745 "block_size": 512, 00:11:50.745 "num_blocks": 65536, 00:11:50.745 "uuid": "0a133959-4a32-48d8-97cb-14a47717d019", 00:11:50.745 "assigned_rate_limits": { 00:11:50.745 "rw_ios_per_sec": 0, 00:11:50.745 "rw_mbytes_per_sec": 0, 00:11:50.745 "r_mbytes_per_sec": 0, 00:11:50.745 "w_mbytes_per_sec": 0 00:11:50.745 }, 00:11:50.745 "claimed": true, 00:11:50.745 "claim_type": "exclusive_write", 00:11:50.745 "zoned": false, 00:11:50.745 "supported_io_types": { 00:11:50.745 "read": true, 00:11:50.745 "write": true, 00:11:50.745 "unmap": true, 00:11:50.746 "flush": true, 00:11:50.746 "reset": true, 00:11:50.746 "nvme_admin": false, 00:11:50.746 "nvme_io": false, 00:11:50.746 "nvme_io_md": false, 00:11:50.746 "write_zeroes": true, 00:11:50.746 "zcopy": true, 00:11:50.746 "get_zone_info": false, 00:11:50.746 "zone_management": false, 00:11:50.746 "zone_append": false, 00:11:50.746 "compare": false, 00:11:50.746 "compare_and_write": false, 00:11:50.746 "abort": true, 00:11:50.746 "seek_hole": false, 00:11:50.746 "seek_data": false, 00:11:50.746 "copy": true, 00:11:50.746 "nvme_iov_md": false 00:11:50.746 }, 00:11:50.746 "memory_domains": [ 00:11:50.746 { 00:11:50.746 "dma_device_id": "system", 00:11:50.746 "dma_device_type": 1 00:11:50.746 }, 00:11:50.746 { 00:11:50.746 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:50.746 "dma_device_type": 2 00:11:50.746 } 00:11:50.746 ], 00:11:50.746 "driver_specific": {} 00:11:50.746 } 00:11:50.746 ] 00:11:50.746 08:25:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:50.746 08:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 
00:11:50.746 08:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:50.746 08:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:11:50.746 08:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:50.746 08:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:50.746 08:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:50.746 08:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:50.746 08:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:50.746 08:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:50.746 08:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:50.746 08:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:50.746 08:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:50.746 08:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:50.746 08:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:51.005 08:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:51.005 "name": "Existed_Raid", 00:11:51.005 "uuid": "80cd6a79-0aab-48a6-9e04-30f46ecfd63c", 00:11:51.005 "strip_size_kb": 64, 00:11:51.005 "state": "online", 00:11:51.005 "raid_level": "raid0", 00:11:51.005 "superblock": true, 00:11:51.005 
"num_base_bdevs": 2, 00:11:51.005 "num_base_bdevs_discovered": 2, 00:11:51.005 "num_base_bdevs_operational": 2, 00:11:51.005 "base_bdevs_list": [ 00:11:51.005 { 00:11:51.005 "name": "BaseBdev1", 00:11:51.005 "uuid": "648fda51-45fe-4e23-9fd1-7af0f01e0eb0", 00:11:51.005 "is_configured": true, 00:11:51.005 "data_offset": 2048, 00:11:51.005 "data_size": 63488 00:11:51.005 }, 00:11:51.005 { 00:11:51.005 "name": "BaseBdev2", 00:11:51.005 "uuid": "0a133959-4a32-48d8-97cb-14a47717d019", 00:11:51.005 "is_configured": true, 00:11:51.005 "data_offset": 2048, 00:11:51.005 "data_size": 63488 00:11:51.005 } 00:11:51.005 ] 00:11:51.005 }' 00:11:51.005 08:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:51.005 08:25:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:51.263 08:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:51.263 08:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:51.264 08:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:51.264 08:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:51.264 08:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:51.264 08:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:11:51.264 08:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:51.264 08:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:51.527 [2024-07-23 08:25:03.911805] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:51.527 08:25:03 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:51.527 "name": "Existed_Raid", 00:11:51.527 "aliases": [ 00:11:51.527 "80cd6a79-0aab-48a6-9e04-30f46ecfd63c" 00:11:51.527 ], 00:11:51.527 "product_name": "Raid Volume", 00:11:51.527 "block_size": 512, 00:11:51.527 "num_blocks": 126976, 00:11:51.527 "uuid": "80cd6a79-0aab-48a6-9e04-30f46ecfd63c", 00:11:51.527 "assigned_rate_limits": { 00:11:51.527 "rw_ios_per_sec": 0, 00:11:51.527 "rw_mbytes_per_sec": 0, 00:11:51.527 "r_mbytes_per_sec": 0, 00:11:51.527 "w_mbytes_per_sec": 0 00:11:51.527 }, 00:11:51.527 "claimed": false, 00:11:51.527 "zoned": false, 00:11:51.527 "supported_io_types": { 00:11:51.527 "read": true, 00:11:51.527 "write": true, 00:11:51.527 "unmap": true, 00:11:51.527 "flush": true, 00:11:51.527 "reset": true, 00:11:51.527 "nvme_admin": false, 00:11:51.527 "nvme_io": false, 00:11:51.527 "nvme_io_md": false, 00:11:51.527 "write_zeroes": true, 00:11:51.527 "zcopy": false, 00:11:51.527 "get_zone_info": false, 00:11:51.527 "zone_management": false, 00:11:51.527 "zone_append": false, 00:11:51.527 "compare": false, 00:11:51.527 "compare_and_write": false, 00:11:51.527 "abort": false, 00:11:51.527 "seek_hole": false, 00:11:51.527 "seek_data": false, 00:11:51.527 "copy": false, 00:11:51.527 "nvme_iov_md": false 00:11:51.527 }, 00:11:51.527 "memory_domains": [ 00:11:51.527 { 00:11:51.527 "dma_device_id": "system", 00:11:51.527 "dma_device_type": 1 00:11:51.527 }, 00:11:51.527 { 00:11:51.527 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:51.527 "dma_device_type": 2 00:11:51.527 }, 00:11:51.527 { 00:11:51.527 "dma_device_id": "system", 00:11:51.527 "dma_device_type": 1 00:11:51.527 }, 00:11:51.527 { 00:11:51.527 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:51.527 "dma_device_type": 2 00:11:51.527 } 00:11:51.527 ], 00:11:51.527 "driver_specific": { 00:11:51.527 "raid": { 00:11:51.527 "uuid": "80cd6a79-0aab-48a6-9e04-30f46ecfd63c", 00:11:51.527 "strip_size_kb": 64, 
00:11:51.527 "state": "online", 00:11:51.527 "raid_level": "raid0", 00:11:51.527 "superblock": true, 00:11:51.527 "num_base_bdevs": 2, 00:11:51.527 "num_base_bdevs_discovered": 2, 00:11:51.527 "num_base_bdevs_operational": 2, 00:11:51.527 "base_bdevs_list": [ 00:11:51.527 { 00:11:51.527 "name": "BaseBdev1", 00:11:51.527 "uuid": "648fda51-45fe-4e23-9fd1-7af0f01e0eb0", 00:11:51.527 "is_configured": true, 00:11:51.527 "data_offset": 2048, 00:11:51.527 "data_size": 63488 00:11:51.527 }, 00:11:51.527 { 00:11:51.527 "name": "BaseBdev2", 00:11:51.527 "uuid": "0a133959-4a32-48d8-97cb-14a47717d019", 00:11:51.527 "is_configured": true, 00:11:51.527 "data_offset": 2048, 00:11:51.527 "data_size": 63488 00:11:51.527 } 00:11:51.527 ] 00:11:51.527 } 00:11:51.527 } 00:11:51.527 }' 00:11:51.527 08:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:51.527 08:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:51.527 BaseBdev2' 00:11:51.527 08:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:51.527 08:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:51.528 08:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:51.821 08:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:51.821 "name": "BaseBdev1", 00:11:51.821 "aliases": [ 00:11:51.821 "648fda51-45fe-4e23-9fd1-7af0f01e0eb0" 00:11:51.821 ], 00:11:51.821 "product_name": "Malloc disk", 00:11:51.821 "block_size": 512, 00:11:51.821 "num_blocks": 65536, 00:11:51.821 "uuid": "648fda51-45fe-4e23-9fd1-7af0f01e0eb0", 00:11:51.821 "assigned_rate_limits": { 00:11:51.821 "rw_ios_per_sec": 0, 
00:11:51.821 "rw_mbytes_per_sec": 0, 00:11:51.821 "r_mbytes_per_sec": 0, 00:11:51.821 "w_mbytes_per_sec": 0 00:11:51.821 }, 00:11:51.821 "claimed": true, 00:11:51.821 "claim_type": "exclusive_write", 00:11:51.821 "zoned": false, 00:11:51.821 "supported_io_types": { 00:11:51.821 "read": true, 00:11:51.821 "write": true, 00:11:51.821 "unmap": true, 00:11:51.821 "flush": true, 00:11:51.821 "reset": true, 00:11:51.821 "nvme_admin": false, 00:11:51.821 "nvme_io": false, 00:11:51.821 "nvme_io_md": false, 00:11:51.821 "write_zeroes": true, 00:11:51.821 "zcopy": true, 00:11:51.821 "get_zone_info": false, 00:11:51.821 "zone_management": false, 00:11:51.821 "zone_append": false, 00:11:51.821 "compare": false, 00:11:51.821 "compare_and_write": false, 00:11:51.821 "abort": true, 00:11:51.821 "seek_hole": false, 00:11:51.821 "seek_data": false, 00:11:51.821 "copy": true, 00:11:51.821 "nvme_iov_md": false 00:11:51.821 }, 00:11:51.821 "memory_domains": [ 00:11:51.821 { 00:11:51.821 "dma_device_id": "system", 00:11:51.821 "dma_device_type": 1 00:11:51.821 }, 00:11:51.821 { 00:11:51.821 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:51.821 "dma_device_type": 2 00:11:51.821 } 00:11:51.821 ], 00:11:51.821 "driver_specific": {} 00:11:51.821 }' 00:11:51.821 08:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:51.821 08:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:51.821 08:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:51.821 08:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:51.821 08:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:51.821 08:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:51.821 08:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:52.080 
08:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:52.080 08:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:52.080 08:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:52.080 08:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:52.080 08:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:52.080 08:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:52.080 08:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:52.080 08:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:52.338 08:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:52.338 "name": "BaseBdev2", 00:11:52.338 "aliases": [ 00:11:52.338 "0a133959-4a32-48d8-97cb-14a47717d019" 00:11:52.338 ], 00:11:52.338 "product_name": "Malloc disk", 00:11:52.338 "block_size": 512, 00:11:52.338 "num_blocks": 65536, 00:11:52.338 "uuid": "0a133959-4a32-48d8-97cb-14a47717d019", 00:11:52.338 "assigned_rate_limits": { 00:11:52.338 "rw_ios_per_sec": 0, 00:11:52.338 "rw_mbytes_per_sec": 0, 00:11:52.338 "r_mbytes_per_sec": 0, 00:11:52.338 "w_mbytes_per_sec": 0 00:11:52.338 }, 00:11:52.338 "claimed": true, 00:11:52.338 "claim_type": "exclusive_write", 00:11:52.338 "zoned": false, 00:11:52.338 "supported_io_types": { 00:11:52.338 "read": true, 00:11:52.338 "write": true, 00:11:52.338 "unmap": true, 00:11:52.338 "flush": true, 00:11:52.338 "reset": true, 00:11:52.338 "nvme_admin": false, 00:11:52.338 "nvme_io": false, 00:11:52.338 "nvme_io_md": false, 00:11:52.338 "write_zeroes": true, 00:11:52.338 "zcopy": true, 
00:11:52.338 "get_zone_info": false, 00:11:52.338 "zone_management": false, 00:11:52.338 "zone_append": false, 00:11:52.338 "compare": false, 00:11:52.338 "compare_and_write": false, 00:11:52.338 "abort": true, 00:11:52.338 "seek_hole": false, 00:11:52.338 "seek_data": false, 00:11:52.338 "copy": true, 00:11:52.338 "nvme_iov_md": false 00:11:52.338 }, 00:11:52.338 "memory_domains": [ 00:11:52.338 { 00:11:52.338 "dma_device_id": "system", 00:11:52.338 "dma_device_type": 1 00:11:52.338 }, 00:11:52.338 { 00:11:52.339 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:52.339 "dma_device_type": 2 00:11:52.339 } 00:11:52.339 ], 00:11:52.339 "driver_specific": {} 00:11:52.339 }' 00:11:52.339 08:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:52.339 08:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:52.339 08:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:52.339 08:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:52.339 08:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:52.339 08:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:52.339 08:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:52.339 08:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:52.597 08:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:52.597 08:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:52.597 08:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:52.597 08:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:52.597 08:25:04 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:52.597 [2024-07-23 08:25:05.106791] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:52.597 [2024-07-23 08:25:05.106820] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:52.597 [2024-07-23 08:25:05.106868] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:52.856 08:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:52.856 08:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:11:52.856 08:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:52.856 08:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:11:52.856 08:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:52.856 08:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:11:52.856 08:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:52.856 08:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:52.856 08:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:52.856 08:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:52.856 08:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:52.856 08:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:52.856 08:25:05 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:52.856 08:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:52.856 08:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:52.856 08:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:52.856 08:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:52.856 08:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:52.856 "name": "Existed_Raid", 00:11:52.856 "uuid": "80cd6a79-0aab-48a6-9e04-30f46ecfd63c", 00:11:52.856 "strip_size_kb": 64, 00:11:52.856 "state": "offline", 00:11:52.856 "raid_level": "raid0", 00:11:52.856 "superblock": true, 00:11:52.856 "num_base_bdevs": 2, 00:11:52.856 "num_base_bdevs_discovered": 1, 00:11:52.856 "num_base_bdevs_operational": 1, 00:11:52.856 "base_bdevs_list": [ 00:11:52.856 { 00:11:52.856 "name": null, 00:11:52.856 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:52.856 "is_configured": false, 00:11:52.856 "data_offset": 2048, 00:11:52.856 "data_size": 63488 00:11:52.856 }, 00:11:52.856 { 00:11:52.856 "name": "BaseBdev2", 00:11:52.856 "uuid": "0a133959-4a32-48d8-97cb-14a47717d019", 00:11:52.856 "is_configured": true, 00:11:52.856 "data_offset": 2048, 00:11:52.856 "data_size": 63488 00:11:52.856 } 00:11:52.856 ] 00:11:52.856 }' 00:11:52.856 08:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:52.856 08:25:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:53.423 08:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:53.423 08:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < 
num_base_bdevs )) 00:11:53.423 08:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:53.423 08:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:53.682 08:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:53.682 08:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:53.682 08:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:53.682 [2024-07-23 08:25:06.130220] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:53.682 [2024-07-23 08:25:06.130271] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000035180 name Existed_Raid, state offline 00:11:53.941 08:25:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:53.941 08:25:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:53.941 08:25:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:53.941 08:25:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:53.941 08:25:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:53.941 08:25:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:53.941 08:25:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:53.941 08:25:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 
1399672 00:11:53.941 08:25:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1399672 ']' 00:11:53.941 08:25:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1399672 00:11:53.941 08:25:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:11:53.941 08:25:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:53.941 08:25:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1399672 00:11:53.941 08:25:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:53.941 08:25:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:53.941 08:25:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1399672' 00:11:53.941 killing process with pid 1399672 00:11:53.941 08:25:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1399672 00:11:53.941 [2024-07-23 08:25:06.457029] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:53.941 08:25:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1399672 00:11:54.200 [2024-07-23 08:25:06.475163] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:55.579 08:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:11:55.579 00:11:55.579 real 0m9.465s 00:11:55.579 user 0m15.745s 00:11:55.579 sys 0m1.438s 00:11:55.579 08:25:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:55.579 08:25:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:55.579 ************************************ 00:11:55.579 END TEST raid_state_function_test_sb 00:11:55.579 
************************************ 00:11:55.579 08:25:07 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:55.579 08:25:07 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 2 00:11:55.579 08:25:07 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:11:55.579 08:25:07 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:55.579 08:25:07 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:55.579 ************************************ 00:11:55.579 START TEST raid_superblock_test 00:11:55.579 ************************************ 00:11:55.579 08:25:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 2 00:11:55.579 08:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:11:55.579 08:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:11:55.579 08:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:11:55.579 08:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:11:55.579 08:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:11:55.579 08:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:11:55.579 08:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:11:55.579 08:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:11:55.579 08:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:11:55.579 08:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:11:55.579 08:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:11:55.579 08:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- 
# local raid_bdev_uuid 00:11:55.579 08:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:11:55.579 08:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:11:55.579 08:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:11:55.579 08:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:11:55.579 08:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1401661 00:11:55.579 08:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1401661 /var/tmp/spdk-raid.sock 00:11:55.579 08:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:11:55.579 08:25:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1401661 ']' 00:11:55.579 08:25:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:55.579 08:25:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:55.579 08:25:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:55.579 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:55.579 08:25:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:55.579 08:25:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:55.579 [2024-07-23 08:25:07.908845] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:11:55.579 [2024-07-23 08:25:07.908932] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1401661 ] 00:11:55.579 [2024-07-23 08:25:08.053016] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:55.839 [2024-07-23 08:25:08.272685] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:56.098 [2024-07-23 08:25:08.521145] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:56.098 [2024-07-23 08:25:08.521175] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:56.357 08:25:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:56.357 08:25:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:11:56.357 08:25:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:11:56.357 08:25:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:56.357 08:25:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:11:56.357 08:25:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:11:56.357 08:25:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:11:56.357 08:25:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:56.357 08:25:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:56.357 08:25:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:56.357 08:25:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:11:56.617 malloc1 00:11:56.617 08:25:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:56.617 [2024-07-23 08:25:09.063085] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:56.617 [2024-07-23 08:25:09.063136] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:56.617 [2024-07-23 08:25:09.063158] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034880 00:11:56.617 [2024-07-23 08:25:09.063170] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:56.617 [2024-07-23 08:25:09.065184] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:56.617 [2024-07-23 08:25:09.065213] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:56.617 pt1 00:11:56.617 08:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:56.617 08:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:56.617 08:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:11:56.617 08:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:11:56.617 08:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:11:56.617 08:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:56.617 08:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:56.617 08:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:56.617 08:25:09 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:11:56.876 malloc2 00:11:56.876 08:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:57.135 [2024-07-23 08:25:09.461980] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:57.135 [2024-07-23 08:25:09.462033] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:57.135 [2024-07-23 08:25:09.462053] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035480 00:11:57.135 [2024-07-23 08:25:09.462062] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:57.135 [2024-07-23 08:25:09.463990] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:57.135 [2024-07-23 08:25:09.464018] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:57.135 pt2 00:11:57.135 08:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:57.135 08:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:57.135 08:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s 00:11:57.135 [2024-07-23 08:25:09.630419] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:57.135 [2024-07-23 08:25:09.632016] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:57.135 [2024-07-23 08:25:09.632196] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 
0x616000035a80 00:11:57.135 [2024-07-23 08:25:09.632208] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:57.135 [2024-07-23 08:25:09.632456] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bdf0 00:11:57.135 [2024-07-23 08:25:09.632646] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000035a80 00:11:57.135 [2024-07-23 08:25:09.632659] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000035a80 00:11:57.135 [2024-07-23 08:25:09.632807] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:57.135 08:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:57.135 08:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:57.135 08:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:57.135 08:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:57.135 08:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:57.135 08:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:57.135 08:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:57.135 08:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:57.135 08:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:57.135 08:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:57.135 08:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:57.135 08:25:09 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:57.394 08:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:57.394 "name": "raid_bdev1", 00:11:57.394 "uuid": "fa38d8dc-ddcf-4fdf-8d54-39a90a017b2e", 00:11:57.394 "strip_size_kb": 64, 00:11:57.394 "state": "online", 00:11:57.394 "raid_level": "raid0", 00:11:57.394 "superblock": true, 00:11:57.394 "num_base_bdevs": 2, 00:11:57.394 "num_base_bdevs_discovered": 2, 00:11:57.394 "num_base_bdevs_operational": 2, 00:11:57.394 "base_bdevs_list": [ 00:11:57.394 { 00:11:57.394 "name": "pt1", 00:11:57.394 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:57.394 "is_configured": true, 00:11:57.394 "data_offset": 2048, 00:11:57.394 "data_size": 63488 00:11:57.394 }, 00:11:57.394 { 00:11:57.394 "name": "pt2", 00:11:57.394 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:57.394 "is_configured": true, 00:11:57.394 "data_offset": 2048, 00:11:57.394 "data_size": 63488 00:11:57.394 } 00:11:57.394 ] 00:11:57.394 }' 00:11:57.394 08:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:57.394 08:25:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:57.962 08:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:11:57.962 08:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:57.962 08:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:57.962 08:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:57.962 08:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:57.962 08:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:57.962 08:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:57.962 08:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:57.962 [2024-07-23 08:25:10.464849] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:58.221 08:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:58.221 "name": "raid_bdev1", 00:11:58.221 "aliases": [ 00:11:58.221 "fa38d8dc-ddcf-4fdf-8d54-39a90a017b2e" 00:11:58.221 ], 00:11:58.221 "product_name": "Raid Volume", 00:11:58.221 "block_size": 512, 00:11:58.221 "num_blocks": 126976, 00:11:58.221 "uuid": "fa38d8dc-ddcf-4fdf-8d54-39a90a017b2e", 00:11:58.221 "assigned_rate_limits": { 00:11:58.221 "rw_ios_per_sec": 0, 00:11:58.221 "rw_mbytes_per_sec": 0, 00:11:58.221 "r_mbytes_per_sec": 0, 00:11:58.221 "w_mbytes_per_sec": 0 00:11:58.221 }, 00:11:58.221 "claimed": false, 00:11:58.221 "zoned": false, 00:11:58.221 "supported_io_types": { 00:11:58.221 "read": true, 00:11:58.221 "write": true, 00:11:58.221 "unmap": true, 00:11:58.221 "flush": true, 00:11:58.221 "reset": true, 00:11:58.221 "nvme_admin": false, 00:11:58.221 "nvme_io": false, 00:11:58.221 "nvme_io_md": false, 00:11:58.221 "write_zeroes": true, 00:11:58.221 "zcopy": false, 00:11:58.221 "get_zone_info": false, 00:11:58.221 "zone_management": false, 00:11:58.221 "zone_append": false, 00:11:58.221 "compare": false, 00:11:58.221 "compare_and_write": false, 00:11:58.221 "abort": false, 00:11:58.221 "seek_hole": false, 00:11:58.221 "seek_data": false, 00:11:58.221 "copy": false, 00:11:58.221 "nvme_iov_md": false 00:11:58.221 }, 00:11:58.221 "memory_domains": [ 00:11:58.221 { 00:11:58.221 "dma_device_id": "system", 00:11:58.221 "dma_device_type": 1 00:11:58.221 }, 00:11:58.221 { 00:11:58.221 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:58.221 "dma_device_type": 2 00:11:58.221 }, 00:11:58.221 { 00:11:58.221 "dma_device_id": "system", 
00:11:58.221 "dma_device_type": 1 00:11:58.221 }, 00:11:58.221 { 00:11:58.221 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:58.221 "dma_device_type": 2 00:11:58.221 } 00:11:58.221 ], 00:11:58.221 "driver_specific": { 00:11:58.221 "raid": { 00:11:58.221 "uuid": "fa38d8dc-ddcf-4fdf-8d54-39a90a017b2e", 00:11:58.221 "strip_size_kb": 64, 00:11:58.221 "state": "online", 00:11:58.221 "raid_level": "raid0", 00:11:58.221 "superblock": true, 00:11:58.221 "num_base_bdevs": 2, 00:11:58.221 "num_base_bdevs_discovered": 2, 00:11:58.221 "num_base_bdevs_operational": 2, 00:11:58.221 "base_bdevs_list": [ 00:11:58.221 { 00:11:58.221 "name": "pt1", 00:11:58.221 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:58.221 "is_configured": true, 00:11:58.221 "data_offset": 2048, 00:11:58.221 "data_size": 63488 00:11:58.221 }, 00:11:58.221 { 00:11:58.221 "name": "pt2", 00:11:58.221 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:58.221 "is_configured": true, 00:11:58.221 "data_offset": 2048, 00:11:58.221 "data_size": 63488 00:11:58.221 } 00:11:58.221 ] 00:11:58.221 } 00:11:58.221 } 00:11:58.221 }' 00:11:58.221 08:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:58.221 08:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:58.221 pt2' 00:11:58.221 08:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:58.221 08:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:58.221 08:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:58.221 08:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:58.221 "name": "pt1", 00:11:58.221 "aliases": [ 00:11:58.221 "00000000-0000-0000-0000-000000000001" 
00:11:58.221 ], 00:11:58.221 "product_name": "passthru", 00:11:58.221 "block_size": 512, 00:11:58.221 "num_blocks": 65536, 00:11:58.221 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:58.221 "assigned_rate_limits": { 00:11:58.221 "rw_ios_per_sec": 0, 00:11:58.221 "rw_mbytes_per_sec": 0, 00:11:58.221 "r_mbytes_per_sec": 0, 00:11:58.221 "w_mbytes_per_sec": 0 00:11:58.221 }, 00:11:58.221 "claimed": true, 00:11:58.221 "claim_type": "exclusive_write", 00:11:58.221 "zoned": false, 00:11:58.221 "supported_io_types": { 00:11:58.221 "read": true, 00:11:58.221 "write": true, 00:11:58.221 "unmap": true, 00:11:58.221 "flush": true, 00:11:58.221 "reset": true, 00:11:58.221 "nvme_admin": false, 00:11:58.221 "nvme_io": false, 00:11:58.221 "nvme_io_md": false, 00:11:58.221 "write_zeroes": true, 00:11:58.221 "zcopy": true, 00:11:58.221 "get_zone_info": false, 00:11:58.221 "zone_management": false, 00:11:58.221 "zone_append": false, 00:11:58.221 "compare": false, 00:11:58.221 "compare_and_write": false, 00:11:58.221 "abort": true, 00:11:58.221 "seek_hole": false, 00:11:58.221 "seek_data": false, 00:11:58.221 "copy": true, 00:11:58.221 "nvme_iov_md": false 00:11:58.221 }, 00:11:58.221 "memory_domains": [ 00:11:58.221 { 00:11:58.221 "dma_device_id": "system", 00:11:58.221 "dma_device_type": 1 00:11:58.221 }, 00:11:58.221 { 00:11:58.221 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:58.221 "dma_device_type": 2 00:11:58.221 } 00:11:58.221 ], 00:11:58.221 "driver_specific": { 00:11:58.221 "passthru": { 00:11:58.221 "name": "pt1", 00:11:58.221 "base_bdev_name": "malloc1" 00:11:58.221 } 00:11:58.221 } 00:11:58.221 }' 00:11:58.221 08:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:58.480 08:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:58.480 08:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:58.480 08:25:10 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:58.480 08:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:58.480 08:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:58.480 08:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:58.480 08:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:58.480 08:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:58.480 08:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:58.480 08:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:58.480 08:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:58.480 08:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:58.480 08:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:58.480 08:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:58.739 08:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:58.739 "name": "pt2", 00:11:58.739 "aliases": [ 00:11:58.739 "00000000-0000-0000-0000-000000000002" 00:11:58.739 ], 00:11:58.739 "product_name": "passthru", 00:11:58.739 "block_size": 512, 00:11:58.739 "num_blocks": 65536, 00:11:58.739 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:58.739 "assigned_rate_limits": { 00:11:58.739 "rw_ios_per_sec": 0, 00:11:58.739 "rw_mbytes_per_sec": 0, 00:11:58.739 "r_mbytes_per_sec": 0, 00:11:58.739 "w_mbytes_per_sec": 0 00:11:58.739 }, 00:11:58.739 "claimed": true, 00:11:58.739 "claim_type": "exclusive_write", 00:11:58.739 "zoned": false, 00:11:58.739 "supported_io_types": { 00:11:58.739 "read": true, 
00:11:58.739 "write": true, 00:11:58.739 "unmap": true, 00:11:58.739 "flush": true, 00:11:58.739 "reset": true, 00:11:58.739 "nvme_admin": false, 00:11:58.739 "nvme_io": false, 00:11:58.739 "nvme_io_md": false, 00:11:58.739 "write_zeroes": true, 00:11:58.739 "zcopy": true, 00:11:58.739 "get_zone_info": false, 00:11:58.739 "zone_management": false, 00:11:58.739 "zone_append": false, 00:11:58.739 "compare": false, 00:11:58.739 "compare_and_write": false, 00:11:58.739 "abort": true, 00:11:58.739 "seek_hole": false, 00:11:58.739 "seek_data": false, 00:11:58.739 "copy": true, 00:11:58.739 "nvme_iov_md": false 00:11:58.739 }, 00:11:58.739 "memory_domains": [ 00:11:58.739 { 00:11:58.739 "dma_device_id": "system", 00:11:58.739 "dma_device_type": 1 00:11:58.739 }, 00:11:58.739 { 00:11:58.739 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:58.739 "dma_device_type": 2 00:11:58.739 } 00:11:58.739 ], 00:11:58.739 "driver_specific": { 00:11:58.739 "passthru": { 00:11:58.739 "name": "pt2", 00:11:58.739 "base_bdev_name": "malloc2" 00:11:58.739 } 00:11:58.739 } 00:11:58.739 }' 00:11:58.739 08:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:58.739 08:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:58.739 08:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:58.739 08:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:58.997 08:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:58.997 08:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:58.997 08:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:58.997 08:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:58.997 08:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:58.997 08:25:11 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:58.997 08:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:58.997 08:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:58.997 08:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:58.997 08:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:11:59.256 [2024-07-23 08:25:11.639920] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:59.256 08:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=fa38d8dc-ddcf-4fdf-8d54-39a90a017b2e 00:11:59.256 08:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z fa38d8dc-ddcf-4fdf-8d54-39a90a017b2e ']' 00:11:59.256 08:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:59.515 [2024-07-23 08:25:11.812135] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:59.515 [2024-07-23 08:25:11.812161] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:59.515 [2024-07-23 08:25:11.812234] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:59.515 [2024-07-23 08:25:11.812282] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:59.515 [2024-07-23 08:25:11.812297] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000035a80 name raid_bdev1, state offline 00:11:59.515 08:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:59.515 08:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:11:59.515 08:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:11:59.515 08:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:11:59.515 08:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:59.515 08:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:11:59.774 08:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:59.774 08:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:00.033 08:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:12:00.033 08:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:12:00.033 08:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:12:00.033 08:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:12:00.033 08:25:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:12:00.033 08:25:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n 
raid_bdev1 00:12:00.033 08:25:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:00.033 08:25:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:00.033 08:25:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:00.033 08:25:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:00.033 08:25:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:00.033 08:25:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:00.033 08:25:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:00.033 08:25:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:12:00.033 08:25:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:12:00.292 [2024-07-23 08:25:12.674408] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:12:00.292 [2024-07-23 08:25:12.675988] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:12:00.292 [2024-07-23 08:25:12.676046] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:12:00.292 [2024-07-23 08:25:12.676098] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on 
bdev malloc2 00:12:00.292 [2024-07-23 08:25:12.676113] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:00.292 [2024-07-23 08:25:12.676124] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036080 name raid_bdev1, state configuring 00:12:00.292 request: 00:12:00.292 { 00:12:00.292 "name": "raid_bdev1", 00:12:00.292 "raid_level": "raid0", 00:12:00.292 "base_bdevs": [ 00:12:00.292 "malloc1", 00:12:00.292 "malloc2" 00:12:00.292 ], 00:12:00.292 "strip_size_kb": 64, 00:12:00.292 "superblock": false, 00:12:00.292 "method": "bdev_raid_create", 00:12:00.292 "req_id": 1 00:12:00.292 } 00:12:00.292 Got JSON-RPC error response 00:12:00.292 response: 00:12:00.292 { 00:12:00.292 "code": -17, 00:12:00.292 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:12:00.292 } 00:12:00.292 08:25:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:12:00.292 08:25:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:12:00.292 08:25:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:12:00.292 08:25:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:12:00.292 08:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:00.292 08:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:12:00.551 08:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:12:00.551 08:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:12:00.551 08:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:00.551 
[2024-07-23 08:25:13.019274] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:00.551 [2024-07-23 08:25:13.019327] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:00.551 [2024-07-23 08:25:13.019346] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036680 00:12:00.551 [2024-07-23 08:25:13.019357] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:00.551 [2024-07-23 08:25:13.021384] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:00.551 [2024-07-23 08:25:13.021413] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:00.551 [2024-07-23 08:25:13.021491] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:00.551 [2024-07-23 08:25:13.021563] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:00.551 pt1 00:12:00.551 08:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 2 00:12:00.551 08:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:00.551 08:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:00.551 08:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:00.551 08:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:00.551 08:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:00.551 08:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:00.551 08:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:00.551 08:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:12:00.551 08:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:00.551 08:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:00.552 08:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:00.811 08:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:00.811 "name": "raid_bdev1", 00:12:00.811 "uuid": "fa38d8dc-ddcf-4fdf-8d54-39a90a017b2e", 00:12:00.811 "strip_size_kb": 64, 00:12:00.811 "state": "configuring", 00:12:00.811 "raid_level": "raid0", 00:12:00.811 "superblock": true, 00:12:00.811 "num_base_bdevs": 2, 00:12:00.811 "num_base_bdevs_discovered": 1, 00:12:00.811 "num_base_bdevs_operational": 2, 00:12:00.811 "base_bdevs_list": [ 00:12:00.811 { 00:12:00.811 "name": "pt1", 00:12:00.811 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:00.811 "is_configured": true, 00:12:00.811 "data_offset": 2048, 00:12:00.811 "data_size": 63488 00:12:00.811 }, 00:12:00.811 { 00:12:00.811 "name": null, 00:12:00.811 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:00.811 "is_configured": false, 00:12:00.811 "data_offset": 2048, 00:12:00.811 "data_size": 63488 00:12:00.811 } 00:12:00.811 ] 00:12:00.811 }' 00:12:00.811 08:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:00.811 08:25:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:01.377 08:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:12:01.377 08:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:12:01.377 08:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:01.377 08:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:01.377 [2024-07-23 08:25:13.853486] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:01.377 [2024-07-23 08:25:13.853547] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:01.377 [2024-07-23 08:25:13.853564] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036f80 00:12:01.377 [2024-07-23 08:25:13.853574] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:01.377 [2024-07-23 08:25:13.854011] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:01.377 [2024-07-23 08:25:13.854031] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:01.377 [2024-07-23 08:25:13.854107] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:01.377 [2024-07-23 08:25:13.854130] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:01.377 [2024-07-23 08:25:13.854263] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000036c80 00:12:01.377 [2024-07-23 08:25:13.854277] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:01.377 [2024-07-23 08:25:13.854491] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bec0 00:12:01.377 [2024-07-23 08:25:13.854663] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000036c80 00:12:01.377 [2024-07-23 08:25:13.854673] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000036c80 00:12:01.377 [2024-07-23 08:25:13.854808] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:01.377 pt2 00:12:01.377 08:25:13 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@477 -- # (( i++ )) 00:12:01.377 08:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:01.377 08:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:12:01.377 08:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:01.377 08:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:01.377 08:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:01.377 08:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:01.377 08:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:01.377 08:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:01.377 08:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:01.377 08:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:01.377 08:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:01.377 08:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:01.377 08:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:01.636 08:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:01.636 "name": "raid_bdev1", 00:12:01.636 "uuid": "fa38d8dc-ddcf-4fdf-8d54-39a90a017b2e", 00:12:01.636 "strip_size_kb": 64, 00:12:01.636 "state": "online", 00:12:01.636 "raid_level": "raid0", 00:12:01.636 "superblock": true, 00:12:01.636 "num_base_bdevs": 2, 00:12:01.636 "num_base_bdevs_discovered": 2, 00:12:01.636 
"num_base_bdevs_operational": 2, 00:12:01.636 "base_bdevs_list": [ 00:12:01.636 { 00:12:01.636 "name": "pt1", 00:12:01.636 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:01.636 "is_configured": true, 00:12:01.636 "data_offset": 2048, 00:12:01.636 "data_size": 63488 00:12:01.636 }, 00:12:01.636 { 00:12:01.636 "name": "pt2", 00:12:01.636 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:01.636 "is_configured": true, 00:12:01.636 "data_offset": 2048, 00:12:01.636 "data_size": 63488 00:12:01.636 } 00:12:01.636 ] 00:12:01.636 }' 00:12:01.636 08:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:01.636 08:25:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:02.204 08:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:12:02.204 08:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:02.204 08:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:02.204 08:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:02.204 08:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:02.204 08:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:02.204 08:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:02.204 08:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:02.204 [2024-07-23 08:25:14.663888] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:02.204 08:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:02.204 "name": "raid_bdev1", 00:12:02.204 "aliases": [ 00:12:02.204 
"fa38d8dc-ddcf-4fdf-8d54-39a90a017b2e" 00:12:02.204 ], 00:12:02.204 "product_name": "Raid Volume", 00:12:02.204 "block_size": 512, 00:12:02.204 "num_blocks": 126976, 00:12:02.204 "uuid": "fa38d8dc-ddcf-4fdf-8d54-39a90a017b2e", 00:12:02.204 "assigned_rate_limits": { 00:12:02.204 "rw_ios_per_sec": 0, 00:12:02.204 "rw_mbytes_per_sec": 0, 00:12:02.204 "r_mbytes_per_sec": 0, 00:12:02.204 "w_mbytes_per_sec": 0 00:12:02.204 }, 00:12:02.204 "claimed": false, 00:12:02.204 "zoned": false, 00:12:02.204 "supported_io_types": { 00:12:02.204 "read": true, 00:12:02.204 "write": true, 00:12:02.204 "unmap": true, 00:12:02.204 "flush": true, 00:12:02.204 "reset": true, 00:12:02.204 "nvme_admin": false, 00:12:02.204 "nvme_io": false, 00:12:02.204 "nvme_io_md": false, 00:12:02.204 "write_zeroes": true, 00:12:02.204 "zcopy": false, 00:12:02.204 "get_zone_info": false, 00:12:02.204 "zone_management": false, 00:12:02.204 "zone_append": false, 00:12:02.204 "compare": false, 00:12:02.204 "compare_and_write": false, 00:12:02.204 "abort": false, 00:12:02.204 "seek_hole": false, 00:12:02.204 "seek_data": false, 00:12:02.204 "copy": false, 00:12:02.204 "nvme_iov_md": false 00:12:02.204 }, 00:12:02.204 "memory_domains": [ 00:12:02.204 { 00:12:02.204 "dma_device_id": "system", 00:12:02.204 "dma_device_type": 1 00:12:02.204 }, 00:12:02.204 { 00:12:02.204 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:02.204 "dma_device_type": 2 00:12:02.204 }, 00:12:02.204 { 00:12:02.204 "dma_device_id": "system", 00:12:02.204 "dma_device_type": 1 00:12:02.204 }, 00:12:02.204 { 00:12:02.204 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:02.204 "dma_device_type": 2 00:12:02.204 } 00:12:02.204 ], 00:12:02.204 "driver_specific": { 00:12:02.204 "raid": { 00:12:02.204 "uuid": "fa38d8dc-ddcf-4fdf-8d54-39a90a017b2e", 00:12:02.204 "strip_size_kb": 64, 00:12:02.204 "state": "online", 00:12:02.204 "raid_level": "raid0", 00:12:02.204 "superblock": true, 00:12:02.204 "num_base_bdevs": 2, 00:12:02.204 
"num_base_bdevs_discovered": 2, 00:12:02.204 "num_base_bdevs_operational": 2, 00:12:02.204 "base_bdevs_list": [ 00:12:02.204 { 00:12:02.204 "name": "pt1", 00:12:02.204 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:02.204 "is_configured": true, 00:12:02.204 "data_offset": 2048, 00:12:02.204 "data_size": 63488 00:12:02.204 }, 00:12:02.204 { 00:12:02.204 "name": "pt2", 00:12:02.204 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:02.204 "is_configured": true, 00:12:02.204 "data_offset": 2048, 00:12:02.204 "data_size": 63488 00:12:02.204 } 00:12:02.204 ] 00:12:02.204 } 00:12:02.204 } 00:12:02.204 }' 00:12:02.204 08:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:02.463 08:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:02.463 pt2' 00:12:02.463 08:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:02.463 08:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:02.463 08:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:02.463 08:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:02.463 "name": "pt1", 00:12:02.463 "aliases": [ 00:12:02.463 "00000000-0000-0000-0000-000000000001" 00:12:02.463 ], 00:12:02.463 "product_name": "passthru", 00:12:02.463 "block_size": 512, 00:12:02.463 "num_blocks": 65536, 00:12:02.463 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:02.463 "assigned_rate_limits": { 00:12:02.463 "rw_ios_per_sec": 0, 00:12:02.463 "rw_mbytes_per_sec": 0, 00:12:02.463 "r_mbytes_per_sec": 0, 00:12:02.463 "w_mbytes_per_sec": 0 00:12:02.463 }, 00:12:02.463 "claimed": true, 00:12:02.463 "claim_type": "exclusive_write", 00:12:02.463 "zoned": false, 
00:12:02.463 "supported_io_types": { 00:12:02.463 "read": true, 00:12:02.463 "write": true, 00:12:02.463 "unmap": true, 00:12:02.463 "flush": true, 00:12:02.463 "reset": true, 00:12:02.463 "nvme_admin": false, 00:12:02.463 "nvme_io": false, 00:12:02.463 "nvme_io_md": false, 00:12:02.463 "write_zeroes": true, 00:12:02.463 "zcopy": true, 00:12:02.463 "get_zone_info": false, 00:12:02.463 "zone_management": false, 00:12:02.463 "zone_append": false, 00:12:02.463 "compare": false, 00:12:02.463 "compare_and_write": false, 00:12:02.463 "abort": true, 00:12:02.463 "seek_hole": false, 00:12:02.463 "seek_data": false, 00:12:02.463 "copy": true, 00:12:02.463 "nvme_iov_md": false 00:12:02.463 }, 00:12:02.463 "memory_domains": [ 00:12:02.463 { 00:12:02.463 "dma_device_id": "system", 00:12:02.463 "dma_device_type": 1 00:12:02.463 }, 00:12:02.463 { 00:12:02.463 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:02.463 "dma_device_type": 2 00:12:02.463 } 00:12:02.463 ], 00:12:02.463 "driver_specific": { 00:12:02.463 "passthru": { 00:12:02.463 "name": "pt1", 00:12:02.463 "base_bdev_name": "malloc1" 00:12:02.463 } 00:12:02.463 } 00:12:02.463 }' 00:12:02.463 08:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:02.463 08:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:02.722 08:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:02.722 08:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:02.722 08:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:02.722 08:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:02.722 08:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:02.722 08:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:02.722 08:25:15 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:02.722 08:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:02.722 08:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:02.722 08:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:02.722 08:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:02.722 08:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:02.722 08:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:02.981 08:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:02.981 "name": "pt2", 00:12:02.981 "aliases": [ 00:12:02.981 "00000000-0000-0000-0000-000000000002" 00:12:02.981 ], 00:12:02.981 "product_name": "passthru", 00:12:02.981 "block_size": 512, 00:12:02.981 "num_blocks": 65536, 00:12:02.981 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:02.981 "assigned_rate_limits": { 00:12:02.981 "rw_ios_per_sec": 0, 00:12:02.981 "rw_mbytes_per_sec": 0, 00:12:02.981 "r_mbytes_per_sec": 0, 00:12:02.981 "w_mbytes_per_sec": 0 00:12:02.981 }, 00:12:02.981 "claimed": true, 00:12:02.981 "claim_type": "exclusive_write", 00:12:02.981 "zoned": false, 00:12:02.981 "supported_io_types": { 00:12:02.981 "read": true, 00:12:02.981 "write": true, 00:12:02.981 "unmap": true, 00:12:02.981 "flush": true, 00:12:02.981 "reset": true, 00:12:02.981 "nvme_admin": false, 00:12:02.981 "nvme_io": false, 00:12:02.981 "nvme_io_md": false, 00:12:02.981 "write_zeroes": true, 00:12:02.981 "zcopy": true, 00:12:02.981 "get_zone_info": false, 00:12:02.981 "zone_management": false, 00:12:02.981 "zone_append": false, 00:12:02.981 "compare": false, 00:12:02.981 "compare_and_write": false, 00:12:02.981 "abort": true, 00:12:02.981 
"seek_hole": false, 00:12:02.981 "seek_data": false, 00:12:02.981 "copy": true, 00:12:02.981 "nvme_iov_md": false 00:12:02.981 }, 00:12:02.981 "memory_domains": [ 00:12:02.981 { 00:12:02.981 "dma_device_id": "system", 00:12:02.981 "dma_device_type": 1 00:12:02.981 }, 00:12:02.981 { 00:12:02.981 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:02.981 "dma_device_type": 2 00:12:02.981 } 00:12:02.981 ], 00:12:02.981 "driver_specific": { 00:12:02.981 "passthru": { 00:12:02.981 "name": "pt2", 00:12:02.981 "base_bdev_name": "malloc2" 00:12:02.981 } 00:12:02.981 } 00:12:02.981 }' 00:12:02.981 08:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:02.981 08:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:02.981 08:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:02.981 08:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:03.240 08:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:03.240 08:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:03.240 08:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:03.240 08:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:03.240 08:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:03.240 08:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:03.240 08:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:03.240 08:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:03.240 08:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:12:03.240 08:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:03.501 [2024-07-23 08:25:15.818924] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:03.501 08:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' fa38d8dc-ddcf-4fdf-8d54-39a90a017b2e '!=' fa38d8dc-ddcf-4fdf-8d54-39a90a017b2e ']' 00:12:03.501 08:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:12:03.501 08:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:03.501 08:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:03.501 08:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1401661 00:12:03.501 08:25:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1401661 ']' 00:12:03.501 08:25:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1401661 00:12:03.501 08:25:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:12:03.501 08:25:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:03.501 08:25:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1401661 00:12:03.501 08:25:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:03.501 08:25:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:03.501 08:25:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1401661' 00:12:03.501 killing process with pid 1401661 00:12:03.501 08:25:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1401661 00:12:03.501 [2024-07-23 08:25:15.879256] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:03.501 
[2024-07-23 08:25:15.879343] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:03.501 [2024-07-23 08:25:15.879388] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:03.501 [2024-07-23 08:25:15.879400] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036c80 name raid_bdev1, state offline 00:12:03.502 08:25:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1401661 00:12:03.797 [2024-07-23 08:25:16.022142] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:05.176 08:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:12:05.176 00:12:05.176 real 0m9.434s 00:12:05.176 user 0m15.797s 00:12:05.176 sys 0m1.439s 00:12:05.176 08:25:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:05.176 08:25:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:05.176 ************************************ 00:12:05.176 END TEST raid_superblock_test 00:12:05.176 ************************************ 00:12:05.176 08:25:17 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:05.176 08:25:17 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 2 read 00:12:05.176 08:25:17 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:05.176 08:25:17 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:05.176 08:25:17 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:05.176 ************************************ 00:12:05.176 START TEST raid_read_error_test 00:12:05.176 ************************************ 00:12:05.176 08:25:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 read 00:12:05.176 08:25:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:12:05.176 08:25:17 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:05.176 08:25:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:12:05.176 08:25:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:05.176 08:25:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:05.176 08:25:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:05.176 08:25:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:05.176 08:25:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:05.176 08:25:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:05.176 08:25:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:05.176 08:25:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:05.176 08:25:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:05.176 08:25:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:05.176 08:25:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:05.176 08:25:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:05.176 08:25:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:05.176 08:25:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:05.176 08:25:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:05.176 08:25:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:12:05.176 08:25:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:12:05.176 08:25:17 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:12:05.176 08:25:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:05.176 08:25:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.4EjaGu9PmW 00:12:05.176 08:25:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1403641 00:12:05.176 08:25:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1403641 /var/tmp/spdk-raid.sock 00:12:05.176 08:25:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:05.176 08:25:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1403641 ']' 00:12:05.176 08:25:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:05.176 08:25:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:05.176 08:25:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:05.176 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:05.176 08:25:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:05.176 08:25:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:05.176 [2024-07-23 08:25:17.416310] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:12:05.176 [2024-07-23 08:25:17.416400] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1403641 ] 00:12:05.176 [2024-07-23 08:25:17.540826] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:05.435 [2024-07-23 08:25:17.753831] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:05.694 [2024-07-23 08:25:18.028195] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:05.694 [2024-07-23 08:25:18.028223] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:05.694 08:25:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:05.694 08:25:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:05.694 08:25:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:05.694 08:25:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:05.953 BaseBdev1_malloc 00:12:05.953 08:25:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:06.212 true 00:12:06.212 08:25:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:06.212 [2024-07-23 08:25:18.720040] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:06.212 [2024-07-23 08:25:18.720094] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:12:06.212 [2024-07-23 08:25:18.720113] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034b80 00:12:06.212 [2024-07-23 08:25:18.720124] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:06.212 [2024-07-23 08:25:18.722094] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:06.212 [2024-07-23 08:25:18.722126] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:06.212 BaseBdev1 00:12:06.471 08:25:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:06.471 08:25:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:06.471 BaseBdev2_malloc 00:12:06.471 08:25:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:06.730 true 00:12:06.730 08:25:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:07.007 [2024-07-23 08:25:19.261302] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:07.007 [2024-07-23 08:25:19.261352] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:07.007 [2024-07-23 08:25:19.261370] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035a80 00:12:07.007 [2024-07-23 08:25:19.261383] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:07.007 [2024-07-23 08:25:19.263363] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:07.007 [2024-07-23 
08:25:19.263391] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:07.007 BaseBdev2 00:12:07.007 08:25:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:07.008 [2024-07-23 08:25:19.417759] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:07.008 [2024-07-23 08:25:19.419407] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:07.008 [2024-07-23 08:25:19.419616] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000036080 00:12:07.008 [2024-07-23 08:25:19.419635] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:07.008 [2024-07-23 08:25:19.419888] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bdf0 00:12:07.008 [2024-07-23 08:25:19.420092] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000036080 00:12:07.008 [2024-07-23 08:25:19.420102] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000036080 00:12:07.008 [2024-07-23 08:25:19.420272] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:07.008 08:25:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:12:07.008 08:25:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:07.008 08:25:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:07.008 08:25:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:07.008 08:25:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:07.008 08:25:19 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:07.008 08:25:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:07.008 08:25:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:07.008 08:25:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:07.008 08:25:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:07.008 08:25:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:07.008 08:25:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:07.266 08:25:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:07.266 "name": "raid_bdev1", 00:12:07.266 "uuid": "5d68349c-f573-43c2-b1a9-c81e743f69ad", 00:12:07.266 "strip_size_kb": 64, 00:12:07.266 "state": "online", 00:12:07.266 "raid_level": "raid0", 00:12:07.266 "superblock": true, 00:12:07.266 "num_base_bdevs": 2, 00:12:07.266 "num_base_bdevs_discovered": 2, 00:12:07.266 "num_base_bdevs_operational": 2, 00:12:07.266 "base_bdevs_list": [ 00:12:07.266 { 00:12:07.266 "name": "BaseBdev1", 00:12:07.266 "uuid": "e2da866f-7046-50a1-9679-afd9cb657e40", 00:12:07.266 "is_configured": true, 00:12:07.266 "data_offset": 2048, 00:12:07.266 "data_size": 63488 00:12:07.266 }, 00:12:07.266 { 00:12:07.266 "name": "BaseBdev2", 00:12:07.266 "uuid": "1b61051a-5397-56ee-aab2-0d9f2a92f449", 00:12:07.266 "is_configured": true, 00:12:07.266 "data_offset": 2048, 00:12:07.266 "data_size": 63488 00:12:07.266 } 00:12:07.266 ] 00:12:07.266 }' 00:12:07.266 08:25:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:07.266 08:25:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- 
# set +x 00:12:07.833 08:25:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:07.833 08:25:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:07.833 [2024-07-23 08:25:20.177291] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bf90 00:12:08.771 08:25:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:12:08.771 08:25:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:08.771 08:25:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:12:08.771 08:25:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:12:08.771 08:25:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:12:08.771 08:25:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:08.771 08:25:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:08.771 08:25:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:08.771 08:25:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:08.771 08:25:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:08.771 08:25:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:08.771 08:25:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:08.771 08:25:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:12:08.771 08:25:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:08.771 08:25:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:08.771 08:25:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:09.030 08:25:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:09.030 "name": "raid_bdev1", 00:12:09.030 "uuid": "5d68349c-f573-43c2-b1a9-c81e743f69ad", 00:12:09.030 "strip_size_kb": 64, 00:12:09.030 "state": "online", 00:12:09.030 "raid_level": "raid0", 00:12:09.030 "superblock": true, 00:12:09.030 "num_base_bdevs": 2, 00:12:09.030 "num_base_bdevs_discovered": 2, 00:12:09.030 "num_base_bdevs_operational": 2, 00:12:09.030 "base_bdevs_list": [ 00:12:09.030 { 00:12:09.030 "name": "BaseBdev1", 00:12:09.030 "uuid": "e2da866f-7046-50a1-9679-afd9cb657e40", 00:12:09.030 "is_configured": true, 00:12:09.030 "data_offset": 2048, 00:12:09.030 "data_size": 63488 00:12:09.030 }, 00:12:09.030 { 00:12:09.030 "name": "BaseBdev2", 00:12:09.030 "uuid": "1b61051a-5397-56ee-aab2-0d9f2a92f449", 00:12:09.030 "is_configured": true, 00:12:09.030 "data_offset": 2048, 00:12:09.030 "data_size": 63488 00:12:09.030 } 00:12:09.030 ] 00:12:09.030 }' 00:12:09.030 08:25:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:09.030 08:25:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:09.612 08:25:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:09.612 [2024-07-23 08:25:22.083087] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:09.612 [2024-07-23 08:25:22.083128] 
bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:09.612 [2024-07-23 08:25:22.085529] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:09.612 [2024-07-23 08:25:22.085570] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:09.612 [2024-07-23 08:25:22.085601] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:09.612 [2024-07-23 08:25:22.085616] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036080 name raid_bdev1, state offline 00:12:09.612 0 00:12:09.612 08:25:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1403641 00:12:09.612 08:25:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1403641 ']' 00:12:09.612 08:25:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1403641 00:12:09.612 08:25:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:12:09.612 08:25:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:09.612 08:25:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1403641 00:12:09.871 08:25:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:09.871 08:25:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:09.871 08:25:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1403641' 00:12:09.871 killing process with pid 1403641 00:12:09.871 08:25:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1403641 00:12:09.871 [2024-07-23 08:25:22.144913] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:09.871 08:25:22 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@972 -- # wait 1403641 00:12:09.871 [2024-07-23 08:25:22.222513] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:11.250 08:25:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.4EjaGu9PmW 00:12:11.250 08:25:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:11.250 08:25:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:11.250 08:25:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.53 00:12:11.250 08:25:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:12:11.250 08:25:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:11.250 08:25:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:11.250 08:25:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.53 != \0\.\0\0 ]] 00:12:11.250 00:12:11.250 real 0m6.260s 00:12:11.250 user 0m8.763s 00:12:11.250 sys 0m0.829s 00:12:11.250 08:25:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:11.250 08:25:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:11.250 ************************************ 00:12:11.250 END TEST raid_read_error_test 00:12:11.250 ************************************ 00:12:11.250 08:25:23 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:11.250 08:25:23 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 2 write 00:12:11.250 08:25:23 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:11.250 08:25:23 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:11.250 08:25:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:11.250 ************************************ 00:12:11.250 START TEST raid_write_error_test 00:12:11.250 
************************************ 00:12:11.250 08:25:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 write 00:12:11.250 08:25:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:12:11.250 08:25:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:11.250 08:25:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:12:11.250 08:25:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:11.250 08:25:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:11.250 08:25:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:11.250 08:25:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:11.250 08:25:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:11.250 08:25:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:11.250 08:25:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:11.250 08:25:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:11.250 08:25:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:11.250 08:25:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:11.250 08:25:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:11.250 08:25:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:11.250 08:25:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:11.250 08:25:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:11.250 08:25:23 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:11.250 08:25:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:12:11.250 08:25:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:12:11.250 08:25:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:12:11.250 08:25:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:11.250 08:25:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.EEzvjbuqpB 00:12:11.250 08:25:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1404969 00:12:11.250 08:25:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1404969 /var/tmp/spdk-raid.sock 00:12:11.250 08:25:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:11.251 08:25:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1404969 ']' 00:12:11.251 08:25:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:11.251 08:25:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:11.251 08:25:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:11.251 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:12:11.251 08:25:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:11.251 08:25:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:11.251 [2024-07-23 08:25:23.741720] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:12:11.251 [2024-07-23 08:25:23.741808] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1404969 ] 00:12:11.509 [2024-07-23 08:25:23.866171] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:11.768 [2024-07-23 08:25:24.077894] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:12.027 [2024-07-23 08:25:24.354405] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:12.027 [2024-07-23 08:25:24.354436] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:12.027 08:25:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:12.027 08:25:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:12.027 08:25:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:12.027 08:25:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:12.286 BaseBdev1_malloc 00:12:12.286 08:25:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:12.544 true 00:12:12.544 08:25:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:12.544 [2024-07-23 08:25:25.047496] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:12.544 [2024-07-23 08:25:25.047549] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:12.544 [2024-07-23 08:25:25.047584] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034b80 00:12:12.544 [2024-07-23 08:25:25.047595] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:12.544 [2024-07-23 08:25:25.049554] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:12.544 [2024-07-23 08:25:25.049585] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:12.544 BaseBdev1 00:12:12.544 08:25:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:12.544 08:25:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:12.803 BaseBdev2_malloc 00:12:12.803 08:25:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:13.062 true 00:12:13.062 08:25:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:13.063 [2024-07-23 08:25:25.577450] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:13.063 [2024-07-23 08:25:25.577507] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:13.063 
[2024-07-23 08:25:25.577527] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035a80 00:12:13.063 [2024-07-23 08:25:25.577541] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:13.063 [2024-07-23 08:25:25.579582] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:13.063 [2024-07-23 08:25:25.579621] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:13.322 BaseBdev2 00:12:13.322 08:25:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:13.322 [2024-07-23 08:25:25.742092] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:13.322 [2024-07-23 08:25:25.744000] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:13.322 [2024-07-23 08:25:25.744228] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000036080 00:12:13.322 [2024-07-23 08:25:25.744248] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:13.322 [2024-07-23 08:25:25.744542] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bdf0 00:12:13.322 [2024-07-23 08:25:25.744794] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000036080 00:12:13.322 [2024-07-23 08:25:25.744807] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000036080 00:12:13.322 [2024-07-23 08:25:25.744996] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:13.322 08:25:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:12:13.322 08:25:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 
-- # local raid_bdev_name=raid_bdev1 00:12:13.322 08:25:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:13.322 08:25:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:13.322 08:25:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:13.322 08:25:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:13.322 08:25:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:13.322 08:25:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:13.322 08:25:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:13.322 08:25:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:13.322 08:25:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:13.322 08:25:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:13.595 08:25:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:13.595 "name": "raid_bdev1", 00:12:13.595 "uuid": "36961c55-0761-4be3-81fd-6dc9fab77942", 00:12:13.595 "strip_size_kb": 64, 00:12:13.595 "state": "online", 00:12:13.595 "raid_level": "raid0", 00:12:13.595 "superblock": true, 00:12:13.595 "num_base_bdevs": 2, 00:12:13.595 "num_base_bdevs_discovered": 2, 00:12:13.595 "num_base_bdevs_operational": 2, 00:12:13.595 "base_bdevs_list": [ 00:12:13.595 { 00:12:13.595 "name": "BaseBdev1", 00:12:13.595 "uuid": "719d85e1-c222-5ca9-8562-719d7c0f763f", 00:12:13.595 "is_configured": true, 00:12:13.595 "data_offset": 2048, 00:12:13.595 "data_size": 63488 00:12:13.595 }, 00:12:13.595 { 00:12:13.595 "name": "BaseBdev2", 
00:12:13.595 "uuid": "080ad8dd-f1ec-5614-8d7e-70da6f55c241", 00:12:13.595 "is_configured": true, 00:12:13.595 "data_offset": 2048, 00:12:13.595 "data_size": 63488 00:12:13.595 } 00:12:13.595 ] 00:12:13.595 }' 00:12:13.595 08:25:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:13.595 08:25:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:14.164 08:25:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:14.164 08:25:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:14.164 [2024-07-23 08:25:26.477430] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bf90 00:12:15.103 08:25:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:12:15.103 08:25:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:15.103 08:25:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:12:15.103 08:25:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:12:15.103 08:25:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:12:15.103 08:25:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:15.103 08:25:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:15.103 08:25:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:15.103 08:25:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:15.103 08:25:27 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:15.103 08:25:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:15.103 08:25:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:15.103 08:25:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:15.103 08:25:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:15.103 08:25:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:15.103 08:25:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:15.362 08:25:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:15.362 "name": "raid_bdev1", 00:12:15.362 "uuid": "36961c55-0761-4be3-81fd-6dc9fab77942", 00:12:15.362 "strip_size_kb": 64, 00:12:15.362 "state": "online", 00:12:15.362 "raid_level": "raid0", 00:12:15.362 "superblock": true, 00:12:15.362 "num_base_bdevs": 2, 00:12:15.362 "num_base_bdevs_discovered": 2, 00:12:15.362 "num_base_bdevs_operational": 2, 00:12:15.362 "base_bdevs_list": [ 00:12:15.362 { 00:12:15.362 "name": "BaseBdev1", 00:12:15.362 "uuid": "719d85e1-c222-5ca9-8562-719d7c0f763f", 00:12:15.362 "is_configured": true, 00:12:15.362 "data_offset": 2048, 00:12:15.362 "data_size": 63488 00:12:15.362 }, 00:12:15.362 { 00:12:15.362 "name": "BaseBdev2", 00:12:15.362 "uuid": "080ad8dd-f1ec-5614-8d7e-70da6f55c241", 00:12:15.362 "is_configured": true, 00:12:15.362 "data_offset": 2048, 00:12:15.362 "data_size": 63488 00:12:15.362 } 00:12:15.362 ] 00:12:15.362 }' 00:12:15.362 08:25:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:15.362 08:25:27 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:12:15.931 08:25:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:15.931 [2024-07-23 08:25:28.394212] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:15.931 [2024-07-23 08:25:28.394248] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:15.931 [2024-07-23 08:25:28.396597] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:15.931 [2024-07-23 08:25:28.396644] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:15.931 [2024-07-23 08:25:28.396674] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:15.931 [2024-07-23 08:25:28.396686] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036080 name raid_bdev1, state offline 00:12:15.931 0 00:12:15.931 08:25:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1404969 00:12:15.931 08:25:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1404969 ']' 00:12:15.931 08:25:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1404969 00:12:15.931 08:25:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:12:15.931 08:25:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:15.931 08:25:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1404969 00:12:16.190 08:25:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:16.190 08:25:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:16.190 08:25:28 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 1404969' 00:12:16.190 killing process with pid 1404969 00:12:16.190 08:25:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1404969 00:12:16.190 [2024-07-23 08:25:28.456143] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:16.190 08:25:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1404969 00:12:16.190 [2024-07-23 08:25:28.533266] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:17.582 08:25:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.EEzvjbuqpB 00:12:17.582 08:25:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:17.582 08:25:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:17.582 08:25:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:12:17.582 08:25:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:12:17.582 08:25:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:17.582 08:25:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:17.582 08:25:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:12:17.582 00:12:17.582 real 0m6.253s 00:12:17.582 user 0m8.749s 00:12:17.582 sys 0m0.846s 00:12:17.582 08:25:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:17.582 08:25:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:17.582 ************************************ 00:12:17.582 END TEST raid_write_error_test 00:12:17.582 ************************************ 00:12:17.582 08:25:29 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:17.582 08:25:29 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:12:17.582 
08:25:29 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 2 false 00:12:17.582 08:25:29 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:17.582 08:25:29 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:17.582 08:25:29 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:17.582 ************************************ 00:12:17.582 START TEST raid_state_function_test 00:12:17.582 ************************************ 00:12:17.582 08:25:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 false 00:12:17.582 08:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:12:17.582 08:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:17.582 08:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:12:17.582 08:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:17.582 08:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:17.582 08:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:17.582 08:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:17.582 08:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:17.582 08:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:17.582 08:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:17.582 08:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:17.582 08:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:17.582 08:25:29 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:17.582 08:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:17.582 08:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:17.582 08:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:17.582 08:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:17.582 08:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:17.582 08:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:12:17.582 08:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:17.582 08:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:12:17.582 08:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:12:17.582 08:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:12:17.582 08:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:17.582 08:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1406302 00:12:17.582 08:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1406302' 00:12:17.582 Process raid pid: 1406302 00:12:17.582 08:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1406302 /var/tmp/spdk-raid.sock 00:12:17.582 08:25:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1406302 ']' 00:12:17.582 08:25:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local 
rpc_addr=/var/tmp/spdk-raid.sock 00:12:17.582 08:25:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:17.582 08:25:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:17.582 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:17.582 08:25:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:17.582 08:25:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:17.582 [2024-07-23 08:25:30.049050] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:12:17.582 [2024-07-23 08:25:30.049140] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:17.841 [2024-07-23 08:25:30.175160] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:18.100 [2024-07-23 08:25:30.389093] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:18.359 [2024-07-23 08:25:30.647721] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:18.359 [2024-07-23 08:25:30.647753] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:18.359 08:25:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:18.359 08:25:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:12:18.359 08:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:18.618 [2024-07-23 
08:25:30.997884] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:18.618 [2024-07-23 08:25:30.997932] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:18.618 [2024-07-23 08:25:30.997942] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:18.618 [2024-07-23 08:25:30.997954] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:18.618 08:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:18.618 08:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:18.618 08:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:18.618 08:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:18.618 08:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:18.618 08:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:18.618 08:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:18.618 08:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:18.618 08:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:18.618 08:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:18.618 08:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:18.618 08:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:12:18.877 08:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:18.877 "name": "Existed_Raid", 00:12:18.877 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:18.877 "strip_size_kb": 64, 00:12:18.877 "state": "configuring", 00:12:18.877 "raid_level": "concat", 00:12:18.877 "superblock": false, 00:12:18.877 "num_base_bdevs": 2, 00:12:18.877 "num_base_bdevs_discovered": 0, 00:12:18.877 "num_base_bdevs_operational": 2, 00:12:18.877 "base_bdevs_list": [ 00:12:18.877 { 00:12:18.877 "name": "BaseBdev1", 00:12:18.877 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:18.877 "is_configured": false, 00:12:18.877 "data_offset": 0, 00:12:18.877 "data_size": 0 00:12:18.877 }, 00:12:18.877 { 00:12:18.877 "name": "BaseBdev2", 00:12:18.877 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:18.877 "is_configured": false, 00:12:18.877 "data_offset": 0, 00:12:18.877 "data_size": 0 00:12:18.877 } 00:12:18.877 ] 00:12:18.877 }' 00:12:18.877 08:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:18.877 08:25:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:19.445 08:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:19.445 [2024-07-23 08:25:31.856019] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:19.445 [2024-07-23 08:25:31.856060] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034580 name Existed_Raid, state configuring 00:12:19.445 08:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:19.704 [2024-07-23 08:25:32.028479] bdev.c:8190:bdev_open_ext: *NOTICE*: 
Currently unable to find bdev with name: BaseBdev1 00:12:19.704 [2024-07-23 08:25:32.028531] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:19.704 [2024-07-23 08:25:32.028541] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:19.704 [2024-07-23 08:25:32.028550] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:19.704 08:25:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:19.962 [2024-07-23 08:25:32.224907] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:19.962 BaseBdev1 00:12:19.962 08:25:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:19.962 08:25:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:19.962 08:25:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:19.962 08:25:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:19.962 08:25:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:19.962 08:25:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:19.962 08:25:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:19.962 08:25:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:20.221 [ 00:12:20.221 { 00:12:20.221 "name": "BaseBdev1", 00:12:20.221 "aliases": [ 
00:12:20.221 "6a1f1d6a-35c9-4748-bbb0-745fa822dfc4" 00:12:20.221 ], 00:12:20.221 "product_name": "Malloc disk", 00:12:20.221 "block_size": 512, 00:12:20.221 "num_blocks": 65536, 00:12:20.221 "uuid": "6a1f1d6a-35c9-4748-bbb0-745fa822dfc4", 00:12:20.221 "assigned_rate_limits": { 00:12:20.221 "rw_ios_per_sec": 0, 00:12:20.221 "rw_mbytes_per_sec": 0, 00:12:20.221 "r_mbytes_per_sec": 0, 00:12:20.221 "w_mbytes_per_sec": 0 00:12:20.221 }, 00:12:20.221 "claimed": true, 00:12:20.221 "claim_type": "exclusive_write", 00:12:20.221 "zoned": false, 00:12:20.221 "supported_io_types": { 00:12:20.221 "read": true, 00:12:20.221 "write": true, 00:12:20.221 "unmap": true, 00:12:20.221 "flush": true, 00:12:20.221 "reset": true, 00:12:20.221 "nvme_admin": false, 00:12:20.221 "nvme_io": false, 00:12:20.221 "nvme_io_md": false, 00:12:20.221 "write_zeroes": true, 00:12:20.221 "zcopy": true, 00:12:20.221 "get_zone_info": false, 00:12:20.221 "zone_management": false, 00:12:20.222 "zone_append": false, 00:12:20.222 "compare": false, 00:12:20.222 "compare_and_write": false, 00:12:20.222 "abort": true, 00:12:20.222 "seek_hole": false, 00:12:20.222 "seek_data": false, 00:12:20.222 "copy": true, 00:12:20.222 "nvme_iov_md": false 00:12:20.222 }, 00:12:20.222 "memory_domains": [ 00:12:20.222 { 00:12:20.222 "dma_device_id": "system", 00:12:20.222 "dma_device_type": 1 00:12:20.222 }, 00:12:20.222 { 00:12:20.222 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:20.222 "dma_device_type": 2 00:12:20.222 } 00:12:20.222 ], 00:12:20.222 "driver_specific": {} 00:12:20.222 } 00:12:20.222 ] 00:12:20.222 08:25:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:20.222 08:25:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:20.222 08:25:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:20.222 08:25:32 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:20.222 08:25:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:20.222 08:25:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:20.222 08:25:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:20.222 08:25:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:20.222 08:25:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:20.222 08:25:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:20.222 08:25:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:20.222 08:25:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:20.222 08:25:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:20.480 08:25:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:20.480 "name": "Existed_Raid", 00:12:20.480 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:20.480 "strip_size_kb": 64, 00:12:20.480 "state": "configuring", 00:12:20.480 "raid_level": "concat", 00:12:20.480 "superblock": false, 00:12:20.480 "num_base_bdevs": 2, 00:12:20.480 "num_base_bdevs_discovered": 1, 00:12:20.480 "num_base_bdevs_operational": 2, 00:12:20.480 "base_bdevs_list": [ 00:12:20.480 { 00:12:20.480 "name": "BaseBdev1", 00:12:20.480 "uuid": "6a1f1d6a-35c9-4748-bbb0-745fa822dfc4", 00:12:20.480 "is_configured": true, 00:12:20.480 "data_offset": 0, 00:12:20.480 "data_size": 65536 00:12:20.480 }, 00:12:20.480 { 00:12:20.480 "name": "BaseBdev2", 00:12:20.480 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:12:20.480 "is_configured": false, 00:12:20.480 "data_offset": 0, 00:12:20.480 "data_size": 0 00:12:20.480 } 00:12:20.480 ] 00:12:20.480 }' 00:12:20.480 08:25:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:20.480 08:25:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:21.047 08:25:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:21.047 [2024-07-23 08:25:33.432127] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:21.047 [2024-07-23 08:25:33.432178] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034880 name Existed_Raid, state configuring 00:12:21.047 08:25:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:21.306 [2024-07-23 08:25:33.600594] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:21.306 [2024-07-23 08:25:33.602206] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:21.306 [2024-07-23 08:25:33.602238] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:21.307 08:25:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:21.307 08:25:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:21.307 08:25:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:21.307 08:25:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:21.307 
08:25:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:21.307 08:25:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:21.307 08:25:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:21.307 08:25:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:21.307 08:25:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:21.307 08:25:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:21.307 08:25:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:21.307 08:25:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:21.307 08:25:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:21.307 08:25:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:21.307 08:25:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:21.307 "name": "Existed_Raid", 00:12:21.307 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:21.307 "strip_size_kb": 64, 00:12:21.307 "state": "configuring", 00:12:21.307 "raid_level": "concat", 00:12:21.307 "superblock": false, 00:12:21.307 "num_base_bdevs": 2, 00:12:21.307 "num_base_bdevs_discovered": 1, 00:12:21.307 "num_base_bdevs_operational": 2, 00:12:21.307 "base_bdevs_list": [ 00:12:21.307 { 00:12:21.307 "name": "BaseBdev1", 00:12:21.307 "uuid": "6a1f1d6a-35c9-4748-bbb0-745fa822dfc4", 00:12:21.307 "is_configured": true, 00:12:21.307 "data_offset": 0, 00:12:21.307 "data_size": 65536 00:12:21.307 }, 00:12:21.307 { 00:12:21.307 "name": "BaseBdev2", 
00:12:21.307 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:21.307 "is_configured": false, 00:12:21.307 "data_offset": 0, 00:12:21.307 "data_size": 0 00:12:21.307 } 00:12:21.307 ] 00:12:21.307 }' 00:12:21.307 08:25:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:21.307 08:25:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:21.875 08:25:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:22.134 [2024-07-23 08:25:34.453358] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:22.134 [2024-07-23 08:25:34.453400] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000035180 00:12:22.134 [2024-07-23 08:25:34.453408] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:12:22.134 [2024-07-23 08:25:34.453633] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bdf0 00:12:22.134 [2024-07-23 08:25:34.453799] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000035180 00:12:22.134 [2024-07-23 08:25:34.453810] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000035180 00:12:22.134 [2024-07-23 08:25:34.454059] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:22.134 BaseBdev2 00:12:22.134 08:25:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:22.134 08:25:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:22.134 08:25:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:22.134 08:25:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 
00:12:22.134 08:25:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:22.134 08:25:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:22.134 08:25:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:22.134 08:25:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:22.399 [ 00:12:22.399 { 00:12:22.399 "name": "BaseBdev2", 00:12:22.399 "aliases": [ 00:12:22.399 "6ff36860-dd2a-4424-a9fb-5a6820794c07" 00:12:22.399 ], 00:12:22.399 "product_name": "Malloc disk", 00:12:22.399 "block_size": 512, 00:12:22.399 "num_blocks": 65536, 00:12:22.399 "uuid": "6ff36860-dd2a-4424-a9fb-5a6820794c07", 00:12:22.399 "assigned_rate_limits": { 00:12:22.399 "rw_ios_per_sec": 0, 00:12:22.399 "rw_mbytes_per_sec": 0, 00:12:22.399 "r_mbytes_per_sec": 0, 00:12:22.399 "w_mbytes_per_sec": 0 00:12:22.399 }, 00:12:22.399 "claimed": true, 00:12:22.399 "claim_type": "exclusive_write", 00:12:22.399 "zoned": false, 00:12:22.399 "supported_io_types": { 00:12:22.399 "read": true, 00:12:22.399 "write": true, 00:12:22.399 "unmap": true, 00:12:22.399 "flush": true, 00:12:22.399 "reset": true, 00:12:22.399 "nvme_admin": false, 00:12:22.399 "nvme_io": false, 00:12:22.399 "nvme_io_md": false, 00:12:22.399 "write_zeroes": true, 00:12:22.399 "zcopy": true, 00:12:22.399 "get_zone_info": false, 00:12:22.399 "zone_management": false, 00:12:22.399 "zone_append": false, 00:12:22.399 "compare": false, 00:12:22.399 "compare_and_write": false, 00:12:22.399 "abort": true, 00:12:22.399 "seek_hole": false, 00:12:22.399 "seek_data": false, 00:12:22.399 "copy": true, 00:12:22.399 "nvme_iov_md": false 00:12:22.399 }, 00:12:22.399 
"memory_domains": [ 00:12:22.399 { 00:12:22.399 "dma_device_id": "system", 00:12:22.399 "dma_device_type": 1 00:12:22.399 }, 00:12:22.399 { 00:12:22.399 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:22.399 "dma_device_type": 2 00:12:22.399 } 00:12:22.399 ], 00:12:22.399 "driver_specific": {} 00:12:22.399 } 00:12:22.399 ] 00:12:22.399 08:25:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:22.399 08:25:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:22.399 08:25:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:22.399 08:25:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:12:22.399 08:25:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:22.399 08:25:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:22.399 08:25:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:22.399 08:25:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:22.399 08:25:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:22.399 08:25:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:22.399 08:25:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:22.399 08:25:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:22.399 08:25:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:22.399 08:25:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:12:22.399 08:25:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:22.659 08:25:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:22.659 "name": "Existed_Raid", 00:12:22.659 "uuid": "6a2f9fe8-3a75-403c-835e-a5529b36d2d3", 00:12:22.659 "strip_size_kb": 64, 00:12:22.659 "state": "online", 00:12:22.659 "raid_level": "concat", 00:12:22.659 "superblock": false, 00:12:22.659 "num_base_bdevs": 2, 00:12:22.659 "num_base_bdevs_discovered": 2, 00:12:22.659 "num_base_bdevs_operational": 2, 00:12:22.659 "base_bdevs_list": [ 00:12:22.659 { 00:12:22.659 "name": "BaseBdev1", 00:12:22.659 "uuid": "6a1f1d6a-35c9-4748-bbb0-745fa822dfc4", 00:12:22.659 "is_configured": true, 00:12:22.659 "data_offset": 0, 00:12:22.659 "data_size": 65536 00:12:22.659 }, 00:12:22.659 { 00:12:22.659 "name": "BaseBdev2", 00:12:22.659 "uuid": "6ff36860-dd2a-4424-a9fb-5a6820794c07", 00:12:22.659 "is_configured": true, 00:12:22.659 "data_offset": 0, 00:12:22.659 "data_size": 65536 00:12:22.659 } 00:12:22.659 ] 00:12:22.659 }' 00:12:22.659 08:25:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:22.659 08:25:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:23.227 08:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:23.227 08:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:23.227 08:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:23.227 08:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:23.227 08:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:23.227 08:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:23.227 
08:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:23.227 08:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:23.227 [2024-07-23 08:25:35.624767] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:23.227 08:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:23.227 "name": "Existed_Raid", 00:12:23.227 "aliases": [ 00:12:23.227 "6a2f9fe8-3a75-403c-835e-a5529b36d2d3" 00:12:23.227 ], 00:12:23.227 "product_name": "Raid Volume", 00:12:23.227 "block_size": 512, 00:12:23.227 "num_blocks": 131072, 00:12:23.227 "uuid": "6a2f9fe8-3a75-403c-835e-a5529b36d2d3", 00:12:23.227 "assigned_rate_limits": { 00:12:23.227 "rw_ios_per_sec": 0, 00:12:23.227 "rw_mbytes_per_sec": 0, 00:12:23.227 "r_mbytes_per_sec": 0, 00:12:23.227 "w_mbytes_per_sec": 0 00:12:23.227 }, 00:12:23.227 "claimed": false, 00:12:23.227 "zoned": false, 00:12:23.227 "supported_io_types": { 00:12:23.227 "read": true, 00:12:23.227 "write": true, 00:12:23.227 "unmap": true, 00:12:23.227 "flush": true, 00:12:23.227 "reset": true, 00:12:23.227 "nvme_admin": false, 00:12:23.227 "nvme_io": false, 00:12:23.227 "nvme_io_md": false, 00:12:23.227 "write_zeroes": true, 00:12:23.227 "zcopy": false, 00:12:23.227 "get_zone_info": false, 00:12:23.227 "zone_management": false, 00:12:23.227 "zone_append": false, 00:12:23.227 "compare": false, 00:12:23.227 "compare_and_write": false, 00:12:23.227 "abort": false, 00:12:23.227 "seek_hole": false, 00:12:23.227 "seek_data": false, 00:12:23.227 "copy": false, 00:12:23.227 "nvme_iov_md": false 00:12:23.227 }, 00:12:23.227 "memory_domains": [ 00:12:23.227 { 00:12:23.227 "dma_device_id": "system", 00:12:23.227 "dma_device_type": 1 00:12:23.227 }, 00:12:23.227 { 00:12:23.227 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:23.227 
"dma_device_type": 2 00:12:23.227 }, 00:12:23.227 { 00:12:23.227 "dma_device_id": "system", 00:12:23.227 "dma_device_type": 1 00:12:23.227 }, 00:12:23.227 { 00:12:23.227 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:23.227 "dma_device_type": 2 00:12:23.227 } 00:12:23.227 ], 00:12:23.227 "driver_specific": { 00:12:23.227 "raid": { 00:12:23.227 "uuid": "6a2f9fe8-3a75-403c-835e-a5529b36d2d3", 00:12:23.227 "strip_size_kb": 64, 00:12:23.227 "state": "online", 00:12:23.227 "raid_level": "concat", 00:12:23.227 "superblock": false, 00:12:23.227 "num_base_bdevs": 2, 00:12:23.227 "num_base_bdevs_discovered": 2, 00:12:23.227 "num_base_bdevs_operational": 2, 00:12:23.227 "base_bdevs_list": [ 00:12:23.227 { 00:12:23.227 "name": "BaseBdev1", 00:12:23.227 "uuid": "6a1f1d6a-35c9-4748-bbb0-745fa822dfc4", 00:12:23.227 "is_configured": true, 00:12:23.227 "data_offset": 0, 00:12:23.227 "data_size": 65536 00:12:23.227 }, 00:12:23.227 { 00:12:23.227 "name": "BaseBdev2", 00:12:23.227 "uuid": "6ff36860-dd2a-4424-a9fb-5a6820794c07", 00:12:23.227 "is_configured": true, 00:12:23.227 "data_offset": 0, 00:12:23.227 "data_size": 65536 00:12:23.227 } 00:12:23.227 ] 00:12:23.227 } 00:12:23.227 } 00:12:23.227 }' 00:12:23.227 08:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:23.227 08:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:23.227 BaseBdev2' 00:12:23.227 08:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:23.227 08:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:23.227 08:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:23.485 08:25:35 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:23.485 "name": "BaseBdev1", 00:12:23.485 "aliases": [ 00:12:23.485 "6a1f1d6a-35c9-4748-bbb0-745fa822dfc4" 00:12:23.485 ], 00:12:23.485 "product_name": "Malloc disk", 00:12:23.485 "block_size": 512, 00:12:23.485 "num_blocks": 65536, 00:12:23.485 "uuid": "6a1f1d6a-35c9-4748-bbb0-745fa822dfc4", 00:12:23.485 "assigned_rate_limits": { 00:12:23.486 "rw_ios_per_sec": 0, 00:12:23.486 "rw_mbytes_per_sec": 0, 00:12:23.486 "r_mbytes_per_sec": 0, 00:12:23.486 "w_mbytes_per_sec": 0 00:12:23.486 }, 00:12:23.486 "claimed": true, 00:12:23.486 "claim_type": "exclusive_write", 00:12:23.486 "zoned": false, 00:12:23.486 "supported_io_types": { 00:12:23.486 "read": true, 00:12:23.486 "write": true, 00:12:23.486 "unmap": true, 00:12:23.486 "flush": true, 00:12:23.486 "reset": true, 00:12:23.486 "nvme_admin": false, 00:12:23.486 "nvme_io": false, 00:12:23.486 "nvme_io_md": false, 00:12:23.486 "write_zeroes": true, 00:12:23.486 "zcopy": true, 00:12:23.486 "get_zone_info": false, 00:12:23.486 "zone_management": false, 00:12:23.486 "zone_append": false, 00:12:23.486 "compare": false, 00:12:23.486 "compare_and_write": false, 00:12:23.486 "abort": true, 00:12:23.486 "seek_hole": false, 00:12:23.486 "seek_data": false, 00:12:23.486 "copy": true, 00:12:23.486 "nvme_iov_md": false 00:12:23.486 }, 00:12:23.486 "memory_domains": [ 00:12:23.486 { 00:12:23.486 "dma_device_id": "system", 00:12:23.486 "dma_device_type": 1 00:12:23.486 }, 00:12:23.486 { 00:12:23.486 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:23.486 "dma_device_type": 2 00:12:23.486 } 00:12:23.486 ], 00:12:23.486 "driver_specific": {} 00:12:23.486 }' 00:12:23.486 08:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:23.486 08:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:23.486 08:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:23.486 08:25:35 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:23.486 08:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:23.486 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:23.486 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:23.744 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:23.744 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:23.744 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:23.744 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:23.744 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:23.744 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:23.744 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:23.744 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:24.003 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:24.003 "name": "BaseBdev2", 00:12:24.003 "aliases": [ 00:12:24.003 "6ff36860-dd2a-4424-a9fb-5a6820794c07" 00:12:24.003 ], 00:12:24.003 "product_name": "Malloc disk", 00:12:24.003 "block_size": 512, 00:12:24.003 "num_blocks": 65536, 00:12:24.003 "uuid": "6ff36860-dd2a-4424-a9fb-5a6820794c07", 00:12:24.003 "assigned_rate_limits": { 00:12:24.003 "rw_ios_per_sec": 0, 00:12:24.003 "rw_mbytes_per_sec": 0, 00:12:24.003 "r_mbytes_per_sec": 0, 00:12:24.003 "w_mbytes_per_sec": 0 00:12:24.003 }, 00:12:24.003 "claimed": true, 00:12:24.003 "claim_type": "exclusive_write", 
00:12:24.003 "zoned": false, 00:12:24.003 "supported_io_types": { 00:12:24.003 "read": true, 00:12:24.003 "write": true, 00:12:24.003 "unmap": true, 00:12:24.003 "flush": true, 00:12:24.003 "reset": true, 00:12:24.003 "nvme_admin": false, 00:12:24.003 "nvme_io": false, 00:12:24.003 "nvme_io_md": false, 00:12:24.003 "write_zeroes": true, 00:12:24.003 "zcopy": true, 00:12:24.003 "get_zone_info": false, 00:12:24.003 "zone_management": false, 00:12:24.003 "zone_append": false, 00:12:24.003 "compare": false, 00:12:24.003 "compare_and_write": false, 00:12:24.003 "abort": true, 00:12:24.003 "seek_hole": false, 00:12:24.003 "seek_data": false, 00:12:24.004 "copy": true, 00:12:24.004 "nvme_iov_md": false 00:12:24.004 }, 00:12:24.004 "memory_domains": [ 00:12:24.004 { 00:12:24.004 "dma_device_id": "system", 00:12:24.004 "dma_device_type": 1 00:12:24.004 }, 00:12:24.004 { 00:12:24.004 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:24.004 "dma_device_type": 2 00:12:24.004 } 00:12:24.004 ], 00:12:24.004 "driver_specific": {} 00:12:24.004 }' 00:12:24.004 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:24.004 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:24.004 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:24.004 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:24.004 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:24.004 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:24.004 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:24.261 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:24.261 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:24.261 08:25:36 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:24.261 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:24.261 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:24.261 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:24.520 [2024-07-23 08:25:36.803666] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:24.520 [2024-07-23 08:25:36.803695] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:24.520 [2024-07-23 08:25:36.803745] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:24.520 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:24.520 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:12:24.520 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:24.520 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:24.520 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:24.520 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:12:24.520 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:24.520 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:24.520 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:24.520 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:24.520 08:25:36 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:24.520 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:24.520 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:24.520 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:24.520 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:24.520 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:24.520 08:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:24.520 08:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:24.520 "name": "Existed_Raid", 00:12:24.520 "uuid": "6a2f9fe8-3a75-403c-835e-a5529b36d2d3", 00:12:24.520 "strip_size_kb": 64, 00:12:24.520 "state": "offline", 00:12:24.520 "raid_level": "concat", 00:12:24.520 "superblock": false, 00:12:24.520 "num_base_bdevs": 2, 00:12:24.520 "num_base_bdevs_discovered": 1, 00:12:24.520 "num_base_bdevs_operational": 1, 00:12:24.520 "base_bdevs_list": [ 00:12:24.520 { 00:12:24.520 "name": null, 00:12:24.520 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:24.520 "is_configured": false, 00:12:24.520 "data_offset": 0, 00:12:24.520 "data_size": 65536 00:12:24.520 }, 00:12:24.520 { 00:12:24.520 "name": "BaseBdev2", 00:12:24.520 "uuid": "6ff36860-dd2a-4424-a9fb-5a6820794c07", 00:12:24.520 "is_configured": true, 00:12:24.520 "data_offset": 0, 00:12:24.520 "data_size": 65536 00:12:24.520 } 00:12:24.520 ] 00:12:24.520 }' 00:12:24.520 08:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:24.520 08:25:37 bdev_raid.raid_state_function_test 
-- common/autotest_common.sh@10 -- # set +x 00:12:25.087 08:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:25.087 08:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:25.087 08:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:25.087 08:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:25.345 08:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:25.345 08:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:25.345 08:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:25.345 [2024-07-23 08:25:37.795943] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:25.345 [2024-07-23 08:25:37.796003] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000035180 name Existed_Raid, state offline 00:12:25.604 08:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:25.604 08:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:25.604 08:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:25.604 08:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:25.604 08:25:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:25.604 08:25:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 
00:12:25.604 08:25:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:25.604 08:25:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1406302 00:12:25.604 08:25:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1406302 ']' 00:12:25.605 08:25:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1406302 00:12:25.605 08:25:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:12:25.605 08:25:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:25.605 08:25:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1406302 00:12:25.864 08:25:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:25.864 08:25:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:25.864 08:25:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1406302' 00:12:25.864 killing process with pid 1406302 00:12:25.864 08:25:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1406302 00:12:25.864 [2024-07-23 08:25:38.127548] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:25.864 08:25:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1406302 00:12:25.864 [2024-07-23 08:25:38.145681] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:27.242 08:25:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:12:27.242 00:12:27.242 real 0m9.433s 00:12:27.242 user 0m15.785s 00:12:27.242 sys 0m1.404s 00:12:27.242 08:25:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:27.242 08:25:39 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@10 -- # set +x 00:12:27.242 ************************************ 00:12:27.242 END TEST raid_state_function_test 00:12:27.242 ************************************ 00:12:27.242 08:25:39 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:27.242 08:25:39 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 2 true 00:12:27.242 08:25:39 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:27.242 08:25:39 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:27.242 08:25:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:27.242 ************************************ 00:12:27.242 START TEST raid_state_function_test_sb 00:12:27.242 ************************************ 00:12:27.242 08:25:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 true 00:12:27.242 08:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:12:27.242 08:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:27.242 08:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:12:27.242 08:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:27.242 08:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:27.242 08:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:27.242 08:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:27.242 08:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:27.242 08:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:27.242 08:25:39 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:27.242 08:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:27.242 08:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:27.242 08:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:27.242 08:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:27.242 08:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:27.242 08:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:27.242 08:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:27.242 08:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:27.242 08:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:12:27.242 08:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:27.242 08:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:12:27.242 08:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:12:27.242 08:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:12:27.242 08:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1408286 00:12:27.242 08:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1408286' 00:12:27.242 Process raid pid: 1408286 00:12:27.242 08:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L 
bdev_raid 00:12:27.242 08:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1408286 /var/tmp/spdk-raid.sock 00:12:27.242 08:25:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1408286 ']' 00:12:27.242 08:25:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:27.242 08:25:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:27.243 08:25:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:27.243 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:27.243 08:25:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:27.243 08:25:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:27.243 [2024-07-23 08:25:39.558394] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:12:27.243 [2024-07-23 08:25:39.558478] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:27.243 [2024-07-23 08:25:39.684821] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:27.502 [2024-07-23 08:25:39.898871] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:27.762 [2024-07-23 08:25:40.189259] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:27.762 [2024-07-23 08:25:40.189288] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:28.021 08:25:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:28.021 08:25:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:12:28.021 08:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:28.021 [2024-07-23 08:25:40.477526] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:28.021 [2024-07-23 08:25:40.477569] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:28.021 [2024-07-23 08:25:40.477579] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:28.021 [2024-07-23 08:25:40.477590] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:28.021 08:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:28.021 08:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:12:28.021 08:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:28.021 08:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:28.021 08:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:28.021 08:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:28.021 08:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:28.021 08:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:28.021 08:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:28.021 08:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:28.021 08:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:28.021 08:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:28.280 08:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:28.280 "name": "Existed_Raid", 00:12:28.280 "uuid": "5a5dcce7-7cb8-44e6-b090-a34238a60b97", 00:12:28.280 "strip_size_kb": 64, 00:12:28.280 "state": "configuring", 00:12:28.280 "raid_level": "concat", 00:12:28.280 "superblock": true, 00:12:28.280 "num_base_bdevs": 2, 00:12:28.280 "num_base_bdevs_discovered": 0, 00:12:28.280 "num_base_bdevs_operational": 2, 00:12:28.280 "base_bdevs_list": [ 00:12:28.280 { 00:12:28.280 "name": "BaseBdev1", 00:12:28.280 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:28.280 "is_configured": false, 00:12:28.280 "data_offset": 0, 00:12:28.280 "data_size": 0 00:12:28.280 }, 00:12:28.280 { 
00:12:28.280 "name": "BaseBdev2", 00:12:28.280 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:28.280 "is_configured": false, 00:12:28.280 "data_offset": 0, 00:12:28.280 "data_size": 0 00:12:28.280 } 00:12:28.280 ] 00:12:28.280 }' 00:12:28.280 08:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:28.280 08:25:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:28.847 08:25:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:28.848 [2024-07-23 08:25:41.303571] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:28.848 [2024-07-23 08:25:41.303605] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034580 name Existed_Raid, state configuring 00:12:28.848 08:25:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:29.149 [2024-07-23 08:25:41.472033] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:29.149 [2024-07-23 08:25:41.472076] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:29.149 [2024-07-23 08:25:41.472085] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:29.149 [2024-07-23 08:25:41.472094] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:29.149 08:25:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:29.409 [2024-07-23 08:25:41.679585] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:29.409 BaseBdev1 00:12:29.409 08:25:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:29.409 08:25:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:29.409 08:25:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:29.409 08:25:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:29.409 08:25:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:29.409 08:25:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:29.409 08:25:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:29.409 08:25:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:29.668 [ 00:12:29.668 { 00:12:29.668 "name": "BaseBdev1", 00:12:29.668 "aliases": [ 00:12:29.668 "12c65c3c-d504-47f9-872a-8913f159df23" 00:12:29.668 ], 00:12:29.668 "product_name": "Malloc disk", 00:12:29.668 "block_size": 512, 00:12:29.668 "num_blocks": 65536, 00:12:29.668 "uuid": "12c65c3c-d504-47f9-872a-8913f159df23", 00:12:29.668 "assigned_rate_limits": { 00:12:29.668 "rw_ios_per_sec": 0, 00:12:29.668 "rw_mbytes_per_sec": 0, 00:12:29.668 "r_mbytes_per_sec": 0, 00:12:29.668 "w_mbytes_per_sec": 0 00:12:29.668 }, 00:12:29.668 "claimed": true, 00:12:29.668 "claim_type": "exclusive_write", 00:12:29.668 "zoned": false, 00:12:29.668 "supported_io_types": { 00:12:29.668 "read": true, 00:12:29.668 "write": true, 00:12:29.668 "unmap": true, 00:12:29.668 "flush": 
true, 00:12:29.668 "reset": true, 00:12:29.668 "nvme_admin": false, 00:12:29.668 "nvme_io": false, 00:12:29.668 "nvme_io_md": false, 00:12:29.668 "write_zeroes": true, 00:12:29.668 "zcopy": true, 00:12:29.668 "get_zone_info": false, 00:12:29.668 "zone_management": false, 00:12:29.668 "zone_append": false, 00:12:29.668 "compare": false, 00:12:29.668 "compare_and_write": false, 00:12:29.668 "abort": true, 00:12:29.668 "seek_hole": false, 00:12:29.668 "seek_data": false, 00:12:29.669 "copy": true, 00:12:29.669 "nvme_iov_md": false 00:12:29.669 }, 00:12:29.669 "memory_domains": [ 00:12:29.669 { 00:12:29.669 "dma_device_id": "system", 00:12:29.669 "dma_device_type": 1 00:12:29.669 }, 00:12:29.669 { 00:12:29.669 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:29.669 "dma_device_type": 2 00:12:29.669 } 00:12:29.669 ], 00:12:29.669 "driver_specific": {} 00:12:29.669 } 00:12:29.669 ] 00:12:29.669 08:25:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:29.669 08:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:29.669 08:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:29.669 08:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:29.669 08:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:29.669 08:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:29.669 08:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:29.669 08:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:29.669 08:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:29.669 08:25:42 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:29.669 08:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:29.669 08:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:29.669 08:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:29.669 08:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:29.669 "name": "Existed_Raid", 00:12:29.669 "uuid": "49f9e161-8c6c-40aa-89db-ba71adcb97d3", 00:12:29.669 "strip_size_kb": 64, 00:12:29.669 "state": "configuring", 00:12:29.669 "raid_level": "concat", 00:12:29.669 "superblock": true, 00:12:29.669 "num_base_bdevs": 2, 00:12:29.669 "num_base_bdevs_discovered": 1, 00:12:29.669 "num_base_bdevs_operational": 2, 00:12:29.669 "base_bdevs_list": [ 00:12:29.669 { 00:12:29.669 "name": "BaseBdev1", 00:12:29.669 "uuid": "12c65c3c-d504-47f9-872a-8913f159df23", 00:12:29.669 "is_configured": true, 00:12:29.669 "data_offset": 2048, 00:12:29.669 "data_size": 63488 00:12:29.669 }, 00:12:29.669 { 00:12:29.669 "name": "BaseBdev2", 00:12:29.669 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:29.669 "is_configured": false, 00:12:29.669 "data_offset": 0, 00:12:29.669 "data_size": 0 00:12:29.669 } 00:12:29.669 ] 00:12:29.669 }' 00:12:29.669 08:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:29.669 08:25:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:30.236 08:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:30.495 [2024-07-23 08:25:42.822673] 
bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:30.495 [2024-07-23 08:25:42.822722] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034880 name Existed_Raid, state configuring 00:12:30.495 08:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:30.495 [2024-07-23 08:25:42.991124] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:30.495 [2024-07-23 08:25:42.992707] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:30.495 [2024-07-23 08:25:42.992742] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:30.495 08:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:30.495 08:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:30.495 08:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:30.495 08:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:30.495 08:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:30.495 08:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:30.495 08:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:30.495 08:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:30.495 08:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:30.495 08:25:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:30.495 08:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:30.495 08:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:30.495 08:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:30.495 08:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:30.755 08:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:30.755 "name": "Existed_Raid", 00:12:30.755 "uuid": "27a83c72-617b-48e9-83b0-bfc302873c6b", 00:12:30.755 "strip_size_kb": 64, 00:12:30.755 "state": "configuring", 00:12:30.755 "raid_level": "concat", 00:12:30.755 "superblock": true, 00:12:30.755 "num_base_bdevs": 2, 00:12:30.755 "num_base_bdevs_discovered": 1, 00:12:30.755 "num_base_bdevs_operational": 2, 00:12:30.755 "base_bdevs_list": [ 00:12:30.755 { 00:12:30.755 "name": "BaseBdev1", 00:12:30.755 "uuid": "12c65c3c-d504-47f9-872a-8913f159df23", 00:12:30.755 "is_configured": true, 00:12:30.755 "data_offset": 2048, 00:12:30.755 "data_size": 63488 00:12:30.755 }, 00:12:30.755 { 00:12:30.755 "name": "BaseBdev2", 00:12:30.755 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:30.755 "is_configured": false, 00:12:30.755 "data_offset": 0, 00:12:30.755 "data_size": 0 00:12:30.755 } 00:12:30.755 ] 00:12:30.755 }' 00:12:30.755 08:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:30.755 08:25:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:31.321 08:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:31.580 [2024-07-23 08:25:43.864836] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:31.580 [2024-07-23 08:25:43.865049] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000035180 00:12:31.580 [2024-07-23 08:25:43.865063] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:31.580 [2024-07-23 08:25:43.865298] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bdf0 00:12:31.580 [2024-07-23 08:25:43.865462] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000035180 00:12:31.580 [2024-07-23 08:25:43.865474] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000035180 00:12:31.580 [2024-07-23 08:25:43.865628] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:31.580 BaseBdev2 00:12:31.580 08:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:31.580 08:25:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:31.580 08:25:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:31.580 08:25:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:31.580 08:25:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:31.580 08:25:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:31.580 08:25:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:31.580 08:25:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:31.839 [ 00:12:31.839 { 00:12:31.839 "name": "BaseBdev2", 00:12:31.839 "aliases": [ 00:12:31.839 "1c91bb25-5417-4ab3-aafe-a885191cdefe" 00:12:31.839 ], 00:12:31.839 "product_name": "Malloc disk", 00:12:31.839 "block_size": 512, 00:12:31.839 "num_blocks": 65536, 00:12:31.839 "uuid": "1c91bb25-5417-4ab3-aafe-a885191cdefe", 00:12:31.839 "assigned_rate_limits": { 00:12:31.839 "rw_ios_per_sec": 0, 00:12:31.839 "rw_mbytes_per_sec": 0, 00:12:31.839 "r_mbytes_per_sec": 0, 00:12:31.839 "w_mbytes_per_sec": 0 00:12:31.839 }, 00:12:31.839 "claimed": true, 00:12:31.839 "claim_type": "exclusive_write", 00:12:31.839 "zoned": false, 00:12:31.839 "supported_io_types": { 00:12:31.839 "read": true, 00:12:31.839 "write": true, 00:12:31.839 "unmap": true, 00:12:31.839 "flush": true, 00:12:31.839 "reset": true, 00:12:31.839 "nvme_admin": false, 00:12:31.839 "nvme_io": false, 00:12:31.839 "nvme_io_md": false, 00:12:31.839 "write_zeroes": true, 00:12:31.839 "zcopy": true, 00:12:31.839 "get_zone_info": false, 00:12:31.839 "zone_management": false, 00:12:31.839 "zone_append": false, 00:12:31.839 "compare": false, 00:12:31.839 "compare_and_write": false, 00:12:31.839 "abort": true, 00:12:31.839 "seek_hole": false, 00:12:31.839 "seek_data": false, 00:12:31.839 "copy": true, 00:12:31.839 "nvme_iov_md": false 00:12:31.839 }, 00:12:31.839 "memory_domains": [ 00:12:31.839 { 00:12:31.839 "dma_device_id": "system", 00:12:31.839 "dma_device_type": 1 00:12:31.839 }, 00:12:31.839 { 00:12:31.839 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:31.839 "dma_device_type": 2 00:12:31.839 } 00:12:31.839 ], 00:12:31.839 "driver_specific": {} 00:12:31.839 } 00:12:31.839 ] 00:12:31.839 08:25:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:31.839 08:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 
00:12:31.839 08:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:31.839 08:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:12:31.839 08:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:31.839 08:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:31.839 08:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:31.839 08:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:31.839 08:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:31.839 08:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:31.839 08:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:31.839 08:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:31.839 08:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:31.839 08:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:31.839 08:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:32.098 08:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:32.098 "name": "Existed_Raid", 00:12:32.098 "uuid": "27a83c72-617b-48e9-83b0-bfc302873c6b", 00:12:32.098 "strip_size_kb": 64, 00:12:32.098 "state": "online", 00:12:32.098 "raid_level": "concat", 00:12:32.098 "superblock": true, 00:12:32.098 
"num_base_bdevs": 2, 00:12:32.098 "num_base_bdevs_discovered": 2, 00:12:32.098 "num_base_bdevs_operational": 2, 00:12:32.098 "base_bdevs_list": [ 00:12:32.098 { 00:12:32.098 "name": "BaseBdev1", 00:12:32.098 "uuid": "12c65c3c-d504-47f9-872a-8913f159df23", 00:12:32.098 "is_configured": true, 00:12:32.098 "data_offset": 2048, 00:12:32.098 "data_size": 63488 00:12:32.098 }, 00:12:32.098 { 00:12:32.098 "name": "BaseBdev2", 00:12:32.098 "uuid": "1c91bb25-5417-4ab3-aafe-a885191cdefe", 00:12:32.098 "is_configured": true, 00:12:32.098 "data_offset": 2048, 00:12:32.098 "data_size": 63488 00:12:32.098 } 00:12:32.098 ] 00:12:32.098 }' 00:12:32.098 08:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:32.098 08:25:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:32.357 08:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:32.357 08:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:32.357 08:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:32.357 08:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:32.357 08:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:32.357 08:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:12:32.357 08:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:32.357 08:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:32.617 [2024-07-23 08:25:44.980078] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:32.617 08:25:44 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:32.617 "name": "Existed_Raid", 00:12:32.617 "aliases": [ 00:12:32.617 "27a83c72-617b-48e9-83b0-bfc302873c6b" 00:12:32.617 ], 00:12:32.617 "product_name": "Raid Volume", 00:12:32.617 "block_size": 512, 00:12:32.617 "num_blocks": 126976, 00:12:32.617 "uuid": "27a83c72-617b-48e9-83b0-bfc302873c6b", 00:12:32.617 "assigned_rate_limits": { 00:12:32.617 "rw_ios_per_sec": 0, 00:12:32.617 "rw_mbytes_per_sec": 0, 00:12:32.617 "r_mbytes_per_sec": 0, 00:12:32.617 "w_mbytes_per_sec": 0 00:12:32.617 }, 00:12:32.617 "claimed": false, 00:12:32.617 "zoned": false, 00:12:32.617 "supported_io_types": { 00:12:32.617 "read": true, 00:12:32.617 "write": true, 00:12:32.617 "unmap": true, 00:12:32.617 "flush": true, 00:12:32.617 "reset": true, 00:12:32.617 "nvme_admin": false, 00:12:32.617 "nvme_io": false, 00:12:32.617 "nvme_io_md": false, 00:12:32.617 "write_zeroes": true, 00:12:32.617 "zcopy": false, 00:12:32.617 "get_zone_info": false, 00:12:32.617 "zone_management": false, 00:12:32.617 "zone_append": false, 00:12:32.617 "compare": false, 00:12:32.617 "compare_and_write": false, 00:12:32.617 "abort": false, 00:12:32.617 "seek_hole": false, 00:12:32.617 "seek_data": false, 00:12:32.617 "copy": false, 00:12:32.617 "nvme_iov_md": false 00:12:32.617 }, 00:12:32.617 "memory_domains": [ 00:12:32.617 { 00:12:32.617 "dma_device_id": "system", 00:12:32.617 "dma_device_type": 1 00:12:32.617 }, 00:12:32.617 { 00:12:32.617 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:32.617 "dma_device_type": 2 00:12:32.617 }, 00:12:32.617 { 00:12:32.617 "dma_device_id": "system", 00:12:32.617 "dma_device_type": 1 00:12:32.617 }, 00:12:32.617 { 00:12:32.617 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:32.617 "dma_device_type": 2 00:12:32.617 } 00:12:32.617 ], 00:12:32.617 "driver_specific": { 00:12:32.617 "raid": { 00:12:32.617 "uuid": "27a83c72-617b-48e9-83b0-bfc302873c6b", 00:12:32.617 "strip_size_kb": 64, 
00:12:32.617 "state": "online", 00:12:32.617 "raid_level": "concat", 00:12:32.617 "superblock": true, 00:12:32.617 "num_base_bdevs": 2, 00:12:32.617 "num_base_bdevs_discovered": 2, 00:12:32.617 "num_base_bdevs_operational": 2, 00:12:32.617 "base_bdevs_list": [ 00:12:32.617 { 00:12:32.617 "name": "BaseBdev1", 00:12:32.617 "uuid": "12c65c3c-d504-47f9-872a-8913f159df23", 00:12:32.617 "is_configured": true, 00:12:32.617 "data_offset": 2048, 00:12:32.617 "data_size": 63488 00:12:32.617 }, 00:12:32.617 { 00:12:32.617 "name": "BaseBdev2", 00:12:32.617 "uuid": "1c91bb25-5417-4ab3-aafe-a885191cdefe", 00:12:32.617 "is_configured": true, 00:12:32.617 "data_offset": 2048, 00:12:32.617 "data_size": 63488 00:12:32.617 } 00:12:32.617 ] 00:12:32.617 } 00:12:32.617 } 00:12:32.617 }' 00:12:32.617 08:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:32.617 08:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:32.617 BaseBdev2' 00:12:32.617 08:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:32.617 08:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:32.617 08:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:32.877 08:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:32.877 "name": "BaseBdev1", 00:12:32.877 "aliases": [ 00:12:32.877 "12c65c3c-d504-47f9-872a-8913f159df23" 00:12:32.877 ], 00:12:32.877 "product_name": "Malloc disk", 00:12:32.877 "block_size": 512, 00:12:32.877 "num_blocks": 65536, 00:12:32.877 "uuid": "12c65c3c-d504-47f9-872a-8913f159df23", 00:12:32.877 "assigned_rate_limits": { 00:12:32.877 "rw_ios_per_sec": 0, 
00:12:32.877 "rw_mbytes_per_sec": 0, 00:12:32.877 "r_mbytes_per_sec": 0, 00:12:32.877 "w_mbytes_per_sec": 0 00:12:32.877 }, 00:12:32.877 "claimed": true, 00:12:32.877 "claim_type": "exclusive_write", 00:12:32.877 "zoned": false, 00:12:32.877 "supported_io_types": { 00:12:32.877 "read": true, 00:12:32.877 "write": true, 00:12:32.877 "unmap": true, 00:12:32.877 "flush": true, 00:12:32.877 "reset": true, 00:12:32.877 "nvme_admin": false, 00:12:32.877 "nvme_io": false, 00:12:32.877 "nvme_io_md": false, 00:12:32.877 "write_zeroes": true, 00:12:32.877 "zcopy": true, 00:12:32.877 "get_zone_info": false, 00:12:32.877 "zone_management": false, 00:12:32.877 "zone_append": false, 00:12:32.877 "compare": false, 00:12:32.877 "compare_and_write": false, 00:12:32.877 "abort": true, 00:12:32.877 "seek_hole": false, 00:12:32.877 "seek_data": false, 00:12:32.877 "copy": true, 00:12:32.877 "nvme_iov_md": false 00:12:32.877 }, 00:12:32.877 "memory_domains": [ 00:12:32.877 { 00:12:32.877 "dma_device_id": "system", 00:12:32.877 "dma_device_type": 1 00:12:32.877 }, 00:12:32.877 { 00:12:32.877 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:32.877 "dma_device_type": 2 00:12:32.877 } 00:12:32.877 ], 00:12:32.877 "driver_specific": {} 00:12:32.877 }' 00:12:32.877 08:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:32.877 08:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:32.877 08:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:32.877 08:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:32.877 08:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:32.877 08:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:32.877 08:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:32.877 
08:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:33.136 08:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:33.136 08:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:33.136 08:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:33.136 08:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:33.136 08:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:33.136 08:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:33.136 08:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:33.396 08:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:33.396 "name": "BaseBdev2", 00:12:33.396 "aliases": [ 00:12:33.396 "1c91bb25-5417-4ab3-aafe-a885191cdefe" 00:12:33.396 ], 00:12:33.396 "product_name": "Malloc disk", 00:12:33.396 "block_size": 512, 00:12:33.396 "num_blocks": 65536, 00:12:33.396 "uuid": "1c91bb25-5417-4ab3-aafe-a885191cdefe", 00:12:33.396 "assigned_rate_limits": { 00:12:33.396 "rw_ios_per_sec": 0, 00:12:33.396 "rw_mbytes_per_sec": 0, 00:12:33.396 "r_mbytes_per_sec": 0, 00:12:33.396 "w_mbytes_per_sec": 0 00:12:33.396 }, 00:12:33.396 "claimed": true, 00:12:33.396 "claim_type": "exclusive_write", 00:12:33.396 "zoned": false, 00:12:33.396 "supported_io_types": { 00:12:33.396 "read": true, 00:12:33.396 "write": true, 00:12:33.396 "unmap": true, 00:12:33.396 "flush": true, 00:12:33.396 "reset": true, 00:12:33.396 "nvme_admin": false, 00:12:33.396 "nvme_io": false, 00:12:33.396 "nvme_io_md": false, 00:12:33.396 "write_zeroes": true, 00:12:33.396 "zcopy": true, 
00:12:33.396 "get_zone_info": false, 00:12:33.396 "zone_management": false, 00:12:33.396 "zone_append": false, 00:12:33.396 "compare": false, 00:12:33.396 "compare_and_write": false, 00:12:33.396 "abort": true, 00:12:33.396 "seek_hole": false, 00:12:33.396 "seek_data": false, 00:12:33.396 "copy": true, 00:12:33.396 "nvme_iov_md": false 00:12:33.396 }, 00:12:33.396 "memory_domains": [ 00:12:33.396 { 00:12:33.396 "dma_device_id": "system", 00:12:33.396 "dma_device_type": 1 00:12:33.396 }, 00:12:33.396 { 00:12:33.396 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:33.396 "dma_device_type": 2 00:12:33.396 } 00:12:33.396 ], 00:12:33.396 "driver_specific": {} 00:12:33.396 }' 00:12:33.396 08:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:33.396 08:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:33.396 08:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:33.396 08:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:33.396 08:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:33.396 08:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:33.396 08:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:33.396 08:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:33.396 08:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:33.396 08:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:33.655 08:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:33.655 08:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:33.655 08:25:45 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:33.655 [2024-07-23 08:25:46.122896] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:33.655 [2024-07-23 08:25:46.122924] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:33.655 [2024-07-23 08:25:46.122973] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:33.655 08:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:33.655 08:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:12:33.655 08:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:33.655 08:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:12:33.655 08:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:33.655 08:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:12:33.655 08:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:33.655 08:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:33.655 08:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:33.655 08:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:33.655 08:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:33.655 08:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:33.655 08:25:46 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:33.655 08:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:33.914 08:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:33.914 08:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:33.914 08:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:33.914 08:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:33.914 "name": "Existed_Raid", 00:12:33.914 "uuid": "27a83c72-617b-48e9-83b0-bfc302873c6b", 00:12:33.914 "strip_size_kb": 64, 00:12:33.914 "state": "offline", 00:12:33.914 "raid_level": "concat", 00:12:33.914 "superblock": true, 00:12:33.914 "num_base_bdevs": 2, 00:12:33.914 "num_base_bdevs_discovered": 1, 00:12:33.914 "num_base_bdevs_operational": 1, 00:12:33.914 "base_bdevs_list": [ 00:12:33.914 { 00:12:33.914 "name": null, 00:12:33.914 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:33.914 "is_configured": false, 00:12:33.914 "data_offset": 2048, 00:12:33.914 "data_size": 63488 00:12:33.914 }, 00:12:33.914 { 00:12:33.914 "name": "BaseBdev2", 00:12:33.914 "uuid": "1c91bb25-5417-4ab3-aafe-a885191cdefe", 00:12:33.914 "is_configured": true, 00:12:33.914 "data_offset": 2048, 00:12:33.914 "data_size": 63488 00:12:33.914 } 00:12:33.914 ] 00:12:33.914 }' 00:12:33.914 08:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:33.914 08:25:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:34.481 08:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:34.481 08:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < 
num_base_bdevs )) 00:12:34.481 08:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:34.481 08:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:34.740 08:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:34.740 08:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:34.740 08:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:34.740 [2024-07-23 08:25:47.171228] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:34.740 [2024-07-23 08:25:47.171280] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000035180 name Existed_Raid, state offline 00:12:35.000 08:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:35.000 08:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:35.000 08:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:35.000 08:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:35.000 08:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:35.000 08:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:35.000 08:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:35.000 08:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 
1408286 00:12:35.000 08:25:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1408286 ']' 00:12:35.000 08:25:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1408286 00:12:35.000 08:25:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:12:35.000 08:25:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:35.000 08:25:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1408286 00:12:35.000 08:25:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:35.000 08:25:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:35.000 08:25:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1408286' 00:12:35.000 killing process with pid 1408286 00:12:35.000 08:25:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1408286 00:12:35.000 [2024-07-23 08:25:47.506697] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:35.000 08:25:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1408286 00:12:35.259 [2024-07-23 08:25:47.525210] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:36.636 08:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:12:36.636 00:12:36.636 real 0m9.327s 00:12:36.636 user 0m15.556s 00:12:36.636 sys 0m1.408s 00:12:36.636 08:25:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:36.636 08:25:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:36.636 ************************************ 00:12:36.636 END TEST raid_state_function_test_sb 00:12:36.636 
************************************ 00:12:36.636 08:25:48 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:36.636 08:25:48 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 2 00:12:36.636 08:25:48 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:12:36.636 08:25:48 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:36.636 08:25:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:36.636 ************************************ 00:12:36.636 START TEST raid_superblock_test 00:12:36.636 ************************************ 00:12:36.636 08:25:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 2 00:12:36.636 08:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:12:36.636 08:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:12:36.636 08:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:12:36.636 08:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:12:36.636 08:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:12:36.636 08:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:12:36.636 08:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:12:36.636 08:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:12:36.636 08:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:12:36.636 08:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:12:36.636 08:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:12:36.636 08:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 
-- # local raid_bdev_uuid 00:12:36.636 08:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:12:36.636 08:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:12:36.636 08:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:12:36.636 08:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:12:36.636 08:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1410201 00:12:36.636 08:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:12:36.636 08:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1410201 /var/tmp/spdk-raid.sock 00:12:36.636 08:25:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1410201 ']' 00:12:36.636 08:25:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:36.636 08:25:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:36.636 08:25:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:36.636 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:36.636 08:25:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:36.636 08:25:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:36.636 [2024-07-23 08:25:48.944500] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:12:36.636 [2024-07-23 08:25:48.944602] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1410201 ] 00:12:36.636 [2024-07-23 08:25:49.069303] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:36.895 [2024-07-23 08:25:49.277965] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:37.153 [2024-07-23 08:25:49.522114] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:37.153 [2024-07-23 08:25:49.522145] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:37.412 08:25:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:37.412 08:25:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:12:37.412 08:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:12:37.412 08:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:37.412 08:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:12:37.412 08:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:12:37.412 08:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:12:37.412 08:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:37.412 08:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:37.412 08:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:37.412 08:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:12:37.412 malloc1 00:12:37.671 08:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:37.671 [2024-07-23 08:25:50.092655] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:37.671 [2024-07-23 08:25:50.092716] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:37.671 [2024-07-23 08:25:50.092755] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034880 00:12:37.671 [2024-07-23 08:25:50.092768] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:37.671 [2024-07-23 08:25:50.094775] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:37.671 [2024-07-23 08:25:50.094804] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:37.671 pt1 00:12:37.671 08:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:37.671 08:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:37.671 08:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:12:37.671 08:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:12:37.671 08:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:12:37.671 08:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:37.671 08:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:37.671 08:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:37.671 08:25:50 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:12:37.930 malloc2 00:12:37.930 08:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:38.189 [2024-07-23 08:25:50.491141] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:38.189 [2024-07-23 08:25:50.491196] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:38.189 [2024-07-23 08:25:50.491232] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035480 00:12:38.189 [2024-07-23 08:25:50.491242] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:38.189 [2024-07-23 08:25:50.493237] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:38.189 [2024-07-23 08:25:50.493260] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:38.189 pt2 00:12:38.189 08:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:38.189 08:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:38.189 08:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2' -n raid_bdev1 -s 00:12:38.189 [2024-07-23 08:25:50.647562] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:38.190 [2024-07-23 08:25:50.649123] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:38.190 [2024-07-23 08:25:50.649290] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 
0x616000035a80 00:12:38.190 [2024-07-23 08:25:50.649303] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:38.190 [2024-07-23 08:25:50.649562] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bdf0 00:12:38.190 [2024-07-23 08:25:50.649748] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000035a80 00:12:38.190 [2024-07-23 08:25:50.649764] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000035a80 00:12:38.190 [2024-07-23 08:25:50.649918] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:38.190 08:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:38.190 08:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:38.190 08:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:38.190 08:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:38.190 08:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:38.190 08:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:38.190 08:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:38.190 08:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:38.190 08:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:38.190 08:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:38.190 08:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:38.190 08:25:50 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:38.449 08:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:38.449 "name": "raid_bdev1", 00:12:38.449 "uuid": "fca2885c-3bdb-4ef1-9bc1-947604e29562", 00:12:38.449 "strip_size_kb": 64, 00:12:38.449 "state": "online", 00:12:38.449 "raid_level": "concat", 00:12:38.449 "superblock": true, 00:12:38.449 "num_base_bdevs": 2, 00:12:38.449 "num_base_bdevs_discovered": 2, 00:12:38.449 "num_base_bdevs_operational": 2, 00:12:38.449 "base_bdevs_list": [ 00:12:38.449 { 00:12:38.449 "name": "pt1", 00:12:38.449 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:38.449 "is_configured": true, 00:12:38.449 "data_offset": 2048, 00:12:38.449 "data_size": 63488 00:12:38.449 }, 00:12:38.449 { 00:12:38.449 "name": "pt2", 00:12:38.449 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:38.449 "is_configured": true, 00:12:38.449 "data_offset": 2048, 00:12:38.449 "data_size": 63488 00:12:38.449 } 00:12:38.449 ] 00:12:38.449 }' 00:12:38.449 08:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:38.449 08:25:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:39.016 08:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:12:39.016 08:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:39.016 08:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:39.016 08:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:39.016 08:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:39.016 08:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:39.016 08:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:39.016 08:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:39.016 [2024-07-23 08:25:51.482004] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:39.016 08:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:39.016 "name": "raid_bdev1", 00:12:39.016 "aliases": [ 00:12:39.016 "fca2885c-3bdb-4ef1-9bc1-947604e29562" 00:12:39.016 ], 00:12:39.016 "product_name": "Raid Volume", 00:12:39.016 "block_size": 512, 00:12:39.016 "num_blocks": 126976, 00:12:39.016 "uuid": "fca2885c-3bdb-4ef1-9bc1-947604e29562", 00:12:39.016 "assigned_rate_limits": { 00:12:39.016 "rw_ios_per_sec": 0, 00:12:39.016 "rw_mbytes_per_sec": 0, 00:12:39.016 "r_mbytes_per_sec": 0, 00:12:39.016 "w_mbytes_per_sec": 0 00:12:39.016 }, 00:12:39.016 "claimed": false, 00:12:39.016 "zoned": false, 00:12:39.016 "supported_io_types": { 00:12:39.016 "read": true, 00:12:39.016 "write": true, 00:12:39.016 "unmap": true, 00:12:39.016 "flush": true, 00:12:39.016 "reset": true, 00:12:39.016 "nvme_admin": false, 00:12:39.016 "nvme_io": false, 00:12:39.016 "nvme_io_md": false, 00:12:39.016 "write_zeroes": true, 00:12:39.016 "zcopy": false, 00:12:39.016 "get_zone_info": false, 00:12:39.016 "zone_management": false, 00:12:39.016 "zone_append": false, 00:12:39.016 "compare": false, 00:12:39.016 "compare_and_write": false, 00:12:39.016 "abort": false, 00:12:39.016 "seek_hole": false, 00:12:39.016 "seek_data": false, 00:12:39.016 "copy": false, 00:12:39.016 "nvme_iov_md": false 00:12:39.016 }, 00:12:39.016 "memory_domains": [ 00:12:39.016 { 00:12:39.016 "dma_device_id": "system", 00:12:39.016 "dma_device_type": 1 00:12:39.016 }, 00:12:39.016 { 00:12:39.016 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:39.016 "dma_device_type": 2 00:12:39.016 }, 00:12:39.016 { 00:12:39.016 "dma_device_id": "system", 
00:12:39.016 "dma_device_type": 1 00:12:39.016 }, 00:12:39.016 { 00:12:39.016 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:39.016 "dma_device_type": 2 00:12:39.016 } 00:12:39.016 ], 00:12:39.016 "driver_specific": { 00:12:39.016 "raid": { 00:12:39.016 "uuid": "fca2885c-3bdb-4ef1-9bc1-947604e29562", 00:12:39.016 "strip_size_kb": 64, 00:12:39.016 "state": "online", 00:12:39.016 "raid_level": "concat", 00:12:39.016 "superblock": true, 00:12:39.016 "num_base_bdevs": 2, 00:12:39.016 "num_base_bdevs_discovered": 2, 00:12:39.016 "num_base_bdevs_operational": 2, 00:12:39.016 "base_bdevs_list": [ 00:12:39.016 { 00:12:39.016 "name": "pt1", 00:12:39.016 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:39.016 "is_configured": true, 00:12:39.016 "data_offset": 2048, 00:12:39.016 "data_size": 63488 00:12:39.016 }, 00:12:39.016 { 00:12:39.016 "name": "pt2", 00:12:39.016 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:39.016 "is_configured": true, 00:12:39.016 "data_offset": 2048, 00:12:39.016 "data_size": 63488 00:12:39.016 } 00:12:39.016 ] 00:12:39.016 } 00:12:39.016 } 00:12:39.016 }' 00:12:39.017 08:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:39.276 08:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:39.276 pt2' 00:12:39.276 08:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:39.276 08:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:39.276 08:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:39.276 08:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:39.276 "name": "pt1", 00:12:39.276 "aliases": [ 00:12:39.276 "00000000-0000-0000-0000-000000000001" 
00:12:39.276 ], 00:12:39.276 "product_name": "passthru", 00:12:39.276 "block_size": 512, 00:12:39.276 "num_blocks": 65536, 00:12:39.276 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:39.276 "assigned_rate_limits": { 00:12:39.276 "rw_ios_per_sec": 0, 00:12:39.276 "rw_mbytes_per_sec": 0, 00:12:39.276 "r_mbytes_per_sec": 0, 00:12:39.276 "w_mbytes_per_sec": 0 00:12:39.276 }, 00:12:39.276 "claimed": true, 00:12:39.276 "claim_type": "exclusive_write", 00:12:39.276 "zoned": false, 00:12:39.276 "supported_io_types": { 00:12:39.276 "read": true, 00:12:39.276 "write": true, 00:12:39.276 "unmap": true, 00:12:39.276 "flush": true, 00:12:39.276 "reset": true, 00:12:39.276 "nvme_admin": false, 00:12:39.276 "nvme_io": false, 00:12:39.276 "nvme_io_md": false, 00:12:39.276 "write_zeroes": true, 00:12:39.276 "zcopy": true, 00:12:39.276 "get_zone_info": false, 00:12:39.276 "zone_management": false, 00:12:39.276 "zone_append": false, 00:12:39.276 "compare": false, 00:12:39.276 "compare_and_write": false, 00:12:39.276 "abort": true, 00:12:39.276 "seek_hole": false, 00:12:39.276 "seek_data": false, 00:12:39.276 "copy": true, 00:12:39.276 "nvme_iov_md": false 00:12:39.276 }, 00:12:39.276 "memory_domains": [ 00:12:39.276 { 00:12:39.276 "dma_device_id": "system", 00:12:39.276 "dma_device_type": 1 00:12:39.276 }, 00:12:39.276 { 00:12:39.276 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:39.276 "dma_device_type": 2 00:12:39.276 } 00:12:39.276 ], 00:12:39.276 "driver_specific": { 00:12:39.276 "passthru": { 00:12:39.276 "name": "pt1", 00:12:39.276 "base_bdev_name": "malloc1" 00:12:39.276 } 00:12:39.276 } 00:12:39.276 }' 00:12:39.276 08:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:39.276 08:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:39.535 08:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:39.535 08:25:51 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:39.535 08:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:39.535 08:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:39.535 08:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:39.535 08:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:39.535 08:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:39.535 08:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:39.535 08:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:39.535 08:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:39.535 08:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:39.535 08:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:39.535 08:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:39.793 08:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:39.793 "name": "pt2", 00:12:39.793 "aliases": [ 00:12:39.793 "00000000-0000-0000-0000-000000000002" 00:12:39.793 ], 00:12:39.793 "product_name": "passthru", 00:12:39.793 "block_size": 512, 00:12:39.793 "num_blocks": 65536, 00:12:39.793 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:39.793 "assigned_rate_limits": { 00:12:39.793 "rw_ios_per_sec": 0, 00:12:39.793 "rw_mbytes_per_sec": 0, 00:12:39.793 "r_mbytes_per_sec": 0, 00:12:39.793 "w_mbytes_per_sec": 0 00:12:39.793 }, 00:12:39.793 "claimed": true, 00:12:39.793 "claim_type": "exclusive_write", 00:12:39.793 "zoned": false, 00:12:39.793 "supported_io_types": { 00:12:39.793 "read": true, 
00:12:39.793 "write": true, 00:12:39.793 "unmap": true, 00:12:39.793 "flush": true, 00:12:39.793 "reset": true, 00:12:39.793 "nvme_admin": false, 00:12:39.793 "nvme_io": false, 00:12:39.793 "nvme_io_md": false, 00:12:39.793 "write_zeroes": true, 00:12:39.793 "zcopy": true, 00:12:39.793 "get_zone_info": false, 00:12:39.793 "zone_management": false, 00:12:39.793 "zone_append": false, 00:12:39.793 "compare": false, 00:12:39.793 "compare_and_write": false, 00:12:39.793 "abort": true, 00:12:39.793 "seek_hole": false, 00:12:39.793 "seek_data": false, 00:12:39.793 "copy": true, 00:12:39.793 "nvme_iov_md": false 00:12:39.793 }, 00:12:39.793 "memory_domains": [ 00:12:39.793 { 00:12:39.793 "dma_device_id": "system", 00:12:39.793 "dma_device_type": 1 00:12:39.793 }, 00:12:39.793 { 00:12:39.793 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:39.793 "dma_device_type": 2 00:12:39.793 } 00:12:39.793 ], 00:12:39.793 "driver_specific": { 00:12:39.793 "passthru": { 00:12:39.793 "name": "pt2", 00:12:39.793 "base_bdev_name": "malloc2" 00:12:39.793 } 00:12:39.793 } 00:12:39.793 }' 00:12:39.793 08:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:39.793 08:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:40.052 08:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:40.052 08:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:40.052 08:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:40.052 08:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:40.052 08:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:40.052 08:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:40.052 08:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:40.052 08:25:52 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:40.052 08:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:40.052 08:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:40.052 08:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:40.052 08:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:12:40.311 [2024-07-23 08:25:52.681191] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:40.311 08:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=fca2885c-3bdb-4ef1-9bc1-947604e29562 00:12:40.311 08:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z fca2885c-3bdb-4ef1-9bc1-947604e29562 ']' 00:12:40.311 08:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:40.569 [2024-07-23 08:25:52.869475] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:40.569 [2024-07-23 08:25:52.869502] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:40.569 [2024-07-23 08:25:52.869577] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:40.569 [2024-07-23 08:25:52.869638] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:40.569 [2024-07-23 08:25:52.869657] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000035a80 name raid_bdev1, state offline 00:12:40.569 08:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:40.569 08:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:12:40.569 08:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:12:40.569 08:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:12:40.569 08:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:40.569 08:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:40.828 08:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:40.828 08:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:41.144 08:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:12:41.144 08:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:12:41.144 08:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:12:41.144 08:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:12:41.144 08:25:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:12:41.144 08:25:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n 
raid_bdev1 00:12:41.144 08:25:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:41.144 08:25:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:41.144 08:25:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:41.144 08:25:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:41.144 08:25:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:41.144 08:25:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:41.144 08:25:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:41.144 08:25:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:12:41.144 08:25:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:12:41.402 [2024-07-23 08:25:53.715719] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:12:41.402 [2024-07-23 08:25:53.717321] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:12:41.402 [2024-07-23 08:25:53.717380] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:12:41.402 [2024-07-23 08:25:53.717422] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on 
bdev malloc2 00:12:41.402 [2024-07-23 08:25:53.717456] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:41.402 [2024-07-23 08:25:53.717470] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036080 name raid_bdev1, state configuring 00:12:41.402 request: 00:12:41.402 { 00:12:41.402 "name": "raid_bdev1", 00:12:41.402 "raid_level": "concat", 00:12:41.402 "base_bdevs": [ 00:12:41.402 "malloc1", 00:12:41.402 "malloc2" 00:12:41.402 ], 00:12:41.402 "strip_size_kb": 64, 00:12:41.402 "superblock": false, 00:12:41.402 "method": "bdev_raid_create", 00:12:41.402 "req_id": 1 00:12:41.402 } 00:12:41.402 Got JSON-RPC error response 00:12:41.402 response: 00:12:41.402 { 00:12:41.402 "code": -17, 00:12:41.402 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:12:41.402 } 00:12:41.402 08:25:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:12:41.402 08:25:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:12:41.402 08:25:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:12:41.402 08:25:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:12:41.402 08:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:41.402 08:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:12:41.402 08:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:12:41.402 08:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:12:41.402 08:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:41.661 
[2024-07-23 08:25:54.044488] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:41.661 [2024-07-23 08:25:54.044543] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:41.661 [2024-07-23 08:25:54.044578] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036680 00:12:41.661 [2024-07-23 08:25:54.044590] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:41.661 [2024-07-23 08:25:54.046620] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:41.661 [2024-07-23 08:25:54.046649] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:41.661 [2024-07-23 08:25:54.046729] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:41.661 [2024-07-23 08:25:54.046794] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:41.661 pt1 00:12:41.661 08:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 2 00:12:41.661 08:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:41.661 08:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:41.661 08:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:41.661 08:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:41.661 08:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:41.661 08:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:41.661 08:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:41.661 08:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:12:41.661 08:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:41.661 08:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:41.661 08:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:41.920 08:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:41.920 "name": "raid_bdev1", 00:12:41.920 "uuid": "fca2885c-3bdb-4ef1-9bc1-947604e29562", 00:12:41.920 "strip_size_kb": 64, 00:12:41.920 "state": "configuring", 00:12:41.920 "raid_level": "concat", 00:12:41.920 "superblock": true, 00:12:41.920 "num_base_bdevs": 2, 00:12:41.920 "num_base_bdevs_discovered": 1, 00:12:41.920 "num_base_bdevs_operational": 2, 00:12:41.920 "base_bdevs_list": [ 00:12:41.920 { 00:12:41.920 "name": "pt1", 00:12:41.920 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:41.920 "is_configured": true, 00:12:41.920 "data_offset": 2048, 00:12:41.920 "data_size": 63488 00:12:41.920 }, 00:12:41.920 { 00:12:41.920 "name": null, 00:12:41.920 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:41.920 "is_configured": false, 00:12:41.920 "data_offset": 2048, 00:12:41.920 "data_size": 63488 00:12:41.920 } 00:12:41.920 ] 00:12:41.920 }' 00:12:41.921 08:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:41.921 08:25:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:42.489 08:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:12:42.489 08:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:12:42.489 08:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:42.489 08:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:42.489 [2024-07-23 08:25:54.882693] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:42.489 [2024-07-23 08:25:54.882751] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:42.489 [2024-07-23 08:25:54.882770] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036f80 00:12:42.489 [2024-07-23 08:25:54.882780] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:42.489 [2024-07-23 08:25:54.883213] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:42.489 [2024-07-23 08:25:54.883231] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:42.489 [2024-07-23 08:25:54.883305] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:42.489 [2024-07-23 08:25:54.883328] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:42.489 [2024-07-23 08:25:54.883453] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000036c80 00:12:42.489 [2024-07-23 08:25:54.883467] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:42.489 [2024-07-23 08:25:54.883683] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bec0 00:12:42.489 [2024-07-23 08:25:54.883854] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000036c80 00:12:42.489 [2024-07-23 08:25:54.883863] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000036c80 00:12:42.489 [2024-07-23 08:25:54.883999] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:42.489 pt2 00:12:42.489 08:25:54 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@477 -- # (( i++ )) 00:12:42.489 08:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:42.489 08:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:42.489 08:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:42.489 08:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:42.489 08:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:42.489 08:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:42.489 08:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:42.489 08:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:42.489 08:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:42.489 08:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:42.489 08:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:42.489 08:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:42.489 08:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:42.748 08:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:42.748 "name": "raid_bdev1", 00:12:42.748 "uuid": "fca2885c-3bdb-4ef1-9bc1-947604e29562", 00:12:42.748 "strip_size_kb": 64, 00:12:42.748 "state": "online", 00:12:42.748 "raid_level": "concat", 00:12:42.748 "superblock": true, 00:12:42.748 "num_base_bdevs": 2, 00:12:42.748 "num_base_bdevs_discovered": 2, 00:12:42.748 
"num_base_bdevs_operational": 2, 00:12:42.748 "base_bdevs_list": [ 00:12:42.748 { 00:12:42.748 "name": "pt1", 00:12:42.748 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:42.748 "is_configured": true, 00:12:42.748 "data_offset": 2048, 00:12:42.748 "data_size": 63488 00:12:42.748 }, 00:12:42.748 { 00:12:42.748 "name": "pt2", 00:12:42.748 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:42.748 "is_configured": true, 00:12:42.748 "data_offset": 2048, 00:12:42.748 "data_size": 63488 00:12:42.748 } 00:12:42.748 ] 00:12:42.748 }' 00:12:42.748 08:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:42.748 08:25:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:43.315 08:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:12:43.315 08:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:43.315 08:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:43.315 08:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:43.315 08:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:43.315 08:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:43.315 08:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:43.315 08:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:43.315 [2024-07-23 08:25:55.681139] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:43.316 08:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:43.316 "name": "raid_bdev1", 00:12:43.316 "aliases": [ 00:12:43.316 
"fca2885c-3bdb-4ef1-9bc1-947604e29562" 00:12:43.316 ], 00:12:43.316 "product_name": "Raid Volume", 00:12:43.316 "block_size": 512, 00:12:43.316 "num_blocks": 126976, 00:12:43.316 "uuid": "fca2885c-3bdb-4ef1-9bc1-947604e29562", 00:12:43.316 "assigned_rate_limits": { 00:12:43.316 "rw_ios_per_sec": 0, 00:12:43.316 "rw_mbytes_per_sec": 0, 00:12:43.316 "r_mbytes_per_sec": 0, 00:12:43.316 "w_mbytes_per_sec": 0 00:12:43.316 }, 00:12:43.316 "claimed": false, 00:12:43.316 "zoned": false, 00:12:43.316 "supported_io_types": { 00:12:43.316 "read": true, 00:12:43.316 "write": true, 00:12:43.316 "unmap": true, 00:12:43.316 "flush": true, 00:12:43.316 "reset": true, 00:12:43.316 "nvme_admin": false, 00:12:43.316 "nvme_io": false, 00:12:43.316 "nvme_io_md": false, 00:12:43.316 "write_zeroes": true, 00:12:43.316 "zcopy": false, 00:12:43.316 "get_zone_info": false, 00:12:43.316 "zone_management": false, 00:12:43.316 "zone_append": false, 00:12:43.316 "compare": false, 00:12:43.316 "compare_and_write": false, 00:12:43.316 "abort": false, 00:12:43.316 "seek_hole": false, 00:12:43.316 "seek_data": false, 00:12:43.316 "copy": false, 00:12:43.316 "nvme_iov_md": false 00:12:43.316 }, 00:12:43.316 "memory_domains": [ 00:12:43.316 { 00:12:43.316 "dma_device_id": "system", 00:12:43.316 "dma_device_type": 1 00:12:43.316 }, 00:12:43.316 { 00:12:43.316 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:43.316 "dma_device_type": 2 00:12:43.316 }, 00:12:43.316 { 00:12:43.316 "dma_device_id": "system", 00:12:43.316 "dma_device_type": 1 00:12:43.316 }, 00:12:43.316 { 00:12:43.316 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:43.316 "dma_device_type": 2 00:12:43.316 } 00:12:43.316 ], 00:12:43.316 "driver_specific": { 00:12:43.316 "raid": { 00:12:43.316 "uuid": "fca2885c-3bdb-4ef1-9bc1-947604e29562", 00:12:43.316 "strip_size_kb": 64, 00:12:43.316 "state": "online", 00:12:43.316 "raid_level": "concat", 00:12:43.316 "superblock": true, 00:12:43.316 "num_base_bdevs": 2, 00:12:43.316 
"num_base_bdevs_discovered": 2, 00:12:43.316 "num_base_bdevs_operational": 2, 00:12:43.316 "base_bdevs_list": [ 00:12:43.316 { 00:12:43.316 "name": "pt1", 00:12:43.316 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:43.316 "is_configured": true, 00:12:43.316 "data_offset": 2048, 00:12:43.316 "data_size": 63488 00:12:43.316 }, 00:12:43.316 { 00:12:43.316 "name": "pt2", 00:12:43.316 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:43.316 "is_configured": true, 00:12:43.316 "data_offset": 2048, 00:12:43.316 "data_size": 63488 00:12:43.316 } 00:12:43.316 ] 00:12:43.316 } 00:12:43.316 } 00:12:43.316 }' 00:12:43.316 08:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:43.316 08:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:43.316 pt2' 00:12:43.316 08:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:43.316 08:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:43.316 08:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:43.575 08:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:43.575 "name": "pt1", 00:12:43.575 "aliases": [ 00:12:43.575 "00000000-0000-0000-0000-000000000001" 00:12:43.575 ], 00:12:43.575 "product_name": "passthru", 00:12:43.575 "block_size": 512, 00:12:43.575 "num_blocks": 65536, 00:12:43.575 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:43.575 "assigned_rate_limits": { 00:12:43.575 "rw_ios_per_sec": 0, 00:12:43.575 "rw_mbytes_per_sec": 0, 00:12:43.575 "r_mbytes_per_sec": 0, 00:12:43.575 "w_mbytes_per_sec": 0 00:12:43.575 }, 00:12:43.575 "claimed": true, 00:12:43.575 "claim_type": "exclusive_write", 00:12:43.575 "zoned": false, 
00:12:43.575 "supported_io_types": { 00:12:43.575 "read": true, 00:12:43.575 "write": true, 00:12:43.575 "unmap": true, 00:12:43.575 "flush": true, 00:12:43.575 "reset": true, 00:12:43.575 "nvme_admin": false, 00:12:43.575 "nvme_io": false, 00:12:43.575 "nvme_io_md": false, 00:12:43.575 "write_zeroes": true, 00:12:43.575 "zcopy": true, 00:12:43.575 "get_zone_info": false, 00:12:43.575 "zone_management": false, 00:12:43.575 "zone_append": false, 00:12:43.575 "compare": false, 00:12:43.575 "compare_and_write": false, 00:12:43.575 "abort": true, 00:12:43.575 "seek_hole": false, 00:12:43.575 "seek_data": false, 00:12:43.575 "copy": true, 00:12:43.575 "nvme_iov_md": false 00:12:43.575 }, 00:12:43.575 "memory_domains": [ 00:12:43.575 { 00:12:43.575 "dma_device_id": "system", 00:12:43.575 "dma_device_type": 1 00:12:43.575 }, 00:12:43.575 { 00:12:43.575 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:43.575 "dma_device_type": 2 00:12:43.575 } 00:12:43.575 ], 00:12:43.575 "driver_specific": { 00:12:43.575 "passthru": { 00:12:43.575 "name": "pt1", 00:12:43.575 "base_bdev_name": "malloc1" 00:12:43.575 } 00:12:43.575 } 00:12:43.575 }' 00:12:43.575 08:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:43.575 08:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:43.575 08:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:43.576 08:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:43.576 08:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:43.576 08:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:43.576 08:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:43.576 08:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:43.834 08:25:56 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:43.834 08:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:43.834 08:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:43.834 08:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:43.834 08:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:43.834 08:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:43.834 08:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:44.095 08:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:44.095 "name": "pt2", 00:12:44.095 "aliases": [ 00:12:44.095 "00000000-0000-0000-0000-000000000002" 00:12:44.095 ], 00:12:44.095 "product_name": "passthru", 00:12:44.095 "block_size": 512, 00:12:44.095 "num_blocks": 65536, 00:12:44.095 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:44.095 "assigned_rate_limits": { 00:12:44.095 "rw_ios_per_sec": 0, 00:12:44.095 "rw_mbytes_per_sec": 0, 00:12:44.095 "r_mbytes_per_sec": 0, 00:12:44.095 "w_mbytes_per_sec": 0 00:12:44.095 }, 00:12:44.095 "claimed": true, 00:12:44.095 "claim_type": "exclusive_write", 00:12:44.095 "zoned": false, 00:12:44.095 "supported_io_types": { 00:12:44.095 "read": true, 00:12:44.095 "write": true, 00:12:44.095 "unmap": true, 00:12:44.095 "flush": true, 00:12:44.095 "reset": true, 00:12:44.095 "nvme_admin": false, 00:12:44.095 "nvme_io": false, 00:12:44.095 "nvme_io_md": false, 00:12:44.095 "write_zeroes": true, 00:12:44.095 "zcopy": true, 00:12:44.095 "get_zone_info": false, 00:12:44.095 "zone_management": false, 00:12:44.095 "zone_append": false, 00:12:44.095 "compare": false, 00:12:44.095 "compare_and_write": false, 00:12:44.095 "abort": true, 00:12:44.095 
"seek_hole": false, 00:12:44.095 "seek_data": false, 00:12:44.095 "copy": true, 00:12:44.095 "nvme_iov_md": false 00:12:44.095 }, 00:12:44.095 "memory_domains": [ 00:12:44.095 { 00:12:44.095 "dma_device_id": "system", 00:12:44.095 "dma_device_type": 1 00:12:44.095 }, 00:12:44.095 { 00:12:44.095 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:44.095 "dma_device_type": 2 00:12:44.095 } 00:12:44.095 ], 00:12:44.095 "driver_specific": { 00:12:44.095 "passthru": { 00:12:44.095 "name": "pt2", 00:12:44.095 "base_bdev_name": "malloc2" 00:12:44.095 } 00:12:44.095 } 00:12:44.095 }' 00:12:44.095 08:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:44.095 08:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:44.095 08:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:44.095 08:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:44.095 08:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:44.095 08:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:44.095 08:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:44.095 08:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:44.095 08:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:44.096 08:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:44.355 08:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:44.355 08:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:44.355 08:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:12:44.355 08:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:44.355 [2024-07-23 08:25:56.840284] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:44.355 08:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' fca2885c-3bdb-4ef1-9bc1-947604e29562 '!=' fca2885c-3bdb-4ef1-9bc1-947604e29562 ']' 00:12:44.355 08:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:12:44.355 08:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:44.355 08:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:44.355 08:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1410201 00:12:44.355 08:25:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1410201 ']' 00:12:44.355 08:25:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1410201 00:12:44.355 08:25:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:12:44.355 08:25:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:44.355 08:25:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1410201 00:12:44.614 08:25:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:44.614 08:25:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:44.614 08:25:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1410201' 00:12:44.614 killing process with pid 1410201 00:12:44.614 08:25:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1410201 00:12:44.614 [2024-07-23 08:25:56.901814] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:44.614 
[2024-07-23 08:25:56.901910] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:44.614 [2024-07-23 08:25:56.901959] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:44.614 [2024-07-23 08:25:56.901972] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036c80 name raid_bdev1, state offline 00:12:44.614 08:25:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1410201 00:12:44.614 [2024-07-23 08:25:57.048651] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:46.002 08:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:12:46.002 00:12:46.002 real 0m9.466s 00:12:46.002 user 0m15.835s 00:12:46.002 sys 0m1.468s 00:12:46.002 08:25:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:46.002 08:25:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:46.002 ************************************ 00:12:46.002 END TEST raid_superblock_test 00:12:46.002 ************************************ 00:12:46.002 08:25:58 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:46.002 08:25:58 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 2 read 00:12:46.002 08:25:58 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:46.002 08:25:58 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:46.002 08:25:58 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:46.002 ************************************ 00:12:46.002 START TEST raid_read_error_test 00:12:46.002 ************************************ 00:12:46.002 08:25:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 read 00:12:46.002 08:25:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:12:46.002 08:25:58 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:46.002 08:25:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:12:46.002 08:25:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:46.002 08:25:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:46.002 08:25:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:46.002 08:25:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:46.002 08:25:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:46.002 08:25:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:46.002 08:25:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:46.002 08:25:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:46.002 08:25:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:46.002 08:25:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:46.002 08:25:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:46.002 08:25:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:46.003 08:25:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:46.003 08:25:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:46.003 08:25:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:46.003 08:25:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:12:46.003 08:25:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:12:46.003 08:25:58 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:12:46.003 08:25:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:46.003 08:25:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.VtCZ5TXnOu 00:12:46.003 08:25:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:46.003 08:25:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1412205 00:12:46.003 08:25:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1412205 /var/tmp/spdk-raid.sock 00:12:46.003 08:25:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1412205 ']' 00:12:46.003 08:25:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:46.003 08:25:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:46.003 08:25:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:46.003 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:46.003 08:25:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:46.003 08:25:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:46.003 [2024-07-23 08:25:58.474062] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:12:46.003 [2024-07-23 08:25:58.474157] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1412205 ] 00:12:46.261 [2024-07-23 08:25:58.598468] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:46.520 [2024-07-23 08:25:58.819883] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:46.779 [2024-07-23 08:25:59.070879] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:46.779 [2024-07-23 08:25:59.070914] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:46.779 08:25:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:46.779 08:25:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:46.779 08:25:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:46.779 08:25:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:47.038 BaseBdev1_malloc 00:12:47.038 08:25:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:47.296 true 00:12:47.296 08:25:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:47.296 [2024-07-23 08:25:59.801852] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:47.296 [2024-07-23 08:25:59.801912] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:12:47.296 [2024-07-23 08:25:59.801933] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034b80 00:12:47.296 [2024-07-23 08:25:59.801945] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:47.296 [2024-07-23 08:25:59.804027] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:47.296 [2024-07-23 08:25:59.804060] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:47.296 BaseBdev1 00:12:47.555 08:25:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:47.555 08:25:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:47.555 BaseBdev2_malloc 00:12:47.555 08:26:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:47.814 true 00:12:47.814 08:26:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:48.072 [2024-07-23 08:26:00.365897] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:48.072 [2024-07-23 08:26:00.365949] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:48.072 [2024-07-23 08:26:00.365968] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035a80 00:12:48.072 [2024-07-23 08:26:00.365981] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:48.072 [2024-07-23 08:26:00.367968] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:48.072 [2024-07-23 
08:26:00.368003] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:48.072 BaseBdev2 00:12:48.072 08:26:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:48.072 [2024-07-23 08:26:00.538400] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:48.072 [2024-07-23 08:26:00.540073] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:48.072 [2024-07-23 08:26:00.540285] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000036080 00:12:48.072 [2024-07-23 08:26:00.540300] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:48.072 [2024-07-23 08:26:00.540561] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bdf0 00:12:48.072 [2024-07-23 08:26:00.540787] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000036080 00:12:48.072 [2024-07-23 08:26:00.540798] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000036080 00:12:48.072 [2024-07-23 08:26:00.540969] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:48.073 08:26:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:48.073 08:26:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:48.073 08:26:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:48.073 08:26:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:48.073 08:26:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:48.073 08:26:00 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:48.073 08:26:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:48.073 08:26:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:48.073 08:26:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:48.073 08:26:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:48.073 08:26:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:48.073 08:26:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:48.331 08:26:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:48.331 "name": "raid_bdev1", 00:12:48.331 "uuid": "274d7a4a-221f-48af-9149-ac7673cf7ad9", 00:12:48.331 "strip_size_kb": 64, 00:12:48.331 "state": "online", 00:12:48.331 "raid_level": "concat", 00:12:48.331 "superblock": true, 00:12:48.331 "num_base_bdevs": 2, 00:12:48.331 "num_base_bdevs_discovered": 2, 00:12:48.331 "num_base_bdevs_operational": 2, 00:12:48.331 "base_bdevs_list": [ 00:12:48.331 { 00:12:48.331 "name": "BaseBdev1", 00:12:48.331 "uuid": "c0358d5a-9c66-5c3d-a9a3-f88dc9c751ff", 00:12:48.331 "is_configured": true, 00:12:48.331 "data_offset": 2048, 00:12:48.331 "data_size": 63488 00:12:48.331 }, 00:12:48.331 { 00:12:48.332 "name": "BaseBdev2", 00:12:48.332 "uuid": "1d751316-9caf-50e3-abce-59c7169b1495", 00:12:48.332 "is_configured": true, 00:12:48.332 "data_offset": 2048, 00:12:48.332 "data_size": 63488 00:12:48.332 } 00:12:48.332 ] 00:12:48.332 }' 00:12:48.332 08:26:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:48.332 08:26:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- 
# set +x 00:12:48.899 08:26:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:48.899 08:26:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:48.899 [2024-07-23 08:26:01.245897] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bf90 00:12:49.836 08:26:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:12:49.836 08:26:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:49.836 08:26:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:12:49.836 08:26:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:12:49.836 08:26:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:49.836 08:26:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:49.836 08:26:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:49.836 08:26:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:49.836 08:26:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:49.836 08:26:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:49.836 08:26:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:49.836 08:26:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:49.836 08:26:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:12:49.836 08:26:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:49.836 08:26:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:49.836 08:26:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:50.095 08:26:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:50.095 "name": "raid_bdev1", 00:12:50.095 "uuid": "274d7a4a-221f-48af-9149-ac7673cf7ad9", 00:12:50.095 "strip_size_kb": 64, 00:12:50.095 "state": "online", 00:12:50.095 "raid_level": "concat", 00:12:50.095 "superblock": true, 00:12:50.095 "num_base_bdevs": 2, 00:12:50.095 "num_base_bdevs_discovered": 2, 00:12:50.095 "num_base_bdevs_operational": 2, 00:12:50.095 "base_bdevs_list": [ 00:12:50.095 { 00:12:50.095 "name": "BaseBdev1", 00:12:50.095 "uuid": "c0358d5a-9c66-5c3d-a9a3-f88dc9c751ff", 00:12:50.095 "is_configured": true, 00:12:50.095 "data_offset": 2048, 00:12:50.095 "data_size": 63488 00:12:50.095 }, 00:12:50.095 { 00:12:50.095 "name": "BaseBdev2", 00:12:50.095 "uuid": "1d751316-9caf-50e3-abce-59c7169b1495", 00:12:50.095 "is_configured": true, 00:12:50.095 "data_offset": 2048, 00:12:50.095 "data_size": 63488 00:12:50.095 } 00:12:50.095 ] 00:12:50.095 }' 00:12:50.095 08:26:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:50.095 08:26:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:50.662 08:26:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:50.662 [2024-07-23 08:26:03.166934] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:50.662 [2024-07-23 08:26:03.166971] 
bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:50.662 [2024-07-23 08:26:03.169414] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:50.662 [2024-07-23 08:26:03.169454] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:50.662 [2024-07-23 08:26:03.169485] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:50.662 [2024-07-23 08:26:03.169496] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036080 name raid_bdev1, state offline 00:12:50.662 0 00:12:50.920 08:26:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1412205 00:12:50.920 08:26:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1412205 ']' 00:12:50.920 08:26:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1412205 00:12:50.920 08:26:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:12:50.920 08:26:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:50.920 08:26:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1412205 00:12:50.920 08:26:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:50.920 08:26:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:50.920 08:26:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1412205' 00:12:50.920 killing process with pid 1412205 00:12:50.920 08:26:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1412205 00:12:50.920 [2024-07-23 08:26:03.233160] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:50.920 08:26:03 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@972 -- # wait 1412205 00:12:50.920 [2024-07-23 08:26:03.306235] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:52.310 08:26:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.VtCZ5TXnOu 00:12:52.310 08:26:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:52.310 08:26:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:52.310 08:26:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:12:52.310 08:26:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:12:52.310 08:26:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:52.310 08:26:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:52.310 08:26:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:12:52.310 00:12:52.310 real 0m6.257s 00:12:52.310 user 0m8.809s 00:12:52.310 sys 0m0.816s 00:12:52.310 08:26:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:52.310 08:26:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:52.310 ************************************ 00:12:52.310 END TEST raid_read_error_test 00:12:52.310 ************************************ 00:12:52.310 08:26:04 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:52.310 08:26:04 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 2 write 00:12:52.310 08:26:04 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:52.310 08:26:04 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:52.310 08:26:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:52.310 ************************************ 00:12:52.310 START TEST raid_write_error_test 00:12:52.310 
************************************ 00:12:52.310 08:26:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 write 00:12:52.310 08:26:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:12:52.310 08:26:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:52.310 08:26:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:12:52.310 08:26:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:52.310 08:26:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:52.310 08:26:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:52.310 08:26:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:52.310 08:26:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:52.310 08:26:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:52.310 08:26:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:52.310 08:26:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:52.310 08:26:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:52.310 08:26:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:52.310 08:26:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:52.310 08:26:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:52.311 08:26:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:52.311 08:26:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:52.311 08:26:04 bdev_raid.raid_write_error_test 
-- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:52.311 08:26:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:12:52.311 08:26:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:12:52.311 08:26:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:12:52.311 08:26:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:52.311 08:26:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.ScPsVHRc7E 00:12:52.311 08:26:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1413397 00:12:52.311 08:26:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1413397 /var/tmp/spdk-raid.sock 00:12:52.311 08:26:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:52.311 08:26:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1413397 ']' 00:12:52.311 08:26:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:52.311 08:26:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:52.311 08:26:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:52.311 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:12:52.311 08:26:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:52.311 08:26:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:52.311 [2024-07-23 08:26:04.805948] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:12:52.311 [2024-07-23 08:26:04.806036] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1413397 ] 00:12:52.570 [2024-07-23 08:26:04.928447] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:52.829 [2024-07-23 08:26:05.140697] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:53.088 [2024-07-23 08:26:05.409346] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:53.088 [2024-07-23 08:26:05.409378] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:53.088 08:26:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:53.088 08:26:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:53.088 08:26:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:53.088 08:26:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:53.347 BaseBdev1_malloc 00:12:53.347 08:26:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:53.642 true 00:12:53.642 08:26:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:53.642 [2024-07-23 08:26:06.095605] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:53.642 [2024-07-23 08:26:06.095666] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:53.642 [2024-07-23 08:26:06.095701] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034b80 00:12:53.642 [2024-07-23 08:26:06.095713] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:53.642 [2024-07-23 08:26:06.097766] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:53.642 [2024-07-23 08:26:06.097797] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:53.642 BaseBdev1 00:12:53.642 08:26:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:53.642 08:26:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:53.906 BaseBdev2_malloc 00:12:53.906 08:26:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:54.164 true 00:12:54.164 08:26:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:54.164 [2024-07-23 08:26:06.648327] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:54.164 [2024-07-23 08:26:06.648381] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:54.164 
[2024-07-23 08:26:06.648400] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035a80 00:12:54.164 [2024-07-23 08:26:06.648411] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:54.164 [2024-07-23 08:26:06.650486] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:54.164 [2024-07-23 08:26:06.650517] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:54.164 BaseBdev2 00:12:54.164 08:26:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:54.423 [2024-07-23 08:26:06.816835] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:54.423 [2024-07-23 08:26:06.818523] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:54.423 [2024-07-23 08:26:06.818729] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000036080 00:12:54.423 [2024-07-23 08:26:06.818747] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:54.423 [2024-07-23 08:26:06.819006] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bdf0 00:12:54.423 [2024-07-23 08:26:06.819208] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000036080 00:12:54.423 [2024-07-23 08:26:06.819218] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000036080 00:12:54.423 [2024-07-23 08:26:06.819378] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:54.423 08:26:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:54.423 08:26:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 
-- # local raid_bdev_name=raid_bdev1 00:12:54.423 08:26:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:54.423 08:26:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:54.423 08:26:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:54.423 08:26:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:54.423 08:26:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:54.423 08:26:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:54.423 08:26:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:54.423 08:26:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:54.423 08:26:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:54.423 08:26:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:54.682 08:26:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:54.682 "name": "raid_bdev1", 00:12:54.682 "uuid": "2199019e-8217-4d4f-b368-c9656f1e2d4d", 00:12:54.682 "strip_size_kb": 64, 00:12:54.682 "state": "online", 00:12:54.682 "raid_level": "concat", 00:12:54.682 "superblock": true, 00:12:54.682 "num_base_bdevs": 2, 00:12:54.682 "num_base_bdevs_discovered": 2, 00:12:54.682 "num_base_bdevs_operational": 2, 00:12:54.682 "base_bdevs_list": [ 00:12:54.682 { 00:12:54.682 "name": "BaseBdev1", 00:12:54.682 "uuid": "a216de57-ee39-5c07-9a13-24c6b1b42c42", 00:12:54.682 "is_configured": true, 00:12:54.682 "data_offset": 2048, 00:12:54.682 "data_size": 63488 00:12:54.682 }, 00:12:54.682 { 00:12:54.682 "name": "BaseBdev2", 
00:12:54.682 "uuid": "40f4e785-06b0-5433-bf6d-baac42b52d6a", 00:12:54.682 "is_configured": true, 00:12:54.682 "data_offset": 2048, 00:12:54.682 "data_size": 63488 00:12:54.682 } 00:12:54.682 ] 00:12:54.682 }' 00:12:54.682 08:26:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:54.682 08:26:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:55.250 08:26:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:55.250 08:26:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:55.250 [2024-07-23 08:26:07.572253] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bf90 00:12:56.188 08:26:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:12:56.188 08:26:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:56.188 08:26:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:12:56.188 08:26:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:12:56.188 08:26:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:56.188 08:26:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:56.188 08:26:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:56.188 08:26:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:56.188 08:26:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:56.188 08:26:08 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:56.188 08:26:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:56.188 08:26:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:56.188 08:26:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:56.188 08:26:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:56.188 08:26:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:56.188 08:26:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:56.447 08:26:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:56.447 "name": "raid_bdev1", 00:12:56.447 "uuid": "2199019e-8217-4d4f-b368-c9656f1e2d4d", 00:12:56.447 "strip_size_kb": 64, 00:12:56.447 "state": "online", 00:12:56.447 "raid_level": "concat", 00:12:56.447 "superblock": true, 00:12:56.447 "num_base_bdevs": 2, 00:12:56.447 "num_base_bdevs_discovered": 2, 00:12:56.447 "num_base_bdevs_operational": 2, 00:12:56.447 "base_bdevs_list": [ 00:12:56.447 { 00:12:56.447 "name": "BaseBdev1", 00:12:56.447 "uuid": "a216de57-ee39-5c07-9a13-24c6b1b42c42", 00:12:56.447 "is_configured": true, 00:12:56.447 "data_offset": 2048, 00:12:56.447 "data_size": 63488 00:12:56.447 }, 00:12:56.447 { 00:12:56.447 "name": "BaseBdev2", 00:12:56.447 "uuid": "40f4e785-06b0-5433-bf6d-baac42b52d6a", 00:12:56.447 "is_configured": true, 00:12:56.447 "data_offset": 2048, 00:12:56.447 "data_size": 63488 00:12:56.447 } 00:12:56.447 ] 00:12:56.447 }' 00:12:56.447 08:26:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:56.447 08:26:08 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:12:57.015 08:26:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:57.015 [2024-07-23 08:26:09.492697] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:57.015 [2024-07-23 08:26:09.492736] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:57.015 [2024-07-23 08:26:09.495003] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:57.015 [2024-07-23 08:26:09.495043] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:57.015 [2024-07-23 08:26:09.495073] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:57.015 [2024-07-23 08:26:09.495084] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036080 name raid_bdev1, state offline 00:12:57.015 0 00:12:57.015 08:26:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1413397 00:12:57.015 08:26:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1413397 ']' 00:12:57.015 08:26:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1413397 00:12:57.015 08:26:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:12:57.015 08:26:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:57.015 08:26:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1413397 00:12:57.274 08:26:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:57.274 08:26:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:57.274 08:26:09 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 1413397' 00:12:57.274 killing process with pid 1413397 00:12:57.274 08:26:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1413397 00:12:57.274 [2024-07-23 08:26:09.541509] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:57.274 08:26:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1413397 00:12:57.274 [2024-07-23 08:26:09.617667] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:58.653 08:26:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.ScPsVHRc7E 00:12:58.653 08:26:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:58.653 08:26:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:58.653 08:26:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:12:58.653 08:26:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:12:58.653 08:26:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:58.653 08:26:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:58.653 08:26:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:12:58.653 00:12:58.653 real 0m6.230s 00:12:58.653 user 0m8.691s 00:12:58.653 sys 0m0.860s 00:12:58.653 08:26:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:58.653 08:26:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:58.653 ************************************ 00:12:58.653 END TEST raid_write_error_test 00:12:58.653 ************************************ 00:12:58.653 08:26:10 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:58.653 08:26:10 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:12:58.653 
08:26:10 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 2 false 00:12:58.653 08:26:10 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:58.653 08:26:10 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:58.653 08:26:10 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:58.653 ************************************ 00:12:58.653 START TEST raid_state_function_test 00:12:58.653 ************************************ 00:12:58.653 08:26:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 false 00:12:58.653 08:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:12:58.653 08:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:58.653 08:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:12:58.653 08:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:58.653 08:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:58.653 08:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:58.653 08:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:58.653 08:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:58.653 08:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:58.653 08:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:58.653 08:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:58.653 08:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:58.653 08:26:11 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:58.653 08:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:58.653 08:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:58.653 08:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:58.653 08:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:58.653 08:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:58.653 08:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:12:58.653 08:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:12:58.653 08:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:12:58.653 08:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:12:58.653 08:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1414731 00:12:58.653 08:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1414731' 00:12:58.653 Process raid pid: 1414731 00:12:58.653 08:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:58.653 08:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1414731 /var/tmp/spdk-raid.sock 00:12:58.653 08:26:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1414731 ']' 00:12:58.653 08:26:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:58.653 08:26:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # 
local max_retries=100 00:12:58.653 08:26:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:58.653 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:58.653 08:26:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:58.653 08:26:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:58.653 [2024-07-23 08:26:11.113292] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:12:58.653 [2024-07-23 08:26:11.113400] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:58.912 [2024-07-23 08:26:11.240598] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:59.171 [2024-07-23 08:26:11.468529] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:59.430 [2024-07-23 08:26:11.738869] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:59.430 [2024-07-23 08:26:11.738906] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:59.430 08:26:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:59.430 08:26:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:12:59.430 08:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:59.689 [2024-07-23 08:26:12.049727] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:59.689 [2024-07-23 08:26:12.049783] 
bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:59.689 [2024-07-23 08:26:12.049793] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:59.689 [2024-07-23 08:26:12.049805] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:59.689 08:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:59.689 08:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:59.689 08:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:59.689 08:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:59.689 08:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:59.689 08:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:59.689 08:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:59.689 08:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:59.689 08:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:59.689 08:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:59.689 08:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:59.689 08:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:59.948 08:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:59.948 "name": "Existed_Raid", 00:12:59.948 
"uuid": "00000000-0000-0000-0000-000000000000", 00:12:59.948 "strip_size_kb": 0, 00:12:59.948 "state": "configuring", 00:12:59.948 "raid_level": "raid1", 00:12:59.948 "superblock": false, 00:12:59.948 "num_base_bdevs": 2, 00:12:59.948 "num_base_bdevs_discovered": 0, 00:12:59.948 "num_base_bdevs_operational": 2, 00:12:59.948 "base_bdevs_list": [ 00:12:59.948 { 00:12:59.948 "name": "BaseBdev1", 00:12:59.948 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:59.948 "is_configured": false, 00:12:59.948 "data_offset": 0, 00:12:59.948 "data_size": 0 00:12:59.948 }, 00:12:59.948 { 00:12:59.948 "name": "BaseBdev2", 00:12:59.948 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:59.948 "is_configured": false, 00:12:59.948 "data_offset": 0, 00:12:59.948 "data_size": 0 00:12:59.948 } 00:12:59.948 ] 00:12:59.948 }' 00:12:59.948 08:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:59.948 08:26:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:00.516 08:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:00.516 [2024-07-23 08:26:12.879779] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:00.516 [2024-07-23 08:26:12.879829] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034580 name Existed_Raid, state configuring 00:13:00.516 08:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:00.775 [2024-07-23 08:26:13.056250] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:00.775 [2024-07-23 08:26:13.056288] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 
doesn't exist now 00:13:00.775 [2024-07-23 08:26:13.056297] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:00.775 [2024-07-23 08:26:13.056306] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:00.775 08:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:00.775 [2024-07-23 08:26:13.282254] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:00.775 BaseBdev1 00:13:01.033 08:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:01.033 08:26:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:01.033 08:26:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:01.033 08:26:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:01.033 08:26:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:01.033 08:26:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:01.033 08:26:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:01.033 08:26:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:01.292 [ 00:13:01.292 { 00:13:01.292 "name": "BaseBdev1", 00:13:01.292 "aliases": [ 00:13:01.292 "a5c395a2-f3b3-4360-b784-f7f4324ad671" 00:13:01.292 ], 00:13:01.292 "product_name": "Malloc disk", 00:13:01.292 "block_size": 512, 00:13:01.292 
"num_blocks": 65536, 00:13:01.292 "uuid": "a5c395a2-f3b3-4360-b784-f7f4324ad671", 00:13:01.292 "assigned_rate_limits": { 00:13:01.292 "rw_ios_per_sec": 0, 00:13:01.292 "rw_mbytes_per_sec": 0, 00:13:01.292 "r_mbytes_per_sec": 0, 00:13:01.292 "w_mbytes_per_sec": 0 00:13:01.292 }, 00:13:01.292 "claimed": true, 00:13:01.292 "claim_type": "exclusive_write", 00:13:01.292 "zoned": false, 00:13:01.292 "supported_io_types": { 00:13:01.292 "read": true, 00:13:01.292 "write": true, 00:13:01.292 "unmap": true, 00:13:01.292 "flush": true, 00:13:01.292 "reset": true, 00:13:01.292 "nvme_admin": false, 00:13:01.292 "nvme_io": false, 00:13:01.292 "nvme_io_md": false, 00:13:01.292 "write_zeroes": true, 00:13:01.292 "zcopy": true, 00:13:01.292 "get_zone_info": false, 00:13:01.292 "zone_management": false, 00:13:01.292 "zone_append": false, 00:13:01.292 "compare": false, 00:13:01.292 "compare_and_write": false, 00:13:01.292 "abort": true, 00:13:01.292 "seek_hole": false, 00:13:01.292 "seek_data": false, 00:13:01.292 "copy": true, 00:13:01.292 "nvme_iov_md": false 00:13:01.292 }, 00:13:01.292 "memory_domains": [ 00:13:01.292 { 00:13:01.292 "dma_device_id": "system", 00:13:01.292 "dma_device_type": 1 00:13:01.292 }, 00:13:01.292 { 00:13:01.292 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:01.292 "dma_device_type": 2 00:13:01.292 } 00:13:01.292 ], 00:13:01.292 "driver_specific": {} 00:13:01.292 } 00:13:01.292 ] 00:13:01.292 08:26:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:01.292 08:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:01.292 08:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:01.292 08:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:01.292 08:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid1 00:13:01.292 08:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:01.292 08:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:01.292 08:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:01.292 08:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:01.292 08:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:01.292 08:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:01.292 08:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:01.292 08:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:01.551 08:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:01.551 "name": "Existed_Raid", 00:13:01.551 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:01.551 "strip_size_kb": 0, 00:13:01.551 "state": "configuring", 00:13:01.551 "raid_level": "raid1", 00:13:01.551 "superblock": false, 00:13:01.551 "num_base_bdevs": 2, 00:13:01.551 "num_base_bdevs_discovered": 1, 00:13:01.551 "num_base_bdevs_operational": 2, 00:13:01.551 "base_bdevs_list": [ 00:13:01.551 { 00:13:01.551 "name": "BaseBdev1", 00:13:01.551 "uuid": "a5c395a2-f3b3-4360-b784-f7f4324ad671", 00:13:01.551 "is_configured": true, 00:13:01.551 "data_offset": 0, 00:13:01.551 "data_size": 65536 00:13:01.551 }, 00:13:01.551 { 00:13:01.551 "name": "BaseBdev2", 00:13:01.551 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:01.551 "is_configured": false, 00:13:01.551 "data_offset": 0, 00:13:01.551 "data_size": 0 00:13:01.551 } 00:13:01.551 ] 00:13:01.551 }' 
00:13:01.551 08:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:01.551 08:26:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:01.921 08:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:02.180 [2024-07-23 08:26:14.489500] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:02.180 [2024-07-23 08:26:14.489553] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034880 name Existed_Raid, state configuring 00:13:02.180 08:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:02.180 [2024-07-23 08:26:14.665992] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:02.180 [2024-07-23 08:26:14.667689] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:02.180 [2024-07-23 08:26:14.667726] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:02.180 08:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:02.180 08:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:02.180 08:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:02.180 08:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:02.180 08:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:02.180 08:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # 
local raid_level=raid1 00:13:02.180 08:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:02.180 08:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:02.180 08:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:02.180 08:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:02.180 08:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:02.180 08:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:02.180 08:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:02.180 08:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:02.439 08:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:02.439 "name": "Existed_Raid", 00:13:02.439 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:02.439 "strip_size_kb": 0, 00:13:02.439 "state": "configuring", 00:13:02.439 "raid_level": "raid1", 00:13:02.439 "superblock": false, 00:13:02.439 "num_base_bdevs": 2, 00:13:02.439 "num_base_bdevs_discovered": 1, 00:13:02.439 "num_base_bdevs_operational": 2, 00:13:02.439 "base_bdevs_list": [ 00:13:02.439 { 00:13:02.439 "name": "BaseBdev1", 00:13:02.439 "uuid": "a5c395a2-f3b3-4360-b784-f7f4324ad671", 00:13:02.439 "is_configured": true, 00:13:02.439 "data_offset": 0, 00:13:02.439 "data_size": 65536 00:13:02.439 }, 00:13:02.439 { 00:13:02.439 "name": "BaseBdev2", 00:13:02.439 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:02.439 "is_configured": false, 00:13:02.439 "data_offset": 0, 00:13:02.439 "data_size": 0 00:13:02.439 } 00:13:02.439 ] 00:13:02.439 }' 
00:13:02.439 08:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:02.439 08:26:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:03.019 08:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:03.278 [2024-07-23 08:26:15.548795] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:03.278 [2024-07-23 08:26:15.548866] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000035180 00:13:03.278 [2024-07-23 08:26:15.548876] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:13:03.278 [2024-07-23 08:26:15.549166] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bdf0 00:13:03.278 [2024-07-23 08:26:15.549360] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000035180 00:13:03.278 [2024-07-23 08:26:15.549373] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000035180 00:13:03.278 [2024-07-23 08:26:15.549683] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:03.278 BaseBdev2 00:13:03.278 08:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:03.278 08:26:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:03.278 08:26:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:03.278 08:26:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:03.279 08:26:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:03.279 08:26:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:13:03.279 08:26:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:03.279 08:26:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:03.537 [ 00:13:03.537 { 00:13:03.537 "name": "BaseBdev2", 00:13:03.537 "aliases": [ 00:13:03.537 "6eed1e80-9a98-4e5d-b86f-ccd2cf81532f" 00:13:03.537 ], 00:13:03.537 "product_name": "Malloc disk", 00:13:03.537 "block_size": 512, 00:13:03.537 "num_blocks": 65536, 00:13:03.537 "uuid": "6eed1e80-9a98-4e5d-b86f-ccd2cf81532f", 00:13:03.537 "assigned_rate_limits": { 00:13:03.537 "rw_ios_per_sec": 0, 00:13:03.537 "rw_mbytes_per_sec": 0, 00:13:03.537 "r_mbytes_per_sec": 0, 00:13:03.537 "w_mbytes_per_sec": 0 00:13:03.537 }, 00:13:03.537 "claimed": true, 00:13:03.537 "claim_type": "exclusive_write", 00:13:03.537 "zoned": false, 00:13:03.537 "supported_io_types": { 00:13:03.537 "read": true, 00:13:03.537 "write": true, 00:13:03.537 "unmap": true, 00:13:03.537 "flush": true, 00:13:03.537 "reset": true, 00:13:03.537 "nvme_admin": false, 00:13:03.537 "nvme_io": false, 00:13:03.537 "nvme_io_md": false, 00:13:03.537 "write_zeroes": true, 00:13:03.537 "zcopy": true, 00:13:03.537 "get_zone_info": false, 00:13:03.537 "zone_management": false, 00:13:03.537 "zone_append": false, 00:13:03.537 "compare": false, 00:13:03.537 "compare_and_write": false, 00:13:03.537 "abort": true, 00:13:03.537 "seek_hole": false, 00:13:03.537 "seek_data": false, 00:13:03.537 "copy": true, 00:13:03.537 "nvme_iov_md": false 00:13:03.537 }, 00:13:03.537 "memory_domains": [ 00:13:03.537 { 00:13:03.537 "dma_device_id": "system", 00:13:03.537 "dma_device_type": 1 00:13:03.537 }, 00:13:03.537 { 00:13:03.537 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:03.537 
"dma_device_type": 2 00:13:03.537 } 00:13:03.537 ], 00:13:03.537 "driver_specific": {} 00:13:03.537 } 00:13:03.537 ] 00:13:03.537 08:26:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:03.537 08:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:03.537 08:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:03.538 08:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:13:03.538 08:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:03.538 08:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:03.538 08:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:03.538 08:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:03.538 08:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:03.538 08:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:03.538 08:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:03.538 08:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:03.538 08:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:03.538 08:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:03.538 08:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:03.797 08:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:13:03.797 "name": "Existed_Raid", 00:13:03.797 "uuid": "9ceb89f9-b240-49ec-9305-7f139f2b66e3", 00:13:03.797 "strip_size_kb": 0, 00:13:03.797 "state": "online", 00:13:03.797 "raid_level": "raid1", 00:13:03.797 "superblock": false, 00:13:03.797 "num_base_bdevs": 2, 00:13:03.797 "num_base_bdevs_discovered": 2, 00:13:03.797 "num_base_bdevs_operational": 2, 00:13:03.797 "base_bdevs_list": [ 00:13:03.797 { 00:13:03.797 "name": "BaseBdev1", 00:13:03.797 "uuid": "a5c395a2-f3b3-4360-b784-f7f4324ad671", 00:13:03.797 "is_configured": true, 00:13:03.797 "data_offset": 0, 00:13:03.797 "data_size": 65536 00:13:03.797 }, 00:13:03.797 { 00:13:03.797 "name": "BaseBdev2", 00:13:03.797 "uuid": "6eed1e80-9a98-4e5d-b86f-ccd2cf81532f", 00:13:03.797 "is_configured": true, 00:13:03.797 "data_offset": 0, 00:13:03.797 "data_size": 65536 00:13:03.797 } 00:13:03.797 ] 00:13:03.797 }' 00:13:03.797 08:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:03.797 08:26:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:04.364 08:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:04.364 08:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:04.364 08:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:04.364 08:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:04.364 08:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:04.364 08:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:04.364 08:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:04.364 08:26:16 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:04.364 [2024-07-23 08:26:16.736247] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:04.364 08:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:04.364 "name": "Existed_Raid", 00:13:04.364 "aliases": [ 00:13:04.364 "9ceb89f9-b240-49ec-9305-7f139f2b66e3" 00:13:04.364 ], 00:13:04.364 "product_name": "Raid Volume", 00:13:04.364 "block_size": 512, 00:13:04.364 "num_blocks": 65536, 00:13:04.364 "uuid": "9ceb89f9-b240-49ec-9305-7f139f2b66e3", 00:13:04.364 "assigned_rate_limits": { 00:13:04.364 "rw_ios_per_sec": 0, 00:13:04.364 "rw_mbytes_per_sec": 0, 00:13:04.364 "r_mbytes_per_sec": 0, 00:13:04.364 "w_mbytes_per_sec": 0 00:13:04.364 }, 00:13:04.364 "claimed": false, 00:13:04.364 "zoned": false, 00:13:04.364 "supported_io_types": { 00:13:04.364 "read": true, 00:13:04.364 "write": true, 00:13:04.364 "unmap": false, 00:13:04.364 "flush": false, 00:13:04.364 "reset": true, 00:13:04.364 "nvme_admin": false, 00:13:04.364 "nvme_io": false, 00:13:04.364 "nvme_io_md": false, 00:13:04.364 "write_zeroes": true, 00:13:04.364 "zcopy": false, 00:13:04.364 "get_zone_info": false, 00:13:04.364 "zone_management": false, 00:13:04.364 "zone_append": false, 00:13:04.364 "compare": false, 00:13:04.364 "compare_and_write": false, 00:13:04.364 "abort": false, 00:13:04.365 "seek_hole": false, 00:13:04.365 "seek_data": false, 00:13:04.365 "copy": false, 00:13:04.365 "nvme_iov_md": false 00:13:04.365 }, 00:13:04.365 "memory_domains": [ 00:13:04.365 { 00:13:04.365 "dma_device_id": "system", 00:13:04.365 "dma_device_type": 1 00:13:04.365 }, 00:13:04.365 { 00:13:04.365 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:04.365 "dma_device_type": 2 00:13:04.365 }, 00:13:04.365 { 00:13:04.365 "dma_device_id": "system", 00:13:04.365 "dma_device_type": 1 00:13:04.365 }, 00:13:04.365 { 00:13:04.365 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:13:04.365 "dma_device_type": 2 00:13:04.365 } 00:13:04.365 ], 00:13:04.365 "driver_specific": { 00:13:04.365 "raid": { 00:13:04.365 "uuid": "9ceb89f9-b240-49ec-9305-7f139f2b66e3", 00:13:04.365 "strip_size_kb": 0, 00:13:04.365 "state": "online", 00:13:04.365 "raid_level": "raid1", 00:13:04.365 "superblock": false, 00:13:04.365 "num_base_bdevs": 2, 00:13:04.365 "num_base_bdevs_discovered": 2, 00:13:04.365 "num_base_bdevs_operational": 2, 00:13:04.365 "base_bdevs_list": [ 00:13:04.365 { 00:13:04.365 "name": "BaseBdev1", 00:13:04.365 "uuid": "a5c395a2-f3b3-4360-b784-f7f4324ad671", 00:13:04.365 "is_configured": true, 00:13:04.365 "data_offset": 0, 00:13:04.365 "data_size": 65536 00:13:04.365 }, 00:13:04.365 { 00:13:04.365 "name": "BaseBdev2", 00:13:04.365 "uuid": "6eed1e80-9a98-4e5d-b86f-ccd2cf81532f", 00:13:04.365 "is_configured": true, 00:13:04.365 "data_offset": 0, 00:13:04.365 "data_size": 65536 00:13:04.365 } 00:13:04.365 ] 00:13:04.365 } 00:13:04.365 } 00:13:04.365 }' 00:13:04.365 08:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:04.365 08:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:04.365 BaseBdev2' 00:13:04.365 08:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:04.365 08:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:04.365 08:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:04.624 08:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:04.624 "name": "BaseBdev1", 00:13:04.624 "aliases": [ 00:13:04.624 "a5c395a2-f3b3-4360-b784-f7f4324ad671" 00:13:04.624 ], 00:13:04.624 "product_name": "Malloc disk", 
00:13:04.624 "block_size": 512, 00:13:04.624 "num_blocks": 65536, 00:13:04.624 "uuid": "a5c395a2-f3b3-4360-b784-f7f4324ad671", 00:13:04.624 "assigned_rate_limits": { 00:13:04.624 "rw_ios_per_sec": 0, 00:13:04.624 "rw_mbytes_per_sec": 0, 00:13:04.624 "r_mbytes_per_sec": 0, 00:13:04.624 "w_mbytes_per_sec": 0 00:13:04.624 }, 00:13:04.624 "claimed": true, 00:13:04.624 "claim_type": "exclusive_write", 00:13:04.624 "zoned": false, 00:13:04.624 "supported_io_types": { 00:13:04.624 "read": true, 00:13:04.624 "write": true, 00:13:04.624 "unmap": true, 00:13:04.624 "flush": true, 00:13:04.624 "reset": true, 00:13:04.624 "nvme_admin": false, 00:13:04.624 "nvme_io": false, 00:13:04.624 "nvme_io_md": false, 00:13:04.624 "write_zeroes": true, 00:13:04.624 "zcopy": true, 00:13:04.624 "get_zone_info": false, 00:13:04.624 "zone_management": false, 00:13:04.624 "zone_append": false, 00:13:04.624 "compare": false, 00:13:04.624 "compare_and_write": false, 00:13:04.624 "abort": true, 00:13:04.624 "seek_hole": false, 00:13:04.624 "seek_data": false, 00:13:04.624 "copy": true, 00:13:04.624 "nvme_iov_md": false 00:13:04.624 }, 00:13:04.624 "memory_domains": [ 00:13:04.624 { 00:13:04.624 "dma_device_id": "system", 00:13:04.624 "dma_device_type": 1 00:13:04.624 }, 00:13:04.624 { 00:13:04.624 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:04.624 "dma_device_type": 2 00:13:04.624 } 00:13:04.624 ], 00:13:04.624 "driver_specific": {} 00:13:04.624 }' 00:13:04.624 08:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:04.624 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:04.624 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:04.624 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:04.624 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:04.624 08:26:17 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:04.624 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:04.883 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:04.883 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:04.883 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:04.883 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:04.883 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:04.883 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:04.883 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:04.883 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:05.141 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:05.141 "name": "BaseBdev2", 00:13:05.141 "aliases": [ 00:13:05.141 "6eed1e80-9a98-4e5d-b86f-ccd2cf81532f" 00:13:05.141 ], 00:13:05.141 "product_name": "Malloc disk", 00:13:05.141 "block_size": 512, 00:13:05.141 "num_blocks": 65536, 00:13:05.141 "uuid": "6eed1e80-9a98-4e5d-b86f-ccd2cf81532f", 00:13:05.141 "assigned_rate_limits": { 00:13:05.141 "rw_ios_per_sec": 0, 00:13:05.141 "rw_mbytes_per_sec": 0, 00:13:05.141 "r_mbytes_per_sec": 0, 00:13:05.141 "w_mbytes_per_sec": 0 00:13:05.141 }, 00:13:05.141 "claimed": true, 00:13:05.141 "claim_type": "exclusive_write", 00:13:05.141 "zoned": false, 00:13:05.141 "supported_io_types": { 00:13:05.141 "read": true, 00:13:05.141 "write": true, 00:13:05.141 "unmap": true, 00:13:05.141 "flush": true, 00:13:05.141 "reset": 
true, 00:13:05.141 "nvme_admin": false, 00:13:05.141 "nvme_io": false, 00:13:05.141 "nvme_io_md": false, 00:13:05.141 "write_zeroes": true, 00:13:05.141 "zcopy": true, 00:13:05.141 "get_zone_info": false, 00:13:05.141 "zone_management": false, 00:13:05.141 "zone_append": false, 00:13:05.141 "compare": false, 00:13:05.141 "compare_and_write": false, 00:13:05.141 "abort": true, 00:13:05.141 "seek_hole": false, 00:13:05.141 "seek_data": false, 00:13:05.141 "copy": true, 00:13:05.141 "nvme_iov_md": false 00:13:05.141 }, 00:13:05.141 "memory_domains": [ 00:13:05.141 { 00:13:05.141 "dma_device_id": "system", 00:13:05.141 "dma_device_type": 1 00:13:05.141 }, 00:13:05.141 { 00:13:05.141 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:05.141 "dma_device_type": 2 00:13:05.141 } 00:13:05.141 ], 00:13:05.141 "driver_specific": {} 00:13:05.141 }' 00:13:05.141 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:05.141 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:05.141 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:05.141 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:05.141 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:05.141 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:05.141 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:05.141 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:05.401 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:05.401 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:05.401 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:05.401 08:26:17 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:05.401 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:05.401 [2024-07-23 08:26:17.911166] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:05.659 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:05.659 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:13:05.659 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:05.659 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:05.659 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:13:05.659 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:13:05.659 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:05.659 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:05.659 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:05.659 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:05.659 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:05.659 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:05.659 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:05.660 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:05.660 08:26:17 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:05.660 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:05.660 08:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:05.660 08:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:05.660 "name": "Existed_Raid", 00:13:05.660 "uuid": "9ceb89f9-b240-49ec-9305-7f139f2b66e3", 00:13:05.660 "strip_size_kb": 0, 00:13:05.660 "state": "online", 00:13:05.660 "raid_level": "raid1", 00:13:05.660 "superblock": false, 00:13:05.660 "num_base_bdevs": 2, 00:13:05.660 "num_base_bdevs_discovered": 1, 00:13:05.660 "num_base_bdevs_operational": 1, 00:13:05.660 "base_bdevs_list": [ 00:13:05.660 { 00:13:05.660 "name": null, 00:13:05.660 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:05.660 "is_configured": false, 00:13:05.660 "data_offset": 0, 00:13:05.660 "data_size": 65536 00:13:05.660 }, 00:13:05.660 { 00:13:05.660 "name": "BaseBdev2", 00:13:05.660 "uuid": "6eed1e80-9a98-4e5d-b86f-ccd2cf81532f", 00:13:05.660 "is_configured": true, 00:13:05.660 "data_offset": 0, 00:13:05.660 "data_size": 65536 00:13:05.660 } 00:13:05.660 ] 00:13:05.660 }' 00:13:05.660 08:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:05.660 08:26:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:06.227 08:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:06.227 08:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:06.227 08:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:13:06.227 08:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:06.486 08:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:06.486 08:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:06.486 08:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:06.486 [2024-07-23 08:26:18.943439] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:06.486 [2024-07-23 08:26:18.943545] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:06.745 [2024-07-23 08:26:19.038494] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:06.745 [2024-07-23 08:26:19.038546] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:06.745 [2024-07-23 08:26:19.038558] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000035180 name Existed_Raid, state offline 00:13:06.745 08:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:06.745 08:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:06.745 08:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:06.745 08:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:06.745 08:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:06.745 08:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:06.745 08:26:19 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:13:06.745 08:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1414731 00:13:06.745 08:26:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1414731 ']' 00:13:06.745 08:26:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1414731 00:13:06.745 08:26:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:13:06.745 08:26:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:06.745 08:26:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1414731 00:13:07.004 08:26:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:07.004 08:26:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:07.004 08:26:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1414731' 00:13:07.004 killing process with pid 1414731 00:13:07.004 08:26:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1414731 00:13:07.004 [2024-07-23 08:26:19.270761] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:07.004 08:26:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1414731 00:13:07.004 [2024-07-23 08:26:19.288550] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:08.383 08:26:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:13:08.383 00:13:08.383 real 0m9.542s 00:13:08.383 user 0m15.968s 00:13:08.383 sys 0m1.412s 00:13:08.383 08:26:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:08.383 08:26:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # 
set +x 00:13:08.383 ************************************ 00:13:08.383 END TEST raid_state_function_test 00:13:08.383 ************************************ 00:13:08.383 08:26:20 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:08.383 08:26:20 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 2 true 00:13:08.383 08:26:20 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:08.383 08:26:20 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:08.383 08:26:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:08.383 ************************************ 00:13:08.383 START TEST raid_state_function_test_sb 00:13:08.383 ************************************ 00:13:08.383 08:26:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:13:08.383 08:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:13:08.383 08:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:13:08.383 08:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:13:08.383 08:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:08.383 08:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:08.383 08:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:08.383 08:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:08.383 08:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:08.383 08:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:08.383 08:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 
00:13:08.383 08:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:08.383 08:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:08.383 08:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:08.383 08:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:08.383 08:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:08.383 08:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:08.383 08:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:08.383 08:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:08.383 08:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:13:08.383 08:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:13:08.383 08:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:13:08.383 08:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:13:08.383 08:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1416726 00:13:08.383 08:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1416726' 00:13:08.384 Process raid pid: 1416726 00:13:08.384 08:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:08.384 08:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1416726 /var/tmp/spdk-raid.sock 00:13:08.384 08:26:20 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1416726 ']' 00:13:08.384 08:26:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:08.384 08:26:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:08.384 08:26:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:08.384 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:08.384 08:26:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:08.384 08:26:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:08.384 [2024-07-23 08:26:20.718189] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:13:08.384 [2024-07-23 08:26:20.718275] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:08.384 [2024-07-23 08:26:20.842960] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:08.643 [2024-07-23 08:26:21.055984] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:08.901 [2024-07-23 08:26:21.353232] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:08.901 [2024-07-23 08:26:21.353262] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:09.159 08:26:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:09.159 08:26:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:13:09.159 08:26:21 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:09.159 [2024-07-23 08:26:21.674582] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:09.159 [2024-07-23 08:26:21.674632] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:09.159 [2024-07-23 08:26:21.674643] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:09.159 [2024-07-23 08:26:21.674652] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:09.419 08:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:09.419 08:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:09.419 08:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:09.419 08:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:09.419 08:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:09.419 08:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:09.419 08:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:09.419 08:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:09.419 08:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:09.419 08:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:09.419 08:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:09.419 08:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:09.419 08:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:09.419 "name": "Existed_Raid", 00:13:09.419 "uuid": "6773954e-0f59-4834-9c9f-aa2042982b87", 00:13:09.419 "strip_size_kb": 0, 00:13:09.419 "state": "configuring", 00:13:09.419 "raid_level": "raid1", 00:13:09.419 "superblock": true, 00:13:09.419 "num_base_bdevs": 2, 00:13:09.419 "num_base_bdevs_discovered": 0, 00:13:09.419 "num_base_bdevs_operational": 2, 00:13:09.419 "base_bdevs_list": [ 00:13:09.419 { 00:13:09.419 "name": "BaseBdev1", 00:13:09.419 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:09.419 "is_configured": false, 00:13:09.419 "data_offset": 0, 00:13:09.419 "data_size": 0 00:13:09.419 }, 00:13:09.419 { 00:13:09.419 "name": "BaseBdev2", 00:13:09.419 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:09.419 "is_configured": false, 00:13:09.419 "data_offset": 0, 00:13:09.419 "data_size": 0 00:13:09.419 } 00:13:09.419 ] 00:13:09.419 }' 00:13:09.419 08:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:09.419 08:26:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:09.987 08:26:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:09.987 [2024-07-23 08:26:22.468548] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:09.987 [2024-07-23 08:26:22.468580] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034580 name Existed_Raid, state configuring 00:13:09.987 08:26:22 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:10.246 [2024-07-23 08:26:22.641023] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:10.246 [2024-07-23 08:26:22.641062] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:10.246 [2024-07-23 08:26:22.641071] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:10.246 [2024-07-23 08:26:22.641080] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:10.246 08:26:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:10.505 [2024-07-23 08:26:22.855865] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:10.505 BaseBdev1 00:13:10.505 08:26:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:10.505 08:26:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:10.505 08:26:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:10.505 08:26:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:10.505 08:26:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:10.505 08:26:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:10.505 08:26:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:10.763 08:26:23 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:10.763 [ 00:13:10.763 { 00:13:10.763 "name": "BaseBdev1", 00:13:10.763 "aliases": [ 00:13:10.764 "ac03cae1-94af-4b09-8e5f-82c377fbb121" 00:13:10.764 ], 00:13:10.764 "product_name": "Malloc disk", 00:13:10.764 "block_size": 512, 00:13:10.764 "num_blocks": 65536, 00:13:10.764 "uuid": "ac03cae1-94af-4b09-8e5f-82c377fbb121", 00:13:10.764 "assigned_rate_limits": { 00:13:10.764 "rw_ios_per_sec": 0, 00:13:10.764 "rw_mbytes_per_sec": 0, 00:13:10.764 "r_mbytes_per_sec": 0, 00:13:10.764 "w_mbytes_per_sec": 0 00:13:10.764 }, 00:13:10.764 "claimed": true, 00:13:10.764 "claim_type": "exclusive_write", 00:13:10.764 "zoned": false, 00:13:10.764 "supported_io_types": { 00:13:10.764 "read": true, 00:13:10.764 "write": true, 00:13:10.764 "unmap": true, 00:13:10.764 "flush": true, 00:13:10.764 "reset": true, 00:13:10.764 "nvme_admin": false, 00:13:10.764 "nvme_io": false, 00:13:10.764 "nvme_io_md": false, 00:13:10.764 "write_zeroes": true, 00:13:10.764 "zcopy": true, 00:13:10.764 "get_zone_info": false, 00:13:10.764 "zone_management": false, 00:13:10.764 "zone_append": false, 00:13:10.764 "compare": false, 00:13:10.764 "compare_and_write": false, 00:13:10.764 "abort": true, 00:13:10.764 "seek_hole": false, 00:13:10.764 "seek_data": false, 00:13:10.764 "copy": true, 00:13:10.764 "nvme_iov_md": false 00:13:10.764 }, 00:13:10.764 "memory_domains": [ 00:13:10.764 { 00:13:10.764 "dma_device_id": "system", 00:13:10.764 "dma_device_type": 1 00:13:10.764 }, 00:13:10.764 { 00:13:10.764 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:10.764 "dma_device_type": 2 00:13:10.764 } 00:13:10.764 ], 00:13:10.764 "driver_specific": {} 00:13:10.764 } 00:13:10.764 ] 00:13:10.764 08:26:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:10.764 08:26:23 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:10.764 08:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:10.764 08:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:10.764 08:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:10.764 08:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:10.764 08:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:10.764 08:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:10.764 08:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:10.764 08:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:10.764 08:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:10.764 08:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:10.764 08:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:11.022 08:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:11.022 "name": "Existed_Raid", 00:13:11.022 "uuid": "d709bdcc-c6df-4fa7-8ba6-f25b7ed15187", 00:13:11.022 "strip_size_kb": 0, 00:13:11.022 "state": "configuring", 00:13:11.022 "raid_level": "raid1", 00:13:11.023 "superblock": true, 00:13:11.023 "num_base_bdevs": 2, 00:13:11.023 "num_base_bdevs_discovered": 1, 00:13:11.023 "num_base_bdevs_operational": 2, 00:13:11.023 
"base_bdevs_list": [ 00:13:11.023 { 00:13:11.023 "name": "BaseBdev1", 00:13:11.023 "uuid": "ac03cae1-94af-4b09-8e5f-82c377fbb121", 00:13:11.023 "is_configured": true, 00:13:11.023 "data_offset": 2048, 00:13:11.023 "data_size": 63488 00:13:11.023 }, 00:13:11.023 { 00:13:11.023 "name": "BaseBdev2", 00:13:11.023 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:11.023 "is_configured": false, 00:13:11.023 "data_offset": 0, 00:13:11.023 "data_size": 0 00:13:11.023 } 00:13:11.023 ] 00:13:11.023 }' 00:13:11.023 08:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:11.023 08:26:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:11.590 08:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:11.590 [2024-07-23 08:26:24.026992] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:11.590 [2024-07-23 08:26:24.027043] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034880 name Existed_Raid, state configuring 00:13:11.590 08:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:11.849 [2024-07-23 08:26:24.203472] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:11.849 [2024-07-23 08:26:24.205052] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:11.849 [2024-07-23 08:26:24.205086] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:11.849 08:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:11.849 08:26:24 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:11.849 08:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:11.849 08:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:11.849 08:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:11.849 08:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:11.849 08:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:11.849 08:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:11.849 08:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:11.849 08:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:11.849 08:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:11.849 08:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:11.849 08:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:11.849 08:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:12.108 08:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:12.108 "name": "Existed_Raid", 00:13:12.108 "uuid": "45572762-38b8-4c7e-aaa0-a9a0fd0428b0", 00:13:12.108 "strip_size_kb": 0, 00:13:12.108 "state": "configuring", 00:13:12.108 "raid_level": "raid1", 00:13:12.108 "superblock": true, 00:13:12.108 "num_base_bdevs": 2, 00:13:12.108 "num_base_bdevs_discovered": 
1, 00:13:12.108 "num_base_bdevs_operational": 2, 00:13:12.108 "base_bdevs_list": [ 00:13:12.108 { 00:13:12.108 "name": "BaseBdev1", 00:13:12.108 "uuid": "ac03cae1-94af-4b09-8e5f-82c377fbb121", 00:13:12.108 "is_configured": true, 00:13:12.108 "data_offset": 2048, 00:13:12.108 "data_size": 63488 00:13:12.108 }, 00:13:12.108 { 00:13:12.108 "name": "BaseBdev2", 00:13:12.108 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:12.109 "is_configured": false, 00:13:12.109 "data_offset": 0, 00:13:12.109 "data_size": 0 00:13:12.109 } 00:13:12.109 ] 00:13:12.109 }' 00:13:12.109 08:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:12.109 08:26:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:12.368 08:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:12.627 [2024-07-23 08:26:25.053176] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:12.627 [2024-07-23 08:26:25.053404] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000035180 00:13:12.627 [2024-07-23 08:26:25.053424] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:12.627 [2024-07-23 08:26:25.053672] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bdf0 00:13:12.627 [2024-07-23 08:26:25.053850] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000035180 00:13:12.627 [2024-07-23 08:26:25.053862] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000035180 00:13:12.627 [2024-07-23 08:26:25.054001] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:12.627 BaseBdev2 00:13:12.627 08:26:25 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:12.627 08:26:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:12.627 08:26:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:12.627 08:26:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:12.627 08:26:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:12.627 08:26:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:12.627 08:26:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:12.886 08:26:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:12.886 [ 00:13:12.886 { 00:13:12.886 "name": "BaseBdev2", 00:13:12.886 "aliases": [ 00:13:12.886 "704fdaa4-1d5c-4ca5-9524-10d6f943f2c7" 00:13:12.886 ], 00:13:12.886 "product_name": "Malloc disk", 00:13:12.886 "block_size": 512, 00:13:12.886 "num_blocks": 65536, 00:13:12.886 "uuid": "704fdaa4-1d5c-4ca5-9524-10d6f943f2c7", 00:13:12.886 "assigned_rate_limits": { 00:13:12.886 "rw_ios_per_sec": 0, 00:13:12.886 "rw_mbytes_per_sec": 0, 00:13:12.886 "r_mbytes_per_sec": 0, 00:13:12.886 "w_mbytes_per_sec": 0 00:13:12.886 }, 00:13:12.887 "claimed": true, 00:13:12.887 "claim_type": "exclusive_write", 00:13:12.887 "zoned": false, 00:13:12.887 "supported_io_types": { 00:13:12.887 "read": true, 00:13:12.887 "write": true, 00:13:12.887 "unmap": true, 00:13:12.887 "flush": true, 00:13:12.887 "reset": true, 00:13:12.887 "nvme_admin": false, 00:13:12.887 "nvme_io": false, 00:13:12.887 "nvme_io_md": false, 00:13:12.887 "write_zeroes": true, 
00:13:12.887 "zcopy": true, 00:13:12.887 "get_zone_info": false, 00:13:12.887 "zone_management": false, 00:13:12.887 "zone_append": false, 00:13:12.887 "compare": false, 00:13:12.887 "compare_and_write": false, 00:13:12.887 "abort": true, 00:13:12.887 "seek_hole": false, 00:13:12.887 "seek_data": false, 00:13:12.887 "copy": true, 00:13:12.887 "nvme_iov_md": false 00:13:12.887 }, 00:13:12.887 "memory_domains": [ 00:13:12.887 { 00:13:12.887 "dma_device_id": "system", 00:13:12.887 "dma_device_type": 1 00:13:12.887 }, 00:13:12.887 { 00:13:12.887 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:12.887 "dma_device_type": 2 00:13:12.887 } 00:13:12.887 ], 00:13:12.887 "driver_specific": {} 00:13:12.887 } 00:13:12.887 ] 00:13:12.887 08:26:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:12.887 08:26:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:12.887 08:26:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:12.887 08:26:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:13:12.887 08:26:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:12.887 08:26:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:12.887 08:26:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:12.887 08:26:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:12.887 08:26:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:12.887 08:26:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:12.887 08:26:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:13:12.887 08:26:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:12.887 08:26:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:12.887 08:26:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:12.887 08:26:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:13.146 08:26:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:13.146 "name": "Existed_Raid", 00:13:13.146 "uuid": "45572762-38b8-4c7e-aaa0-a9a0fd0428b0", 00:13:13.146 "strip_size_kb": 0, 00:13:13.146 "state": "online", 00:13:13.146 "raid_level": "raid1", 00:13:13.146 "superblock": true, 00:13:13.146 "num_base_bdevs": 2, 00:13:13.146 "num_base_bdevs_discovered": 2, 00:13:13.146 "num_base_bdevs_operational": 2, 00:13:13.146 "base_bdevs_list": [ 00:13:13.146 { 00:13:13.146 "name": "BaseBdev1", 00:13:13.146 "uuid": "ac03cae1-94af-4b09-8e5f-82c377fbb121", 00:13:13.146 "is_configured": true, 00:13:13.146 "data_offset": 2048, 00:13:13.146 "data_size": 63488 00:13:13.146 }, 00:13:13.146 { 00:13:13.146 "name": "BaseBdev2", 00:13:13.146 "uuid": "704fdaa4-1d5c-4ca5-9524-10d6f943f2c7", 00:13:13.146 "is_configured": true, 00:13:13.146 "data_offset": 2048, 00:13:13.146 "data_size": 63488 00:13:13.146 } 00:13:13.146 ] 00:13:13.146 }' 00:13:13.146 08:26:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:13.146 08:26:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:13.713 08:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:13.713 08:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local 
raid_bdev_name=Existed_Raid 00:13:13.713 08:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:13.713 08:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:13.713 08:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:13.713 08:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:13:13.713 08:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:13.713 08:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:13.713 [2024-07-23 08:26:26.228593] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:13.972 08:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:13.972 "name": "Existed_Raid", 00:13:13.972 "aliases": [ 00:13:13.972 "45572762-38b8-4c7e-aaa0-a9a0fd0428b0" 00:13:13.972 ], 00:13:13.972 "product_name": "Raid Volume", 00:13:13.972 "block_size": 512, 00:13:13.972 "num_blocks": 63488, 00:13:13.972 "uuid": "45572762-38b8-4c7e-aaa0-a9a0fd0428b0", 00:13:13.972 "assigned_rate_limits": { 00:13:13.972 "rw_ios_per_sec": 0, 00:13:13.972 "rw_mbytes_per_sec": 0, 00:13:13.972 "r_mbytes_per_sec": 0, 00:13:13.972 "w_mbytes_per_sec": 0 00:13:13.972 }, 00:13:13.972 "claimed": false, 00:13:13.972 "zoned": false, 00:13:13.972 "supported_io_types": { 00:13:13.972 "read": true, 00:13:13.972 "write": true, 00:13:13.972 "unmap": false, 00:13:13.972 "flush": false, 00:13:13.972 "reset": true, 00:13:13.972 "nvme_admin": false, 00:13:13.972 "nvme_io": false, 00:13:13.972 "nvme_io_md": false, 00:13:13.972 "write_zeroes": true, 00:13:13.972 "zcopy": false, 00:13:13.972 "get_zone_info": false, 00:13:13.972 "zone_management": false, 00:13:13.972 
"zone_append": false, 00:13:13.972 "compare": false, 00:13:13.972 "compare_and_write": false, 00:13:13.972 "abort": false, 00:13:13.972 "seek_hole": false, 00:13:13.972 "seek_data": false, 00:13:13.972 "copy": false, 00:13:13.972 "nvme_iov_md": false 00:13:13.972 }, 00:13:13.972 "memory_domains": [ 00:13:13.972 { 00:13:13.972 "dma_device_id": "system", 00:13:13.972 "dma_device_type": 1 00:13:13.972 }, 00:13:13.972 { 00:13:13.972 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:13.972 "dma_device_type": 2 00:13:13.972 }, 00:13:13.972 { 00:13:13.972 "dma_device_id": "system", 00:13:13.972 "dma_device_type": 1 00:13:13.972 }, 00:13:13.972 { 00:13:13.972 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:13.972 "dma_device_type": 2 00:13:13.972 } 00:13:13.972 ], 00:13:13.972 "driver_specific": { 00:13:13.972 "raid": { 00:13:13.972 "uuid": "45572762-38b8-4c7e-aaa0-a9a0fd0428b0", 00:13:13.972 "strip_size_kb": 0, 00:13:13.972 "state": "online", 00:13:13.972 "raid_level": "raid1", 00:13:13.972 "superblock": true, 00:13:13.972 "num_base_bdevs": 2, 00:13:13.972 "num_base_bdevs_discovered": 2, 00:13:13.973 "num_base_bdevs_operational": 2, 00:13:13.973 "base_bdevs_list": [ 00:13:13.973 { 00:13:13.973 "name": "BaseBdev1", 00:13:13.973 "uuid": "ac03cae1-94af-4b09-8e5f-82c377fbb121", 00:13:13.973 "is_configured": true, 00:13:13.973 "data_offset": 2048, 00:13:13.973 "data_size": 63488 00:13:13.973 }, 00:13:13.973 { 00:13:13.973 "name": "BaseBdev2", 00:13:13.973 "uuid": "704fdaa4-1d5c-4ca5-9524-10d6f943f2c7", 00:13:13.973 "is_configured": true, 00:13:13.973 "data_offset": 2048, 00:13:13.973 "data_size": 63488 00:13:13.973 } 00:13:13.973 ] 00:13:13.973 } 00:13:13.973 } 00:13:13.973 }' 00:13:13.973 08:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:13.973 08:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:13.973 
BaseBdev2' 00:13:13.973 08:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:13.973 08:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:13.973 08:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:13.973 08:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:13.973 "name": "BaseBdev1", 00:13:13.973 "aliases": [ 00:13:13.973 "ac03cae1-94af-4b09-8e5f-82c377fbb121" 00:13:13.973 ], 00:13:13.973 "product_name": "Malloc disk", 00:13:13.973 "block_size": 512, 00:13:13.973 "num_blocks": 65536, 00:13:13.973 "uuid": "ac03cae1-94af-4b09-8e5f-82c377fbb121", 00:13:13.973 "assigned_rate_limits": { 00:13:13.973 "rw_ios_per_sec": 0, 00:13:13.973 "rw_mbytes_per_sec": 0, 00:13:13.973 "r_mbytes_per_sec": 0, 00:13:13.973 "w_mbytes_per_sec": 0 00:13:13.973 }, 00:13:13.973 "claimed": true, 00:13:13.973 "claim_type": "exclusive_write", 00:13:13.973 "zoned": false, 00:13:13.973 "supported_io_types": { 00:13:13.973 "read": true, 00:13:13.973 "write": true, 00:13:13.973 "unmap": true, 00:13:13.973 "flush": true, 00:13:13.973 "reset": true, 00:13:13.973 "nvme_admin": false, 00:13:13.973 "nvme_io": false, 00:13:13.973 "nvme_io_md": false, 00:13:13.973 "write_zeroes": true, 00:13:13.973 "zcopy": true, 00:13:13.973 "get_zone_info": false, 00:13:13.973 "zone_management": false, 00:13:13.973 "zone_append": false, 00:13:13.973 "compare": false, 00:13:13.973 "compare_and_write": false, 00:13:13.973 "abort": true, 00:13:13.973 "seek_hole": false, 00:13:13.973 "seek_data": false, 00:13:13.973 "copy": true, 00:13:13.973 "nvme_iov_md": false 00:13:13.973 }, 00:13:13.973 "memory_domains": [ 00:13:13.973 { 00:13:13.973 "dma_device_id": "system", 00:13:13.973 "dma_device_type": 1 00:13:13.973 }, 00:13:13.973 { 
00:13:13.973 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:13.973 "dma_device_type": 2 00:13:13.973 } 00:13:13.973 ], 00:13:13.973 "driver_specific": {} 00:13:13.973 }' 00:13:13.973 08:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:13.973 08:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:14.232 08:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:14.232 08:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:14.232 08:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:14.232 08:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:14.232 08:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:14.232 08:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:14.232 08:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:14.232 08:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:14.232 08:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:14.491 08:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:14.491 08:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:14.491 08:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:14.491 08:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:14.491 08:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:14.491 "name": 
"BaseBdev2", 00:13:14.491 "aliases": [ 00:13:14.491 "704fdaa4-1d5c-4ca5-9524-10d6f943f2c7" 00:13:14.491 ], 00:13:14.491 "product_name": "Malloc disk", 00:13:14.491 "block_size": 512, 00:13:14.491 "num_blocks": 65536, 00:13:14.491 "uuid": "704fdaa4-1d5c-4ca5-9524-10d6f943f2c7", 00:13:14.491 "assigned_rate_limits": { 00:13:14.491 "rw_ios_per_sec": 0, 00:13:14.491 "rw_mbytes_per_sec": 0, 00:13:14.491 "r_mbytes_per_sec": 0, 00:13:14.491 "w_mbytes_per_sec": 0 00:13:14.491 }, 00:13:14.491 "claimed": true, 00:13:14.491 "claim_type": "exclusive_write", 00:13:14.491 "zoned": false, 00:13:14.491 "supported_io_types": { 00:13:14.491 "read": true, 00:13:14.491 "write": true, 00:13:14.491 "unmap": true, 00:13:14.491 "flush": true, 00:13:14.491 "reset": true, 00:13:14.491 "nvme_admin": false, 00:13:14.491 "nvme_io": false, 00:13:14.491 "nvme_io_md": false, 00:13:14.491 "write_zeroes": true, 00:13:14.491 "zcopy": true, 00:13:14.491 "get_zone_info": false, 00:13:14.491 "zone_management": false, 00:13:14.491 "zone_append": false, 00:13:14.491 "compare": false, 00:13:14.491 "compare_and_write": false, 00:13:14.491 "abort": true, 00:13:14.491 "seek_hole": false, 00:13:14.491 "seek_data": false, 00:13:14.491 "copy": true, 00:13:14.491 "nvme_iov_md": false 00:13:14.491 }, 00:13:14.491 "memory_domains": [ 00:13:14.491 { 00:13:14.491 "dma_device_id": "system", 00:13:14.491 "dma_device_type": 1 00:13:14.491 }, 00:13:14.491 { 00:13:14.491 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:14.491 "dma_device_type": 2 00:13:14.491 } 00:13:14.491 ], 00:13:14.491 "driver_specific": {} 00:13:14.491 }' 00:13:14.491 08:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:14.491 08:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:14.491 08:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:14.491 08:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 
-- # jq .md_size 00:13:14.750 08:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:14.750 08:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:14.750 08:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:14.750 08:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:14.750 08:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:14.750 08:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:14.750 08:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:14.750 08:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:14.750 08:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:15.009 [2024-07-23 08:26:27.375416] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:15.009 08:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:15.009 08:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:13:15.009 08:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:15.009 08:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:13:15.009 08:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:13:15.009 08:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:13:15.009 08:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:15.009 
08:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:15.009 08:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:15.009 08:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:15.009 08:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:15.009 08:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:15.009 08:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:15.009 08:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:15.009 08:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:15.009 08:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:15.009 08:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:15.267 08:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:15.268 "name": "Existed_Raid", 00:13:15.268 "uuid": "45572762-38b8-4c7e-aaa0-a9a0fd0428b0", 00:13:15.268 "strip_size_kb": 0, 00:13:15.268 "state": "online", 00:13:15.268 "raid_level": "raid1", 00:13:15.268 "superblock": true, 00:13:15.268 "num_base_bdevs": 2, 00:13:15.268 "num_base_bdevs_discovered": 1, 00:13:15.268 "num_base_bdevs_operational": 1, 00:13:15.268 "base_bdevs_list": [ 00:13:15.268 { 00:13:15.268 "name": null, 00:13:15.268 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:15.268 "is_configured": false, 00:13:15.268 "data_offset": 2048, 00:13:15.268 "data_size": 63488 00:13:15.268 }, 00:13:15.268 { 00:13:15.268 "name": 
"BaseBdev2", 00:13:15.268 "uuid": "704fdaa4-1d5c-4ca5-9524-10d6f943f2c7", 00:13:15.268 "is_configured": true, 00:13:15.268 "data_offset": 2048, 00:13:15.268 "data_size": 63488 00:13:15.268 } 00:13:15.268 ] 00:13:15.268 }' 00:13:15.268 08:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:15.268 08:26:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:15.835 08:26:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:15.835 08:26:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:15.835 08:26:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:15.835 08:26:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:15.835 08:26:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:15.835 08:26:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:15.835 08:26:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:16.094 [2024-07-23 08:26:28.395398] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:16.094 [2024-07-23 08:26:28.395500] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:16.094 [2024-07-23 08:26:28.488498] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:16.094 [2024-07-23 08:26:28.488546] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:16.094 [2024-07-23 08:26:28.488559] bdev_raid.c: 
378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000035180 name Existed_Raid, state offline 00:13:16.094 08:26:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:16.094 08:26:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:16.094 08:26:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:16.094 08:26:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:16.353 08:26:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:16.353 08:26:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:16.353 08:26:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:13:16.353 08:26:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1416726 00:13:16.353 08:26:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1416726 ']' 00:13:16.353 08:26:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1416726 00:13:16.353 08:26:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:13:16.353 08:26:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:16.353 08:26:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1416726 00:13:16.353 08:26:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:16.353 08:26:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:16.353 08:26:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 
'killing process with pid 1416726' 00:13:16.353 killing process with pid 1416726 00:13:16.353 08:26:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1416726 00:13:16.353 [2024-07-23 08:26:28.718400] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:16.353 08:26:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1416726 00:13:16.353 [2024-07-23 08:26:28.736458] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:17.731 08:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:13:17.731 00:13:17.731 real 0m9.369s 00:13:17.731 user 0m15.595s 00:13:17.731 sys 0m1.446s 00:13:17.731 08:26:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:17.731 08:26:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:17.731 ************************************ 00:13:17.731 END TEST raid_state_function_test_sb 00:13:17.731 ************************************ 00:13:17.731 08:26:30 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:17.731 08:26:30 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 2 00:13:17.731 08:26:30 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:13:17.731 08:26:30 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:17.731 08:26:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:17.731 ************************************ 00:13:17.731 START TEST raid_superblock_test 00:13:17.731 ************************************ 00:13:17.731 08:26:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:13:17.731 08:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:13:17.731 08:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local 
num_base_bdevs=2 00:13:17.731 08:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:13:17.731 08:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:13:17.731 08:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:13:17.731 08:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:13:17.731 08:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:13:17.731 08:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:13:17.731 08:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:13:17.731 08:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:13:17.731 08:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:13:17.731 08:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:13:17.731 08:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:13:17.731 08:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:13:17.731 08:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:13:17.731 08:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1418709 00:13:17.731 08:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1418709 /var/tmp/spdk-raid.sock 00:13:17.731 08:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:13:17.731 08:26:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1418709 ']' 00:13:17.731 08:26:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 
-- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:17.731 08:26:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:17.731 08:26:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:17.731 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:17.731 08:26:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:17.731 08:26:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:17.731 [2024-07-23 08:26:30.132330] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:13:17.731 [2024-07-23 08:26:30.132416] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1418709 ] 00:13:17.990 [2024-07-23 08:26:30.256494] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:17.990 [2024-07-23 08:26:30.464428] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:18.249 [2024-07-23 08:26:30.743808] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:18.249 [2024-07-23 08:26:30.743835] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:18.509 08:26:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:18.509 08:26:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:13:18.509 08:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:13:18.509 08:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:18.509 08:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 
-- # local bdev_malloc=malloc1 00:13:18.509 08:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:13:18.509 08:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:13:18.509 08:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:18.509 08:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:18.509 08:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:18.509 08:26:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:13:18.768 malloc1 00:13:18.768 08:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:18.768 [2024-07-23 08:26:31.283402] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:18.768 [2024-07-23 08:26:31.283454] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:18.768 [2024-07-23 08:26:31.283476] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034880 00:13:18.768 [2024-07-23 08:26:31.283488] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:18.768 [2024-07-23 08:26:31.285472] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:18.768 [2024-07-23 08:26:31.285499] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:19.027 pt1 00:13:19.027 08:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:19.027 08:26:31 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:19.027 08:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:13:19.027 08:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:13:19.027 08:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:13:19.027 08:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:19.027 08:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:19.028 08:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:19.028 08:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:13:19.028 malloc2 00:13:19.028 08:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:19.286 [2024-07-23 08:26:31.646176] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:19.286 [2024-07-23 08:26:31.646225] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:19.286 [2024-07-23 08:26:31.646260] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035480 00:13:19.286 [2024-07-23 08:26:31.646272] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:19.287 [2024-07-23 08:26:31.648230] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:19.287 [2024-07-23 08:26:31.648257] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:19.287 pt2 00:13:19.287 08:26:31 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:19.287 08:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:19.287 08:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:13:19.545 [2024-07-23 08:26:31.814646] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:19.545 [2024-07-23 08:26:31.816296] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:19.545 [2024-07-23 08:26:31.816481] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000035a80 00:13:19.545 [2024-07-23 08:26:31.816494] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:19.545 [2024-07-23 08:26:31.816761] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bdf0 00:13:19.545 [2024-07-23 08:26:31.816969] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000035a80 00:13:19.545 [2024-07-23 08:26:31.816985] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000035a80 00:13:19.545 [2024-07-23 08:26:31.817142] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:19.546 08:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:19.546 08:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:19.546 08:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:19.546 08:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:19.546 08:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:13:19.546 08:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:19.546 08:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:19.546 08:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:19.546 08:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:19.546 08:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:19.546 08:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:19.546 08:26:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:19.546 08:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:19.546 "name": "raid_bdev1", 00:13:19.546 "uuid": "1b4be173-80e4-47f3-8d18-6f873f9b1ee5", 00:13:19.546 "strip_size_kb": 0, 00:13:19.546 "state": "online", 00:13:19.546 "raid_level": "raid1", 00:13:19.546 "superblock": true, 00:13:19.546 "num_base_bdevs": 2, 00:13:19.546 "num_base_bdevs_discovered": 2, 00:13:19.546 "num_base_bdevs_operational": 2, 00:13:19.546 "base_bdevs_list": [ 00:13:19.546 { 00:13:19.546 "name": "pt1", 00:13:19.546 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:19.546 "is_configured": true, 00:13:19.546 "data_offset": 2048, 00:13:19.546 "data_size": 63488 00:13:19.546 }, 00:13:19.546 { 00:13:19.546 "name": "pt2", 00:13:19.546 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:19.546 "is_configured": true, 00:13:19.546 "data_offset": 2048, 00:13:19.546 "data_size": 63488 00:13:19.546 } 00:13:19.546 ] 00:13:19.546 }' 00:13:19.546 08:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:19.546 08:26:32 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@10 -- # set +x 00:13:20.114 08:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:13:20.114 08:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:20.114 08:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:20.114 08:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:20.114 08:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:20.114 08:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:20.114 08:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:20.114 08:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:20.373 [2024-07-23 08:26:32.649047] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:20.373 08:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:20.373 "name": "raid_bdev1", 00:13:20.373 "aliases": [ 00:13:20.373 "1b4be173-80e4-47f3-8d18-6f873f9b1ee5" 00:13:20.373 ], 00:13:20.373 "product_name": "Raid Volume", 00:13:20.374 "block_size": 512, 00:13:20.374 "num_blocks": 63488, 00:13:20.374 "uuid": "1b4be173-80e4-47f3-8d18-6f873f9b1ee5", 00:13:20.374 "assigned_rate_limits": { 00:13:20.374 "rw_ios_per_sec": 0, 00:13:20.374 "rw_mbytes_per_sec": 0, 00:13:20.374 "r_mbytes_per_sec": 0, 00:13:20.374 "w_mbytes_per_sec": 0 00:13:20.374 }, 00:13:20.374 "claimed": false, 00:13:20.374 "zoned": false, 00:13:20.374 "supported_io_types": { 00:13:20.374 "read": true, 00:13:20.374 "write": true, 00:13:20.374 "unmap": false, 00:13:20.374 "flush": false, 00:13:20.374 "reset": true, 00:13:20.374 "nvme_admin": false, 00:13:20.374 "nvme_io": 
false, 00:13:20.374 "nvme_io_md": false, 00:13:20.374 "write_zeroes": true, 00:13:20.374 "zcopy": false, 00:13:20.374 "get_zone_info": false, 00:13:20.374 "zone_management": false, 00:13:20.374 "zone_append": false, 00:13:20.374 "compare": false, 00:13:20.374 "compare_and_write": false, 00:13:20.374 "abort": false, 00:13:20.374 "seek_hole": false, 00:13:20.374 "seek_data": false, 00:13:20.374 "copy": false, 00:13:20.374 "nvme_iov_md": false 00:13:20.374 }, 00:13:20.374 "memory_domains": [ 00:13:20.374 { 00:13:20.374 "dma_device_id": "system", 00:13:20.374 "dma_device_type": 1 00:13:20.374 }, 00:13:20.374 { 00:13:20.374 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:20.374 "dma_device_type": 2 00:13:20.374 }, 00:13:20.374 { 00:13:20.374 "dma_device_id": "system", 00:13:20.374 "dma_device_type": 1 00:13:20.374 }, 00:13:20.374 { 00:13:20.374 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:20.374 "dma_device_type": 2 00:13:20.374 } 00:13:20.374 ], 00:13:20.374 "driver_specific": { 00:13:20.374 "raid": { 00:13:20.374 "uuid": "1b4be173-80e4-47f3-8d18-6f873f9b1ee5", 00:13:20.374 "strip_size_kb": 0, 00:13:20.374 "state": "online", 00:13:20.374 "raid_level": "raid1", 00:13:20.374 "superblock": true, 00:13:20.374 "num_base_bdevs": 2, 00:13:20.374 "num_base_bdevs_discovered": 2, 00:13:20.374 "num_base_bdevs_operational": 2, 00:13:20.374 "base_bdevs_list": [ 00:13:20.374 { 00:13:20.374 "name": "pt1", 00:13:20.374 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:20.374 "is_configured": true, 00:13:20.374 "data_offset": 2048, 00:13:20.374 "data_size": 63488 00:13:20.374 }, 00:13:20.374 { 00:13:20.374 "name": "pt2", 00:13:20.374 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:20.374 "is_configured": true, 00:13:20.374 "data_offset": 2048, 00:13:20.374 "data_size": 63488 00:13:20.374 } 00:13:20.374 ] 00:13:20.374 } 00:13:20.374 } 00:13:20.374 }' 00:13:20.374 08:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:20.374 08:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:20.374 pt2' 00:13:20.374 08:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:20.374 08:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:20.374 08:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:20.374 08:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:20.374 "name": "pt1", 00:13:20.374 "aliases": [ 00:13:20.374 "00000000-0000-0000-0000-000000000001" 00:13:20.374 ], 00:13:20.374 "product_name": "passthru", 00:13:20.374 "block_size": 512, 00:13:20.374 "num_blocks": 65536, 00:13:20.374 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:20.374 "assigned_rate_limits": { 00:13:20.374 "rw_ios_per_sec": 0, 00:13:20.374 "rw_mbytes_per_sec": 0, 00:13:20.374 "r_mbytes_per_sec": 0, 00:13:20.374 "w_mbytes_per_sec": 0 00:13:20.374 }, 00:13:20.374 "claimed": true, 00:13:20.374 "claim_type": "exclusive_write", 00:13:20.374 "zoned": false, 00:13:20.374 "supported_io_types": { 00:13:20.374 "read": true, 00:13:20.374 "write": true, 00:13:20.374 "unmap": true, 00:13:20.374 "flush": true, 00:13:20.374 "reset": true, 00:13:20.374 "nvme_admin": false, 00:13:20.374 "nvme_io": false, 00:13:20.374 "nvme_io_md": false, 00:13:20.374 "write_zeroes": true, 00:13:20.374 "zcopy": true, 00:13:20.374 "get_zone_info": false, 00:13:20.374 "zone_management": false, 00:13:20.374 "zone_append": false, 00:13:20.374 "compare": false, 00:13:20.374 "compare_and_write": false, 00:13:20.374 "abort": true, 00:13:20.374 "seek_hole": false, 00:13:20.374 "seek_data": false, 00:13:20.374 "copy": true, 00:13:20.374 "nvme_iov_md": false 00:13:20.374 }, 00:13:20.374 
"memory_domains": [ 00:13:20.374 { 00:13:20.374 "dma_device_id": "system", 00:13:20.374 "dma_device_type": 1 00:13:20.374 }, 00:13:20.374 { 00:13:20.374 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:20.374 "dma_device_type": 2 00:13:20.374 } 00:13:20.374 ], 00:13:20.374 "driver_specific": { 00:13:20.374 "passthru": { 00:13:20.374 "name": "pt1", 00:13:20.374 "base_bdev_name": "malloc1" 00:13:20.374 } 00:13:20.374 } 00:13:20.374 }' 00:13:20.374 08:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:20.635 08:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:20.635 08:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:20.635 08:26:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:20.635 08:26:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:20.635 08:26:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:20.635 08:26:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:20.635 08:26:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:20.635 08:26:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:20.635 08:26:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:20.635 08:26:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:20.900 08:26:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:20.900 08:26:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:20.900 08:26:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:20.900 08:26:33 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:20.900 08:26:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:20.900 "name": "pt2", 00:13:20.900 "aliases": [ 00:13:20.900 "00000000-0000-0000-0000-000000000002" 00:13:20.900 ], 00:13:20.900 "product_name": "passthru", 00:13:20.900 "block_size": 512, 00:13:20.900 "num_blocks": 65536, 00:13:20.900 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:20.900 "assigned_rate_limits": { 00:13:20.900 "rw_ios_per_sec": 0, 00:13:20.900 "rw_mbytes_per_sec": 0, 00:13:20.900 "r_mbytes_per_sec": 0, 00:13:20.900 "w_mbytes_per_sec": 0 00:13:20.900 }, 00:13:20.900 "claimed": true, 00:13:20.900 "claim_type": "exclusive_write", 00:13:20.900 "zoned": false, 00:13:20.900 "supported_io_types": { 00:13:20.900 "read": true, 00:13:20.900 "write": true, 00:13:20.900 "unmap": true, 00:13:20.900 "flush": true, 00:13:20.900 "reset": true, 00:13:20.900 "nvme_admin": false, 00:13:20.900 "nvme_io": false, 00:13:20.900 "nvme_io_md": false, 00:13:20.900 "write_zeroes": true, 00:13:20.900 "zcopy": true, 00:13:20.900 "get_zone_info": false, 00:13:20.900 "zone_management": false, 00:13:20.900 "zone_append": false, 00:13:20.900 "compare": false, 00:13:20.900 "compare_and_write": false, 00:13:20.900 "abort": true, 00:13:20.900 "seek_hole": false, 00:13:20.900 "seek_data": false, 00:13:20.900 "copy": true, 00:13:20.900 "nvme_iov_md": false 00:13:20.900 }, 00:13:20.900 "memory_domains": [ 00:13:20.900 { 00:13:20.900 "dma_device_id": "system", 00:13:20.900 "dma_device_type": 1 00:13:20.900 }, 00:13:20.900 { 00:13:20.900 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:20.900 "dma_device_type": 2 00:13:20.900 } 00:13:20.900 ], 00:13:20.900 "driver_specific": { 00:13:20.900 "passthru": { 00:13:20.900 "name": "pt2", 00:13:20.900 "base_bdev_name": "malloc2" 00:13:20.900 } 00:13:20.900 } 00:13:20.900 }' 00:13:20.900 08:26:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:20.900 08:26:33 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:21.233 08:26:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:21.233 08:26:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:21.233 08:26:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:21.233 08:26:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:21.233 08:26:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:21.233 08:26:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:21.233 08:26:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:21.233 08:26:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:21.233 08:26:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:21.233 08:26:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:21.233 08:26:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:13:21.233 08:26:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:21.491 [2024-07-23 08:26:33.852215] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:21.491 08:26:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=1b4be173-80e4-47f3-8d18-6f873f9b1ee5 00:13:21.491 08:26:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 1b4be173-80e4-47f3-8d18-6f873f9b1ee5 ']' 00:13:21.491 08:26:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:21.750 [2024-07-23 
08:26:34.036450] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:21.750 [2024-07-23 08:26:34.036476] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:21.750 [2024-07-23 08:26:34.036560] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:21.750 [2024-07-23 08:26:34.036624] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:21.750 [2024-07-23 08:26:34.036639] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000035a80 name raid_bdev1, state offline 00:13:21.750 08:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:21.750 08:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:13:21.750 08:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:13:21.750 08:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:13:21.750 08:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:21.750 08:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:13:22.008 08:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:22.008 08:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:22.266 08:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:13:22.266 08:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:13:22.266 08:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:13:22.266 08:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:13:22.266 08:26:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:13:22.266 08:26:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:13:22.266 08:26:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:22.266 08:26:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:22.266 08:26:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:22.266 08:26:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:22.266 08:26:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:22.266 08:26:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:22.266 08:26:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:22.266 08:26:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:13:22.266 08:26:34 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:13:22.525 [2024-07-23 08:26:34.870659] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:13:22.525 [2024-07-23 08:26:34.872256] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:13:22.525 [2024-07-23 08:26:34.872314] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:13:22.525 [2024-07-23 08:26:34.872374] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:13:22.525 [2024-07-23 08:26:34.872393] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:22.525 [2024-07-23 08:26:34.872404] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036080 name raid_bdev1, state configuring 00:13:22.525 request: 00:13:22.525 { 00:13:22.525 "name": "raid_bdev1", 00:13:22.525 "raid_level": "raid1", 00:13:22.525 "base_bdevs": [ 00:13:22.525 "malloc1", 00:13:22.525 "malloc2" 00:13:22.525 ], 00:13:22.525 "superblock": false, 00:13:22.525 "method": "bdev_raid_create", 00:13:22.525 "req_id": 1 00:13:22.525 } 00:13:22.525 Got JSON-RPC error response 00:13:22.525 response: 00:13:22.525 { 00:13:22.525 "code": -17, 00:13:22.525 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:13:22.525 } 00:13:22.525 08:26:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:13:22.525 08:26:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:13:22.525 08:26:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:13:22.525 08:26:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 
00:13:22.525 08:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:13:22.525 08:26:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:22.784 08:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:13:22.784 08:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:13:22.784 08:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:22.784 [2024-07-23 08:26:35.211474] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:22.784 [2024-07-23 08:26:35.211527] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:22.784 [2024-07-23 08:26:35.211558] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036680 00:13:22.784 [2024-07-23 08:26:35.211569] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:22.784 [2024-07-23 08:26:35.213550] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:22.784 [2024-07-23 08:26:35.213582] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:22.784 [2024-07-23 08:26:35.213672] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:13:22.784 [2024-07-23 08:26:35.213751] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:22.784 pt1 00:13:22.784 08:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:13:22.784 08:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:22.784 08:26:35 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:22.784 08:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:22.784 08:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:22.784 08:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:22.784 08:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:22.784 08:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:22.784 08:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:22.784 08:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:22.784 08:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:22.784 08:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:23.043 08:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:23.043 "name": "raid_bdev1", 00:13:23.043 "uuid": "1b4be173-80e4-47f3-8d18-6f873f9b1ee5", 00:13:23.043 "strip_size_kb": 0, 00:13:23.043 "state": "configuring", 00:13:23.043 "raid_level": "raid1", 00:13:23.043 "superblock": true, 00:13:23.043 "num_base_bdevs": 2, 00:13:23.043 "num_base_bdevs_discovered": 1, 00:13:23.043 "num_base_bdevs_operational": 2, 00:13:23.043 "base_bdevs_list": [ 00:13:23.043 { 00:13:23.043 "name": "pt1", 00:13:23.043 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:23.043 "is_configured": true, 00:13:23.043 "data_offset": 2048, 00:13:23.043 "data_size": 63488 00:13:23.043 }, 00:13:23.043 { 00:13:23.043 "name": null, 00:13:23.043 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:23.043 
"is_configured": false, 00:13:23.043 "data_offset": 2048, 00:13:23.043 "data_size": 63488 00:13:23.043 } 00:13:23.043 ] 00:13:23.043 }' 00:13:23.043 08:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:23.043 08:26:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:24.016 08:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:13:24.016 08:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:13:24.016 08:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:24.016 08:26:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:24.016 [2024-07-23 08:26:36.021628] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:24.016 [2024-07-23 08:26:36.021689] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:24.016 [2024-07-23 08:26:36.021708] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036f80 00:13:24.016 [2024-07-23 08:26:36.021719] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:24.016 [2024-07-23 08:26:36.022149] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:24.016 [2024-07-23 08:26:36.022168] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:24.016 [2024-07-23 08:26:36.022238] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:24.016 [2024-07-23 08:26:36.022261] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:24.016 [2024-07-23 08:26:36.022396] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000036c80 00:13:24.016 
[2024-07-23 08:26:36.022410] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:24.016 [2024-07-23 08:26:36.022625] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bec0 00:13:24.016 [2024-07-23 08:26:36.022801] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000036c80 00:13:24.016 [2024-07-23 08:26:36.022810] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000036c80 00:13:24.016 [2024-07-23 08:26:36.022950] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:24.016 pt2 00:13:24.016 08:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:13:24.016 08:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:24.016 08:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:24.016 08:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:24.016 08:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:24.016 08:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:24.016 08:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:24.016 08:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:24.016 08:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:24.016 08:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:24.016 08:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:24.016 08:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:24.016 08:26:36 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:24.016 08:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:24.016 08:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:24.016 "name": "raid_bdev1", 00:13:24.016 "uuid": "1b4be173-80e4-47f3-8d18-6f873f9b1ee5", 00:13:24.016 "strip_size_kb": 0, 00:13:24.016 "state": "online", 00:13:24.016 "raid_level": "raid1", 00:13:24.016 "superblock": true, 00:13:24.016 "num_base_bdevs": 2, 00:13:24.016 "num_base_bdevs_discovered": 2, 00:13:24.016 "num_base_bdevs_operational": 2, 00:13:24.016 "base_bdevs_list": [ 00:13:24.016 { 00:13:24.016 "name": "pt1", 00:13:24.016 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:24.016 "is_configured": true, 00:13:24.016 "data_offset": 2048, 00:13:24.016 "data_size": 63488 00:13:24.016 }, 00:13:24.016 { 00:13:24.016 "name": "pt2", 00:13:24.016 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:24.016 "is_configured": true, 00:13:24.016 "data_offset": 2048, 00:13:24.016 "data_size": 63488 00:13:24.016 } 00:13:24.016 ] 00:13:24.016 }' 00:13:24.016 08:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:24.016 08:26:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:24.273 08:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:13:24.273 08:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:24.273 08:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:24.273 08:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:24.273 08:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_names 00:13:24.273 08:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:24.273 08:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:24.273 08:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:24.531 [2024-07-23 08:26:36.876094] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:24.531 08:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:24.531 "name": "raid_bdev1", 00:13:24.531 "aliases": [ 00:13:24.531 "1b4be173-80e4-47f3-8d18-6f873f9b1ee5" 00:13:24.531 ], 00:13:24.531 "product_name": "Raid Volume", 00:13:24.531 "block_size": 512, 00:13:24.531 "num_blocks": 63488, 00:13:24.531 "uuid": "1b4be173-80e4-47f3-8d18-6f873f9b1ee5", 00:13:24.531 "assigned_rate_limits": { 00:13:24.531 "rw_ios_per_sec": 0, 00:13:24.531 "rw_mbytes_per_sec": 0, 00:13:24.531 "r_mbytes_per_sec": 0, 00:13:24.531 "w_mbytes_per_sec": 0 00:13:24.531 }, 00:13:24.531 "claimed": false, 00:13:24.531 "zoned": false, 00:13:24.531 "supported_io_types": { 00:13:24.531 "read": true, 00:13:24.531 "write": true, 00:13:24.531 "unmap": false, 00:13:24.531 "flush": false, 00:13:24.531 "reset": true, 00:13:24.531 "nvme_admin": false, 00:13:24.531 "nvme_io": false, 00:13:24.531 "nvme_io_md": false, 00:13:24.531 "write_zeroes": true, 00:13:24.531 "zcopy": false, 00:13:24.531 "get_zone_info": false, 00:13:24.531 "zone_management": false, 00:13:24.531 "zone_append": false, 00:13:24.531 "compare": false, 00:13:24.531 "compare_and_write": false, 00:13:24.531 "abort": false, 00:13:24.531 "seek_hole": false, 00:13:24.531 "seek_data": false, 00:13:24.531 "copy": false, 00:13:24.531 "nvme_iov_md": false 00:13:24.531 }, 00:13:24.531 "memory_domains": [ 00:13:24.531 { 00:13:24.531 "dma_device_id": "system", 00:13:24.531 "dma_device_type": 
1 00:13:24.531 }, 00:13:24.531 { 00:13:24.531 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:24.531 "dma_device_type": 2 00:13:24.531 }, 00:13:24.531 { 00:13:24.531 "dma_device_id": "system", 00:13:24.531 "dma_device_type": 1 00:13:24.531 }, 00:13:24.531 { 00:13:24.531 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:24.531 "dma_device_type": 2 00:13:24.531 } 00:13:24.531 ], 00:13:24.531 "driver_specific": { 00:13:24.531 "raid": { 00:13:24.531 "uuid": "1b4be173-80e4-47f3-8d18-6f873f9b1ee5", 00:13:24.531 "strip_size_kb": 0, 00:13:24.531 "state": "online", 00:13:24.531 "raid_level": "raid1", 00:13:24.531 "superblock": true, 00:13:24.531 "num_base_bdevs": 2, 00:13:24.531 "num_base_bdevs_discovered": 2, 00:13:24.531 "num_base_bdevs_operational": 2, 00:13:24.531 "base_bdevs_list": [ 00:13:24.531 { 00:13:24.531 "name": "pt1", 00:13:24.531 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:24.531 "is_configured": true, 00:13:24.531 "data_offset": 2048, 00:13:24.531 "data_size": 63488 00:13:24.531 }, 00:13:24.531 { 00:13:24.531 "name": "pt2", 00:13:24.531 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:24.531 "is_configured": true, 00:13:24.531 "data_offset": 2048, 00:13:24.531 "data_size": 63488 00:13:24.531 } 00:13:24.531 ] 00:13:24.531 } 00:13:24.531 } 00:13:24.531 }' 00:13:24.531 08:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:24.531 08:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:24.531 pt2' 00:13:24.531 08:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:24.531 08:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:24.531 08:26:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:24.789 
08:26:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:24.789 "name": "pt1", 00:13:24.789 "aliases": [ 00:13:24.789 "00000000-0000-0000-0000-000000000001" 00:13:24.789 ], 00:13:24.789 "product_name": "passthru", 00:13:24.789 "block_size": 512, 00:13:24.789 "num_blocks": 65536, 00:13:24.789 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:24.789 "assigned_rate_limits": { 00:13:24.789 "rw_ios_per_sec": 0, 00:13:24.789 "rw_mbytes_per_sec": 0, 00:13:24.789 "r_mbytes_per_sec": 0, 00:13:24.789 "w_mbytes_per_sec": 0 00:13:24.789 }, 00:13:24.789 "claimed": true, 00:13:24.789 "claim_type": "exclusive_write", 00:13:24.789 "zoned": false, 00:13:24.789 "supported_io_types": { 00:13:24.789 "read": true, 00:13:24.789 "write": true, 00:13:24.789 "unmap": true, 00:13:24.789 "flush": true, 00:13:24.789 "reset": true, 00:13:24.789 "nvme_admin": false, 00:13:24.789 "nvme_io": false, 00:13:24.789 "nvme_io_md": false, 00:13:24.789 "write_zeroes": true, 00:13:24.789 "zcopy": true, 00:13:24.789 "get_zone_info": false, 00:13:24.789 "zone_management": false, 00:13:24.789 "zone_append": false, 00:13:24.789 "compare": false, 00:13:24.789 "compare_and_write": false, 00:13:24.789 "abort": true, 00:13:24.789 "seek_hole": false, 00:13:24.789 "seek_data": false, 00:13:24.789 "copy": true, 00:13:24.789 "nvme_iov_md": false 00:13:24.789 }, 00:13:24.789 "memory_domains": [ 00:13:24.789 { 00:13:24.789 "dma_device_id": "system", 00:13:24.789 "dma_device_type": 1 00:13:24.789 }, 00:13:24.789 { 00:13:24.789 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:24.789 "dma_device_type": 2 00:13:24.789 } 00:13:24.789 ], 00:13:24.789 "driver_specific": { 00:13:24.789 "passthru": { 00:13:24.789 "name": "pt1", 00:13:24.789 "base_bdev_name": "malloc1" 00:13:24.789 } 00:13:24.789 } 00:13:24.789 }' 00:13:24.789 08:26:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:24.789 08:26:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- 
# jq .block_size 00:13:24.789 08:26:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:24.789 08:26:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:24.789 08:26:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:24.789 08:26:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:24.789 08:26:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:24.789 08:26:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:25.048 08:26:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:25.048 08:26:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:25.048 08:26:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:25.048 08:26:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:25.048 08:26:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:25.048 08:26:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:25.048 08:26:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:25.307 08:26:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:25.307 "name": "pt2", 00:13:25.307 "aliases": [ 00:13:25.307 "00000000-0000-0000-0000-000000000002" 00:13:25.307 ], 00:13:25.307 "product_name": "passthru", 00:13:25.307 "block_size": 512, 00:13:25.307 "num_blocks": 65536, 00:13:25.307 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:25.307 "assigned_rate_limits": { 00:13:25.307 "rw_ios_per_sec": 0, 00:13:25.307 "rw_mbytes_per_sec": 0, 00:13:25.307 "r_mbytes_per_sec": 0, 00:13:25.307 "w_mbytes_per_sec": 0 00:13:25.307 }, 
00:13:25.307 "claimed": true, 00:13:25.307 "claim_type": "exclusive_write", 00:13:25.307 "zoned": false, 00:13:25.307 "supported_io_types": { 00:13:25.307 "read": true, 00:13:25.307 "write": true, 00:13:25.307 "unmap": true, 00:13:25.307 "flush": true, 00:13:25.307 "reset": true, 00:13:25.307 "nvme_admin": false, 00:13:25.307 "nvme_io": false, 00:13:25.307 "nvme_io_md": false, 00:13:25.307 "write_zeroes": true, 00:13:25.307 "zcopy": true, 00:13:25.307 "get_zone_info": false, 00:13:25.307 "zone_management": false, 00:13:25.307 "zone_append": false, 00:13:25.307 "compare": false, 00:13:25.307 "compare_and_write": false, 00:13:25.307 "abort": true, 00:13:25.307 "seek_hole": false, 00:13:25.307 "seek_data": false, 00:13:25.307 "copy": true, 00:13:25.307 "nvme_iov_md": false 00:13:25.307 }, 00:13:25.307 "memory_domains": [ 00:13:25.307 { 00:13:25.307 "dma_device_id": "system", 00:13:25.307 "dma_device_type": 1 00:13:25.307 }, 00:13:25.307 { 00:13:25.307 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:25.307 "dma_device_type": 2 00:13:25.307 } 00:13:25.307 ], 00:13:25.307 "driver_specific": { 00:13:25.307 "passthru": { 00:13:25.307 "name": "pt2", 00:13:25.307 "base_bdev_name": "malloc2" 00:13:25.307 } 00:13:25.307 } 00:13:25.307 }' 00:13:25.307 08:26:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:25.307 08:26:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:25.307 08:26:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:25.307 08:26:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:25.307 08:26:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:25.307 08:26:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:25.307 08:26:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:25.307 08:26:37 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:25.307 08:26:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:25.307 08:26:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:25.566 08:26:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:25.566 08:26:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:25.566 08:26:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:25.566 08:26:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:13:25.566 [2024-07-23 08:26:38.039169] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:25.566 08:26:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 1b4be173-80e4-47f3-8d18-6f873f9b1ee5 '!=' 1b4be173-80e4-47f3-8d18-6f873f9b1ee5 ']' 00:13:25.566 08:26:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:13:25.567 08:26:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:25.567 08:26:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:25.567 08:26:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:13:25.825 [2024-07-23 08:26:38.211398] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:13:25.825 08:26:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:13:25.825 08:26:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:25.825 08:26:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:25.825 
08:26:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:25.825 08:26:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:25.825 08:26:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:25.825 08:26:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:25.825 08:26:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:25.825 08:26:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:25.825 08:26:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:25.825 08:26:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:25.825 08:26:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:26.083 08:26:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:26.083 "name": "raid_bdev1", 00:13:26.083 "uuid": "1b4be173-80e4-47f3-8d18-6f873f9b1ee5", 00:13:26.083 "strip_size_kb": 0, 00:13:26.083 "state": "online", 00:13:26.083 "raid_level": "raid1", 00:13:26.083 "superblock": true, 00:13:26.083 "num_base_bdevs": 2, 00:13:26.083 "num_base_bdevs_discovered": 1, 00:13:26.083 "num_base_bdevs_operational": 1, 00:13:26.083 "base_bdevs_list": [ 00:13:26.083 { 00:13:26.083 "name": null, 00:13:26.083 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:26.083 "is_configured": false, 00:13:26.083 "data_offset": 2048, 00:13:26.083 "data_size": 63488 00:13:26.083 }, 00:13:26.083 { 00:13:26.083 "name": "pt2", 00:13:26.083 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:26.083 "is_configured": true, 00:13:26.083 "data_offset": 2048, 00:13:26.083 "data_size": 63488 00:13:26.083 } 00:13:26.083 
] 00:13:26.083 }' 00:13:26.083 08:26:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:26.083 08:26:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:26.649 08:26:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:26.649 [2024-07-23 08:26:39.045551] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:26.649 [2024-07-23 08:26:39.045578] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:26.649 [2024-07-23 08:26:39.045653] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:26.649 [2024-07-23 08:26:39.045699] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:26.649 [2024-07-23 08:26:39.045711] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036c80 name raid_bdev1, state offline 00:13:26.649 08:26:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:26.649 08:26:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:13:26.908 08:26:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:13:26.908 08:26:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:13:26.908 08:26:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:13:26.908 08:26:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:13:26.908 08:26:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:26.908 08:26:39 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:13:26.908 08:26:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:13:26.908 08:26:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:13:26.908 08:26:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:13:26.908 08:26:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=1 00:13:26.908 08:26:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:27.166 [2024-07-23 08:26:39.554887] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:27.166 [2024-07-23 08:26:39.554946] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:27.166 [2024-07-23 08:26:39.554962] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000037280 00:13:27.166 [2024-07-23 08:26:39.554974] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:27.166 [2024-07-23 08:26:39.556990] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:27.166 [2024-07-23 08:26:39.557021] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:27.166 [2024-07-23 08:26:39.557098] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:27.166 [2024-07-23 08:26:39.557140] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:27.166 [2024-07-23 08:26:39.557269] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000037880 00:13:27.166 [2024-07-23 08:26:39.557280] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:27.166 [2024-07-23 08:26:39.557510] bdev_raid.c: 
263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bf90 00:13:27.166 [2024-07-23 08:26:39.557699] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000037880 00:13:27.166 [2024-07-23 08:26:39.557712] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000037880 00:13:27.166 [2024-07-23 08:26:39.557869] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:27.166 pt2 00:13:27.166 08:26:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:13:27.167 08:26:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:27.167 08:26:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:27.167 08:26:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:27.167 08:26:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:27.167 08:26:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:27.167 08:26:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:27.167 08:26:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:27.167 08:26:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:27.167 08:26:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:27.167 08:26:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:27.167 08:26:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:27.425 08:26:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:13:27.425 "name": "raid_bdev1", 00:13:27.425 "uuid": "1b4be173-80e4-47f3-8d18-6f873f9b1ee5", 00:13:27.425 "strip_size_kb": 0, 00:13:27.425 "state": "online", 00:13:27.425 "raid_level": "raid1", 00:13:27.425 "superblock": true, 00:13:27.425 "num_base_bdevs": 2, 00:13:27.425 "num_base_bdevs_discovered": 1, 00:13:27.425 "num_base_bdevs_operational": 1, 00:13:27.425 "base_bdevs_list": [ 00:13:27.425 { 00:13:27.425 "name": null, 00:13:27.425 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:27.425 "is_configured": false, 00:13:27.425 "data_offset": 2048, 00:13:27.425 "data_size": 63488 00:13:27.425 }, 00:13:27.425 { 00:13:27.425 "name": "pt2", 00:13:27.425 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:27.425 "is_configured": true, 00:13:27.425 "data_offset": 2048, 00:13:27.425 "data_size": 63488 00:13:27.425 } 00:13:27.425 ] 00:13:27.425 }' 00:13:27.425 08:26:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:27.425 08:26:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:27.683 08:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:27.941 [2024-07-23 08:26:40.336990] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:27.941 [2024-07-23 08:26:40.337020] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:27.941 [2024-07-23 08:26:40.337090] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:27.941 [2024-07-23 08:26:40.337139] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:27.941 [2024-07-23 08:26:40.337149] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000037880 name raid_bdev1, state offline 00:13:27.941 08:26:40 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:27.941 08:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:13:28.200 08:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:13:28.200 08:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:13:28.200 08:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:13:28.200 08:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:28.200 [2024-07-23 08:26:40.669835] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:28.200 [2024-07-23 08:26:40.669888] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:28.200 [2024-07-23 08:26:40.669905] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000037b80 00:13:28.200 [2024-07-23 08:26:40.669915] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:28.200 [2024-07-23 08:26:40.671917] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:28.200 [2024-07-23 08:26:40.671945] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:28.200 [2024-07-23 08:26:40.672020] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:13:28.200 [2024-07-23 08:26:40.672066] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:28.200 [2024-07-23 08:26:40.672214] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:13:28.200 [2024-07-23 
08:26:40.672225] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:28.200 [2024-07-23 08:26:40.672242] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000038180 name raid_bdev1, state configuring 00:13:28.200 [2024-07-23 08:26:40.672301] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:28.200 [2024-07-23 08:26:40.672372] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000038480 00:13:28.200 [2024-07-23 08:26:40.672381] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:28.200 [2024-07-23 08:26:40.672603] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c060 00:13:28.200 [2024-07-23 08:26:40.672775] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000038480 00:13:28.200 [2024-07-23 08:26:40.672786] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000038480 00:13:28.200 [2024-07-23 08:26:40.672929] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:28.200 pt1 00:13:28.200 08:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:13:28.200 08:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:13:28.200 08:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:28.200 08:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:28.200 08:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:28.200 08:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:28.200 08:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:28.200 08:26:40 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:28.200 08:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:28.200 08:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:28.200 08:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:28.200 08:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:28.200 08:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:28.459 08:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:28.459 "name": "raid_bdev1", 00:13:28.459 "uuid": "1b4be173-80e4-47f3-8d18-6f873f9b1ee5", 00:13:28.459 "strip_size_kb": 0, 00:13:28.459 "state": "online", 00:13:28.459 "raid_level": "raid1", 00:13:28.459 "superblock": true, 00:13:28.459 "num_base_bdevs": 2, 00:13:28.459 "num_base_bdevs_discovered": 1, 00:13:28.459 "num_base_bdevs_operational": 1, 00:13:28.459 "base_bdevs_list": [ 00:13:28.459 { 00:13:28.459 "name": null, 00:13:28.459 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:28.459 "is_configured": false, 00:13:28.459 "data_offset": 2048, 00:13:28.459 "data_size": 63488 00:13:28.459 }, 00:13:28.459 { 00:13:28.459 "name": "pt2", 00:13:28.459 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:28.459 "is_configured": true, 00:13:28.459 "data_offset": 2048, 00:13:28.459 "data_size": 63488 00:13:28.459 } 00:13:28.459 ] 00:13:28.459 }' 00:13:28.459 08:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:28.459 08:26:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:29.026 08:26:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:13:29.026 08:26:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:13:29.026 08:26:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:13:29.026 08:26:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:29.026 08:26:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:13:29.285 [2024-07-23 08:26:41.676725] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:29.285 08:26:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 1b4be173-80e4-47f3-8d18-6f873f9b1ee5 '!=' 1b4be173-80e4-47f3-8d18-6f873f9b1ee5 ']' 00:13:29.285 08:26:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1418709 00:13:29.285 08:26:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1418709 ']' 00:13:29.285 08:26:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1418709 00:13:29.285 08:26:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:13:29.285 08:26:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:29.285 08:26:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1418709 00:13:29.285 08:26:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:29.285 08:26:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:29.285 08:26:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1418709' 00:13:29.285 killing process with pid 1418709 00:13:29.285 08:26:41 bdev_raid.raid_superblock_test 
-- common/autotest_common.sh@967 -- # kill 1418709 00:13:29.285 [2024-07-23 08:26:41.732231] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:29.285 [2024-07-23 08:26:41.732316] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:29.285 [2024-07-23 08:26:41.732361] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:29.285 [2024-07-23 08:26:41.732373] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000038480 name raid_bdev1, state offline 00:13:29.285 08:26:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1418709 00:13:29.545 [2024-07-23 08:26:41.879633] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:30.923 08:26:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:13:30.923 00:13:30.923 real 0m13.076s 00:13:30.923 user 0m22.851s 00:13:30.923 sys 0m1.935s 00:13:30.923 08:26:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:30.923 08:26:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:30.923 ************************************ 00:13:30.923 END TEST raid_superblock_test 00:13:30.923 ************************************ 00:13:30.923 08:26:43 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:30.923 08:26:43 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 2 read 00:13:30.923 08:26:43 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:30.923 08:26:43 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:30.923 08:26:43 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:30.923 ************************************ 00:13:30.923 START TEST raid_read_error_test 00:13:30.923 ************************************ 00:13:30.923 08:26:43 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 read 00:13:30.923 08:26:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:13:30.923 08:26:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:13:30.923 08:26:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:13:30.923 08:26:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:30.923 08:26:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:30.923 08:26:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:30.923 08:26:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:30.923 08:26:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:30.923 08:26:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:30.923 08:26:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:30.923 08:26:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:30.923 08:26:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:30.923 08:26:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:30.923 08:26:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:13:30.923 08:26:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:30.923 08:26:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:30.923 08:26:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:30.923 08:26:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:30.923 08:26:43 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:13:30.923 08:26:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:13:30.923 08:26:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:30.923 08:26:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.kngdJMLfaP 00:13:30.923 08:26:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1421533 00:13:30.923 08:26:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1421533 /var/tmp/spdk-raid.sock 00:13:30.923 08:26:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:30.923 08:26:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1421533 ']' 00:13:30.923 08:26:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:30.923 08:26:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:30.923 08:26:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:30.923 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:30.923 08:26:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:30.923 08:26:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:30.923 [2024-07-23 08:26:43.278108] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:13:30.923 [2024-07-23 08:26:43.278200] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1421533 ] 00:13:30.923 [2024-07-23 08:26:43.404581] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:31.182 [2024-07-23 08:26:43.620127] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:31.440 [2024-07-23 08:26:43.855324] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:31.440 [2024-07-23 08:26:43.855360] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:31.700 08:26:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:31.700 08:26:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:13:31.700 08:26:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:31.700 08:26:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:31.959 BaseBdev1_malloc 00:13:31.959 08:26:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:31.959 true 00:13:31.959 08:26:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:32.218 [2024-07-23 08:26:44.584703] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:32.218 [2024-07-23 08:26:44.584756] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:13:32.218 [2024-07-23 08:26:44.584792] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034b80 00:13:32.218 [2024-07-23 08:26:44.584803] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:32.218 [2024-07-23 08:26:44.586773] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:32.218 [2024-07-23 08:26:44.586801] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:32.218 BaseBdev1 00:13:32.218 08:26:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:32.218 08:26:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:32.478 BaseBdev2_malloc 00:13:32.478 08:26:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:32.478 true 00:13:32.478 08:26:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:32.737 [2024-07-23 08:26:45.121950] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:32.737 [2024-07-23 08:26:45.122001] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:32.737 [2024-07-23 08:26:45.122036] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035a80 00:13:32.737 [2024-07-23 08:26:45.122049] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:32.737 [2024-07-23 08:26:45.124012] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:32.737 [2024-07-23 
08:26:45.124040] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:32.737 BaseBdev2 00:13:32.737 08:26:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:13:32.996 [2024-07-23 08:26:45.290449] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:32.996 [2024-07-23 08:26:45.292066] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:32.996 [2024-07-23 08:26:45.292266] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000036080 00:13:32.996 [2024-07-23 08:26:45.292282] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:32.996 [2024-07-23 08:26:45.292529] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bdf0 00:13:32.996 [2024-07-23 08:26:45.292741] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000036080 00:13:32.996 [2024-07-23 08:26:45.292752] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000036080 00:13:32.996 [2024-07-23 08:26:45.292926] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:32.996 08:26:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:32.996 08:26:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:32.996 08:26:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:32.996 08:26:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:32.996 08:26:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:32.996 08:26:45 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:32.996 08:26:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:32.996 08:26:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:32.996 08:26:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:32.996 08:26:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:32.996 08:26:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:32.996 08:26:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:32.996 08:26:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:32.996 "name": "raid_bdev1", 00:13:32.996 "uuid": "c905d792-9bce-4647-b08f-e2f2acbaad33", 00:13:32.996 "strip_size_kb": 0, 00:13:32.996 "state": "online", 00:13:32.996 "raid_level": "raid1", 00:13:32.996 "superblock": true, 00:13:32.996 "num_base_bdevs": 2, 00:13:32.996 "num_base_bdevs_discovered": 2, 00:13:32.996 "num_base_bdevs_operational": 2, 00:13:32.996 "base_bdevs_list": [ 00:13:32.996 { 00:13:32.996 "name": "BaseBdev1", 00:13:32.996 "uuid": "b5294213-29d3-54c0-b26e-0bd1c3508639", 00:13:32.996 "is_configured": true, 00:13:32.996 "data_offset": 2048, 00:13:32.996 "data_size": 63488 00:13:32.996 }, 00:13:32.996 { 00:13:32.996 "name": "BaseBdev2", 00:13:32.996 "uuid": "bae71b09-9900-52bd-b216-106c42e5c8ed", 00:13:32.996 "is_configured": true, 00:13:32.996 "data_offset": 2048, 00:13:32.996 "data_size": 63488 00:13:32.996 } 00:13:32.996 ] 00:13:32.996 }' 00:13:32.996 08:26:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:32.996 08:26:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # 
set +x 00:13:33.564 08:26:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:13:33.564 08:26:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:33.565 [2024-07-23 08:26:46.033711] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bf90 00:13:34.504 08:26:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:13:34.762 08:26:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:34.762 08:26:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:13:34.762 08:26:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:13:34.762 08:26:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:13:34.762 08:26:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:34.762 08:26:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:34.762 08:26:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:34.762 08:26:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:34.762 08:26:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:34.762 08:26:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:34.762 08:26:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:34.762 08:26:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:13:34.762 08:26:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:34.762 08:26:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:34.762 08:26:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:34.762 08:26:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:35.021 08:26:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:35.021 "name": "raid_bdev1", 00:13:35.021 "uuid": "c905d792-9bce-4647-b08f-e2f2acbaad33", 00:13:35.021 "strip_size_kb": 0, 00:13:35.021 "state": "online", 00:13:35.021 "raid_level": "raid1", 00:13:35.021 "superblock": true, 00:13:35.021 "num_base_bdevs": 2, 00:13:35.021 "num_base_bdevs_discovered": 2, 00:13:35.021 "num_base_bdevs_operational": 2, 00:13:35.021 "base_bdevs_list": [ 00:13:35.021 { 00:13:35.021 "name": "BaseBdev1", 00:13:35.021 "uuid": "b5294213-29d3-54c0-b26e-0bd1c3508639", 00:13:35.021 "is_configured": true, 00:13:35.021 "data_offset": 2048, 00:13:35.021 "data_size": 63488 00:13:35.021 }, 00:13:35.021 { 00:13:35.021 "name": "BaseBdev2", 00:13:35.021 "uuid": "bae71b09-9900-52bd-b216-106c42e5c8ed", 00:13:35.021 "is_configured": true, 00:13:35.021 "data_offset": 2048, 00:13:35.021 "data_size": 63488 00:13:35.021 } 00:13:35.021 ] 00:13:35.021 }' 00:13:35.021 08:26:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:35.021 08:26:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:35.589 08:26:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:35.589 [2024-07-23 08:26:47.998891] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete 
raid bdev: raid_bdev1 00:13:35.589 [2024-07-23 08:26:47.998922] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:35.589 [2024-07-23 08:26:48.001377] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:35.589 [2024-07-23 08:26:48.001419] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:35.589 [2024-07-23 08:26:48.001497] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:35.589 [2024-07-23 08:26:48.001511] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036080 name raid_bdev1, state offline 00:13:35.589 0 00:13:35.589 08:26:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1421533 00:13:35.589 08:26:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1421533 ']' 00:13:35.589 08:26:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1421533 00:13:35.589 08:26:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:13:35.589 08:26:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:35.589 08:26:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1421533 00:13:35.589 08:26:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:35.589 08:26:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:35.589 08:26:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1421533' 00:13:35.589 killing process with pid 1421533 00:13:35.589 08:26:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1421533 00:13:35.589 [2024-07-23 08:26:48.053975] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:35.589 
08:26:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1421533 00:13:35.848 [2024-07-23 08:26:48.131699] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:37.225 08:26:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.kngdJMLfaP 00:13:37.225 08:26:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:37.225 08:26:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:37.225 08:26:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:13:37.225 08:26:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:13:37.225 08:26:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:37.225 08:26:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:37.225 08:26:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:13:37.225 00:13:37.225 real 0m6.285s 00:13:37.225 user 0m8.838s 00:13:37.225 sys 0m0.840s 00:13:37.225 08:26:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:37.225 08:26:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:37.225 ************************************ 00:13:37.225 END TEST raid_read_error_test 00:13:37.225 ************************************ 00:13:37.225 08:26:49 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:37.225 08:26:49 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 2 write 00:13:37.225 08:26:49 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:37.225 08:26:49 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:37.225 08:26:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:37.225 ************************************ 00:13:37.225 START TEST raid_write_error_test 
00:13:37.225 ************************************ 00:13:37.225 08:26:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 write 00:13:37.225 08:26:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:13:37.225 08:26:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:13:37.225 08:26:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:13:37.225 08:26:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:37.225 08:26:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:37.225 08:26:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:37.225 08:26:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:37.225 08:26:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:37.225 08:26:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:37.225 08:26:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:37.225 08:26:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:37.225 08:26:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:37.225 08:26:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:37.225 08:26:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:13:37.225 08:26:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:37.225 08:26:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:37.225 08:26:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:37.225 08:26:49 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:37.225 08:26:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:13:37.225 08:26:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:13:37.225 08:26:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:37.225 08:26:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.tvQUiB2WOW 00:13:37.225 08:26:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1422672 00:13:37.225 08:26:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1422672 /var/tmp/spdk-raid.sock 00:13:37.225 08:26:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:37.225 08:26:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1422672 ']' 00:13:37.225 08:26:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:37.225 08:26:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:37.225 08:26:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:37.225 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:37.225 08:26:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:37.225 08:26:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:37.225 [2024-07-23 08:26:49.626369] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:13:37.225 [2024-07-23 08:26:49.626497] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1422672 ] 00:13:37.484 [2024-07-23 08:26:49.762825] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:37.484 [2024-07-23 08:26:49.971278] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:37.742 [2024-07-23 08:26:50.226257] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:37.742 [2024-07-23 08:26:50.226290] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:38.001 08:26:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:38.001 08:26:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:13:38.001 08:26:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:38.001 08:26:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:38.260 BaseBdev1_malloc 00:13:38.260 08:26:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:38.260 true 00:13:38.260 08:26:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:38.518 [2024-07-23 08:26:50.930238] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:38.518 [2024-07-23 08:26:50.930295] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:13:38.518 [2024-07-23 08:26:50.930330] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034b80 00:13:38.518 [2024-07-23 08:26:50.930341] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:38.518 [2024-07-23 08:26:50.932329] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:38.518 [2024-07-23 08:26:50.932359] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:38.518 BaseBdev1 00:13:38.518 08:26:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:38.518 08:26:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:38.777 BaseBdev2_malloc 00:13:38.777 08:26:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:39.035 true 00:13:39.035 08:26:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:39.035 [2024-07-23 08:26:51.480846] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:39.035 [2024-07-23 08:26:51.480896] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:39.035 [2024-07-23 08:26:51.480915] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035a80 00:13:39.035 [2024-07-23 08:26:51.480928] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:39.035 [2024-07-23 08:26:51.482844] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:39.035 [2024-07-23 
08:26:51.482874] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:39.035 BaseBdev2 00:13:39.035 08:26:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:13:39.294 [2024-07-23 08:26:51.653341] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:39.294 [2024-07-23 08:26:51.654963] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:39.294 [2024-07-23 08:26:51.655170] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000036080 00:13:39.294 [2024-07-23 08:26:51.655185] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:39.294 [2024-07-23 08:26:51.655426] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bdf0 00:13:39.294 [2024-07-23 08:26:51.655641] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000036080 00:13:39.294 [2024-07-23 08:26:51.655652] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000036080 00:13:39.294 [2024-07-23 08:26:51.655821] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:39.294 08:26:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:39.294 08:26:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:39.294 08:26:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:39.294 08:26:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:39.294 08:26:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:39.294 08:26:51 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:39.294 08:26:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:39.294 08:26:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:39.295 08:26:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:39.295 08:26:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:39.295 08:26:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:39.295 08:26:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:39.553 08:26:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:39.553 "name": "raid_bdev1", 00:13:39.553 "uuid": "5e367359-fcad-408c-8de1-4853beadd07b", 00:13:39.553 "strip_size_kb": 0, 00:13:39.553 "state": "online", 00:13:39.553 "raid_level": "raid1", 00:13:39.553 "superblock": true, 00:13:39.553 "num_base_bdevs": 2, 00:13:39.553 "num_base_bdevs_discovered": 2, 00:13:39.553 "num_base_bdevs_operational": 2, 00:13:39.553 "base_bdevs_list": [ 00:13:39.553 { 00:13:39.553 "name": "BaseBdev1", 00:13:39.553 "uuid": "1406bfee-220e-5620-abc1-77c09ccce8f4", 00:13:39.553 "is_configured": true, 00:13:39.553 "data_offset": 2048, 00:13:39.553 "data_size": 63488 00:13:39.553 }, 00:13:39.553 { 00:13:39.553 "name": "BaseBdev2", 00:13:39.553 "uuid": "629a89c6-8066-5a3f-aaca-c7ae98dcc489", 00:13:39.553 "is_configured": true, 00:13:39.553 "data_offset": 2048, 00:13:39.553 "data_size": 63488 00:13:39.553 } 00:13:39.553 ] 00:13:39.553 }' 00:13:39.553 08:26:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:39.553 08:26:51 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:13:40.120 08:26:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:13:40.120 08:26:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:40.120 [2024-07-23 08:26:52.420706] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bf90 00:13:41.057 08:26:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:13:41.057 [2024-07-23 08:26:53.501531] bdev_raid.c:2247:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:13:41.057 [2024-07-23 08:26:53.501584] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:41.057 [2024-07-23 08:26:53.501778] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x60d00000bf90 00:13:41.057 08:26:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:41.057 08:26:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:13:41.057 08:26:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:13:41.057 08:26:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=1 00:13:41.057 08:26:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:13:41.057 08:26:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:41.057 08:26:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:41.057 08:26:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid1 00:13:41.057 08:26:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:41.057 08:26:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:41.057 08:26:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:41.057 08:26:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:41.057 08:26:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:41.057 08:26:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:41.057 08:26:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:41.057 08:26:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:41.315 08:26:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:41.315 "name": "raid_bdev1", 00:13:41.315 "uuid": "5e367359-fcad-408c-8de1-4853beadd07b", 00:13:41.315 "strip_size_kb": 0, 00:13:41.315 "state": "online", 00:13:41.315 "raid_level": "raid1", 00:13:41.315 "superblock": true, 00:13:41.315 "num_base_bdevs": 2, 00:13:41.315 "num_base_bdevs_discovered": 1, 00:13:41.315 "num_base_bdevs_operational": 1, 00:13:41.315 "base_bdevs_list": [ 00:13:41.315 { 00:13:41.315 "name": null, 00:13:41.315 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:41.315 "is_configured": false, 00:13:41.315 "data_offset": 2048, 00:13:41.315 "data_size": 63488 00:13:41.315 }, 00:13:41.315 { 00:13:41.315 "name": "BaseBdev2", 00:13:41.315 "uuid": "629a89c6-8066-5a3f-aaca-c7ae98dcc489", 00:13:41.315 "is_configured": true, 00:13:41.315 "data_offset": 2048, 00:13:41.315 "data_size": 63488 00:13:41.315 } 00:13:41.315 ] 00:13:41.315 }' 00:13:41.315 08:26:53 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:41.315 08:26:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:41.884 08:26:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:41.884 [2024-07-23 08:26:54.342843] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:41.884 [2024-07-23 08:26:54.342878] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:41.884 [2024-07-23 08:26:54.345209] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:41.884 [2024-07-23 08:26:54.345256] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:41.884 [2024-07-23 08:26:54.345305] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:41.884 [2024-07-23 08:26:54.345316] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036080 name raid_bdev1, state offline 00:13:41.884 0 00:13:41.884 08:26:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1422672 00:13:41.884 08:26:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1422672 ']' 00:13:41.884 08:26:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1422672 00:13:41.884 08:26:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:13:41.884 08:26:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:41.884 08:26:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1422672 00:13:41.884 08:26:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:41.884 08:26:54 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:41.884 08:26:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1422672' 00:13:41.884 killing process with pid 1422672 00:13:41.884 08:26:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1422672 00:13:41.884 [2024-07-23 08:26:54.390839] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:41.884 08:26:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1422672 00:13:42.143 [2024-07-23 08:26:54.466996] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:43.519 08:26:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.tvQUiB2WOW 00:13:43.519 08:26:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:43.519 08:26:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:43.519 08:26:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:13:43.519 08:26:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:13:43.519 08:26:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:43.519 08:26:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:43.519 08:26:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:13:43.519 00:13:43.519 real 0m6.266s 00:13:43.519 user 0m8.771s 00:13:43.519 sys 0m0.870s 00:13:43.519 08:26:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:43.519 08:26:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:43.519 ************************************ 00:13:43.519 END TEST raid_write_error_test 00:13:43.519 ************************************ 00:13:43.519 08:26:55 bdev_raid -- 
common/autotest_common.sh@1142 -- # return 0 00:13:43.519 08:26:55 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:13:43.519 08:26:55 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:13:43.519 08:26:55 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 3 false 00:13:43.519 08:26:55 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:43.519 08:26:55 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:43.519 08:26:55 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:43.519 ************************************ 00:13:43.519 START TEST raid_state_function_test 00:13:43.519 ************************************ 00:13:43.519 08:26:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 false 00:13:43.519 08:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:13:43.519 08:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:13:43.519 08:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:13:43.519 08:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:43.519 08:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:43.519 08:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:43.519 08:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:43.519 08:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:43.519 08:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:43.519 08:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:43.519 08:26:55 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:43.519 08:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:43.519 08:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:13:43.519 08:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:43.519 08:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:43.520 08:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:43.520 08:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:43.520 08:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:43.520 08:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:43.520 08:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:43.520 08:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:43.520 08:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:13:43.520 08:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:13:43.520 08:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:13:43.520 08:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:13:43.520 08:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:13:43.520 08:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1424010 00:13:43.520 08:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1424010' 00:13:43.520 Process raid pid: 
1424010 00:13:43.520 08:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:43.520 08:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1424010 /var/tmp/spdk-raid.sock 00:13:43.520 08:26:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1424010 ']' 00:13:43.520 08:26:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:43.520 08:26:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:43.520 08:26:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:43.520 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:43.520 08:26:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:43.520 08:26:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:43.520 [2024-07-23 08:26:55.953286] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:13:43.520 [2024-07-23 08:26:55.953370] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:43.779 [2024-07-23 08:26:56.079821] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:43.779 [2024-07-23 08:26:56.296098] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:44.346 [2024-07-23 08:26:56.586096] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:44.346 [2024-07-23 08:26:56.586124] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:44.346 08:26:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:44.346 08:26:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:13:44.346 08:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:44.605 [2024-07-23 08:26:56.907121] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:44.605 [2024-07-23 08:26:56.907164] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:44.605 [2024-07-23 08:26:56.907174] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:44.605 [2024-07-23 08:26:56.907186] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:44.605 [2024-07-23 08:26:56.907193] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:44.605 [2024-07-23 08:26:56.907202] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:44.605 08:26:56 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:44.605 08:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:44.605 08:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:44.605 08:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:44.605 08:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:44.605 08:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:44.605 08:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:44.605 08:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:44.605 08:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:44.605 08:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:44.605 08:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:44.605 08:26:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:44.605 08:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:44.605 "name": "Existed_Raid", 00:13:44.605 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:44.605 "strip_size_kb": 64, 00:13:44.605 "state": "configuring", 00:13:44.605 "raid_level": "raid0", 00:13:44.605 "superblock": false, 00:13:44.605 "num_base_bdevs": 3, 00:13:44.605 "num_base_bdevs_discovered": 0, 00:13:44.605 "num_base_bdevs_operational": 3, 00:13:44.605 "base_bdevs_list": [ 00:13:44.605 { 
00:13:44.605 "name": "BaseBdev1", 00:13:44.605 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:44.605 "is_configured": false, 00:13:44.605 "data_offset": 0, 00:13:44.605 "data_size": 0 00:13:44.605 }, 00:13:44.605 { 00:13:44.605 "name": "BaseBdev2", 00:13:44.605 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:44.605 "is_configured": false, 00:13:44.605 "data_offset": 0, 00:13:44.605 "data_size": 0 00:13:44.605 }, 00:13:44.605 { 00:13:44.605 "name": "BaseBdev3", 00:13:44.605 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:44.605 "is_configured": false, 00:13:44.605 "data_offset": 0, 00:13:44.605 "data_size": 0 00:13:44.605 } 00:13:44.605 ] 00:13:44.605 }' 00:13:44.605 08:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:44.605 08:26:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:45.172 08:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:45.431 [2024-07-23 08:26:57.749216] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:45.431 [2024-07-23 08:26:57.749249] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034580 name Existed_Raid, state configuring 00:13:45.431 08:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:45.431 [2024-07-23 08:26:57.925718] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:45.431 [2024-07-23 08:26:57.925756] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:45.431 [2024-07-23 08:26:57.925766] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev 
with name: BaseBdev2 00:13:45.431 [2024-07-23 08:26:57.925778] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:45.431 [2024-07-23 08:26:57.925785] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:45.431 [2024-07-23 08:26:57.925797] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:45.431 08:26:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:45.689 [2024-07-23 08:26:58.127567] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:45.689 BaseBdev1 00:13:45.689 08:26:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:45.689 08:26:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:45.689 08:26:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:45.689 08:26:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:45.689 08:26:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:45.689 08:26:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:45.689 08:26:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:45.948 08:26:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:46.207 [ 00:13:46.207 { 00:13:46.207 "name": "BaseBdev1", 00:13:46.207 "aliases": [ 00:13:46.207 
"e6b86829-e668-4677-a733-b995185c739a" 00:13:46.208 ], 00:13:46.208 "product_name": "Malloc disk", 00:13:46.208 "block_size": 512, 00:13:46.208 "num_blocks": 65536, 00:13:46.208 "uuid": "e6b86829-e668-4677-a733-b995185c739a", 00:13:46.208 "assigned_rate_limits": { 00:13:46.208 "rw_ios_per_sec": 0, 00:13:46.208 "rw_mbytes_per_sec": 0, 00:13:46.208 "r_mbytes_per_sec": 0, 00:13:46.208 "w_mbytes_per_sec": 0 00:13:46.208 }, 00:13:46.208 "claimed": true, 00:13:46.208 "claim_type": "exclusive_write", 00:13:46.208 "zoned": false, 00:13:46.208 "supported_io_types": { 00:13:46.208 "read": true, 00:13:46.208 "write": true, 00:13:46.208 "unmap": true, 00:13:46.208 "flush": true, 00:13:46.208 "reset": true, 00:13:46.208 "nvme_admin": false, 00:13:46.208 "nvme_io": false, 00:13:46.208 "nvme_io_md": false, 00:13:46.208 "write_zeroes": true, 00:13:46.208 "zcopy": true, 00:13:46.208 "get_zone_info": false, 00:13:46.208 "zone_management": false, 00:13:46.208 "zone_append": false, 00:13:46.208 "compare": false, 00:13:46.208 "compare_and_write": false, 00:13:46.208 "abort": true, 00:13:46.208 "seek_hole": false, 00:13:46.208 "seek_data": false, 00:13:46.208 "copy": true, 00:13:46.208 "nvme_iov_md": false 00:13:46.208 }, 00:13:46.208 "memory_domains": [ 00:13:46.208 { 00:13:46.208 "dma_device_id": "system", 00:13:46.208 "dma_device_type": 1 00:13:46.208 }, 00:13:46.208 { 00:13:46.208 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:46.208 "dma_device_type": 2 00:13:46.208 } 00:13:46.208 ], 00:13:46.208 "driver_specific": {} 00:13:46.208 } 00:13:46.208 ] 00:13:46.208 08:26:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:46.208 08:26:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:46.208 08:26:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:46.208 08:26:58 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:46.208 08:26:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:46.208 08:26:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:46.208 08:26:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:46.208 08:26:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:46.208 08:26:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:46.208 08:26:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:46.208 08:26:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:46.208 08:26:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:46.208 08:26:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:46.208 08:26:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:46.208 "name": "Existed_Raid", 00:13:46.208 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:46.208 "strip_size_kb": 64, 00:13:46.208 "state": "configuring", 00:13:46.208 "raid_level": "raid0", 00:13:46.208 "superblock": false, 00:13:46.208 "num_base_bdevs": 3, 00:13:46.208 "num_base_bdevs_discovered": 1, 00:13:46.208 "num_base_bdevs_operational": 3, 00:13:46.208 "base_bdevs_list": [ 00:13:46.208 { 00:13:46.208 "name": "BaseBdev1", 00:13:46.208 "uuid": "e6b86829-e668-4677-a733-b995185c739a", 00:13:46.208 "is_configured": true, 00:13:46.208 "data_offset": 0, 00:13:46.208 "data_size": 65536 00:13:46.208 }, 00:13:46.208 { 00:13:46.208 "name": "BaseBdev2", 00:13:46.208 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:13:46.208 "is_configured": false, 00:13:46.208 "data_offset": 0, 00:13:46.208 "data_size": 0 00:13:46.208 }, 00:13:46.208 { 00:13:46.208 "name": "BaseBdev3", 00:13:46.208 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:46.208 "is_configured": false, 00:13:46.208 "data_offset": 0, 00:13:46.208 "data_size": 0 00:13:46.208 } 00:13:46.208 ] 00:13:46.208 }' 00:13:46.208 08:26:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:46.208 08:26:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:46.777 08:26:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:47.036 [2024-07-23 08:26:59.302702] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:47.036 [2024-07-23 08:26:59.302750] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034880 name Existed_Raid, state configuring 00:13:47.036 08:26:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:47.036 [2024-07-23 08:26:59.475179] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:47.037 [2024-07-23 08:26:59.476811] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:47.037 [2024-07-23 08:26:59.476846] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:47.037 [2024-07-23 08:26:59.476855] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:47.037 [2024-07-23 08:26:59.476880] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:13:47.037 08:26:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:47.037 08:26:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:47.037 08:26:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:47.037 08:26:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:47.037 08:26:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:47.037 08:26:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:47.037 08:26:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:47.037 08:26:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:47.037 08:26:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:47.037 08:26:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:47.037 08:26:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:47.037 08:26:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:47.037 08:26:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:47.037 08:26:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:47.295 08:26:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:47.295 "name": "Existed_Raid", 00:13:47.295 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:47.295 "strip_size_kb": 64, 00:13:47.295 "state": "configuring", 00:13:47.295 
"raid_level": "raid0", 00:13:47.295 "superblock": false, 00:13:47.295 "num_base_bdevs": 3, 00:13:47.295 "num_base_bdevs_discovered": 1, 00:13:47.295 "num_base_bdevs_operational": 3, 00:13:47.295 "base_bdevs_list": [ 00:13:47.295 { 00:13:47.295 "name": "BaseBdev1", 00:13:47.295 "uuid": "e6b86829-e668-4677-a733-b995185c739a", 00:13:47.295 "is_configured": true, 00:13:47.295 "data_offset": 0, 00:13:47.295 "data_size": 65536 00:13:47.295 }, 00:13:47.295 { 00:13:47.295 "name": "BaseBdev2", 00:13:47.295 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:47.295 "is_configured": false, 00:13:47.295 "data_offset": 0, 00:13:47.295 "data_size": 0 00:13:47.295 }, 00:13:47.295 { 00:13:47.295 "name": "BaseBdev3", 00:13:47.295 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:47.295 "is_configured": false, 00:13:47.295 "data_offset": 0, 00:13:47.295 "data_size": 0 00:13:47.295 } 00:13:47.295 ] 00:13:47.295 }' 00:13:47.295 08:26:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:47.295 08:26:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:47.863 08:27:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:47.863 [2024-07-23 08:27:00.315809] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:47.863 BaseBdev2 00:13:47.863 08:27:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:47.863 08:27:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:47.863 08:27:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:47.863 08:27:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:47.863 08:27:00 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:47.863 08:27:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:47.863 08:27:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:48.122 08:27:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:48.380 [ 00:13:48.380 { 00:13:48.380 "name": "BaseBdev2", 00:13:48.380 "aliases": [ 00:13:48.380 "8954ac73-f99d-4324-b32f-ddc30cd6206b" 00:13:48.380 ], 00:13:48.380 "product_name": "Malloc disk", 00:13:48.380 "block_size": 512, 00:13:48.380 "num_blocks": 65536, 00:13:48.380 "uuid": "8954ac73-f99d-4324-b32f-ddc30cd6206b", 00:13:48.380 "assigned_rate_limits": { 00:13:48.380 "rw_ios_per_sec": 0, 00:13:48.380 "rw_mbytes_per_sec": 0, 00:13:48.380 "r_mbytes_per_sec": 0, 00:13:48.380 "w_mbytes_per_sec": 0 00:13:48.380 }, 00:13:48.380 "claimed": true, 00:13:48.380 "claim_type": "exclusive_write", 00:13:48.380 "zoned": false, 00:13:48.380 "supported_io_types": { 00:13:48.380 "read": true, 00:13:48.380 "write": true, 00:13:48.380 "unmap": true, 00:13:48.380 "flush": true, 00:13:48.380 "reset": true, 00:13:48.380 "nvme_admin": false, 00:13:48.380 "nvme_io": false, 00:13:48.380 "nvme_io_md": false, 00:13:48.380 "write_zeroes": true, 00:13:48.380 "zcopy": true, 00:13:48.380 "get_zone_info": false, 00:13:48.380 "zone_management": false, 00:13:48.380 "zone_append": false, 00:13:48.380 "compare": false, 00:13:48.380 "compare_and_write": false, 00:13:48.380 "abort": true, 00:13:48.380 "seek_hole": false, 00:13:48.380 "seek_data": false, 00:13:48.380 "copy": true, 00:13:48.380 "nvme_iov_md": false 00:13:48.380 }, 00:13:48.380 "memory_domains": [ 00:13:48.380 { 00:13:48.380 "dma_device_id": "system", 
00:13:48.380 "dma_device_type": 1 00:13:48.380 }, 00:13:48.380 { 00:13:48.380 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:48.380 "dma_device_type": 2 00:13:48.380 } 00:13:48.380 ], 00:13:48.380 "driver_specific": {} 00:13:48.380 } 00:13:48.380 ] 00:13:48.380 08:27:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:48.380 08:27:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:48.380 08:27:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:48.380 08:27:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:48.380 08:27:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:48.380 08:27:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:48.380 08:27:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:48.380 08:27:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:48.380 08:27:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:48.380 08:27:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:48.380 08:27:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:48.380 08:27:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:48.380 08:27:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:48.380 08:27:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:48.380 08:27:00 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:48.380 08:27:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:48.380 "name": "Existed_Raid", 00:13:48.380 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:48.380 "strip_size_kb": 64, 00:13:48.380 "state": "configuring", 00:13:48.380 "raid_level": "raid0", 00:13:48.380 "superblock": false, 00:13:48.380 "num_base_bdevs": 3, 00:13:48.380 "num_base_bdevs_discovered": 2, 00:13:48.380 "num_base_bdevs_operational": 3, 00:13:48.380 "base_bdevs_list": [ 00:13:48.380 { 00:13:48.380 "name": "BaseBdev1", 00:13:48.380 "uuid": "e6b86829-e668-4677-a733-b995185c739a", 00:13:48.380 "is_configured": true, 00:13:48.380 "data_offset": 0, 00:13:48.380 "data_size": 65536 00:13:48.380 }, 00:13:48.380 { 00:13:48.380 "name": "BaseBdev2", 00:13:48.380 "uuid": "8954ac73-f99d-4324-b32f-ddc30cd6206b", 00:13:48.380 "is_configured": true, 00:13:48.380 "data_offset": 0, 00:13:48.380 "data_size": 65536 00:13:48.380 }, 00:13:48.380 { 00:13:48.380 "name": "BaseBdev3", 00:13:48.380 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:48.380 "is_configured": false, 00:13:48.380 "data_offset": 0, 00:13:48.380 "data_size": 0 00:13:48.380 } 00:13:48.380 ] 00:13:48.380 }' 00:13:48.381 08:27:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:48.381 08:27:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:48.948 08:27:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:48.948 [2024-07-23 08:27:01.464897] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:48.948 [2024-07-23 08:27:01.464936] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000035180 00:13:48.948 [2024-07-23 08:27:01.464946] 
bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:13:48.948 [2024-07-23 08:27:01.465167] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bec0 00:13:48.948 [2024-07-23 08:27:01.465349] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000035180 00:13:48.948 [2024-07-23 08:27:01.465360] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000035180 00:13:48.948 [2024-07-23 08:27:01.465601] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:48.948 BaseBdev3 00:13:49.207 08:27:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:13:49.207 08:27:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:49.207 08:27:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:49.207 08:27:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:49.207 08:27:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:49.207 08:27:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:49.207 08:27:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:49.207 08:27:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:49.465 [ 00:13:49.465 { 00:13:49.465 "name": "BaseBdev3", 00:13:49.465 "aliases": [ 00:13:49.465 "b6d5de76-0de5-4d6a-a992-ef2618ffc967" 00:13:49.465 ], 00:13:49.465 "product_name": "Malloc disk", 00:13:49.465 "block_size": 512, 00:13:49.465 "num_blocks": 65536, 
00:13:49.465 "uuid": "b6d5de76-0de5-4d6a-a992-ef2618ffc967", 00:13:49.465 "assigned_rate_limits": { 00:13:49.465 "rw_ios_per_sec": 0, 00:13:49.465 "rw_mbytes_per_sec": 0, 00:13:49.465 "r_mbytes_per_sec": 0, 00:13:49.465 "w_mbytes_per_sec": 0 00:13:49.465 }, 00:13:49.465 "claimed": true, 00:13:49.465 "claim_type": "exclusive_write", 00:13:49.465 "zoned": false, 00:13:49.466 "supported_io_types": { 00:13:49.466 "read": true, 00:13:49.466 "write": true, 00:13:49.466 "unmap": true, 00:13:49.466 "flush": true, 00:13:49.466 "reset": true, 00:13:49.466 "nvme_admin": false, 00:13:49.466 "nvme_io": false, 00:13:49.466 "nvme_io_md": false, 00:13:49.466 "write_zeroes": true, 00:13:49.466 "zcopy": true, 00:13:49.466 "get_zone_info": false, 00:13:49.466 "zone_management": false, 00:13:49.466 "zone_append": false, 00:13:49.466 "compare": false, 00:13:49.466 "compare_and_write": false, 00:13:49.466 "abort": true, 00:13:49.466 "seek_hole": false, 00:13:49.466 "seek_data": false, 00:13:49.466 "copy": true, 00:13:49.466 "nvme_iov_md": false 00:13:49.466 }, 00:13:49.466 "memory_domains": [ 00:13:49.466 { 00:13:49.466 "dma_device_id": "system", 00:13:49.466 "dma_device_type": 1 00:13:49.466 }, 00:13:49.466 { 00:13:49.466 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:49.466 "dma_device_type": 2 00:13:49.466 } 00:13:49.466 ], 00:13:49.466 "driver_specific": {} 00:13:49.466 } 00:13:49.466 ] 00:13:49.466 08:27:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:49.466 08:27:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:49.466 08:27:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:49.466 08:27:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:13:49.466 08:27:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:49.466 08:27:01 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:49.466 08:27:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:49.466 08:27:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:49.466 08:27:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:49.466 08:27:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:49.466 08:27:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:49.466 08:27:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:49.466 08:27:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:49.466 08:27:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:49.466 08:27:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:49.724 08:27:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:49.724 "name": "Existed_Raid", 00:13:49.724 "uuid": "b8fc1536-09db-470d-95cc-603ef2bc772a", 00:13:49.724 "strip_size_kb": 64, 00:13:49.724 "state": "online", 00:13:49.724 "raid_level": "raid0", 00:13:49.724 "superblock": false, 00:13:49.724 "num_base_bdevs": 3, 00:13:49.724 "num_base_bdevs_discovered": 3, 00:13:49.724 "num_base_bdevs_operational": 3, 00:13:49.724 "base_bdevs_list": [ 00:13:49.724 { 00:13:49.724 "name": "BaseBdev1", 00:13:49.724 "uuid": "e6b86829-e668-4677-a733-b995185c739a", 00:13:49.724 "is_configured": true, 00:13:49.724 "data_offset": 0, 00:13:49.724 "data_size": 65536 00:13:49.724 }, 00:13:49.724 { 00:13:49.724 "name": "BaseBdev2", 00:13:49.724 "uuid": 
"8954ac73-f99d-4324-b32f-ddc30cd6206b", 00:13:49.724 "is_configured": true, 00:13:49.724 "data_offset": 0, 00:13:49.724 "data_size": 65536 00:13:49.724 }, 00:13:49.724 { 00:13:49.724 "name": "BaseBdev3", 00:13:49.724 "uuid": "b6d5de76-0de5-4d6a-a992-ef2618ffc967", 00:13:49.724 "is_configured": true, 00:13:49.724 "data_offset": 0, 00:13:49.724 "data_size": 65536 00:13:49.724 } 00:13:49.724 ] 00:13:49.724 }' 00:13:49.724 08:27:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:49.724 08:27:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:49.988 08:27:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:49.988 08:27:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:49.988 08:27:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:49.988 08:27:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:49.988 08:27:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:49.988 08:27:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:49.988 08:27:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:49.988 08:27:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:50.288 [2024-07-23 08:27:02.632270] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:50.288 08:27:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:50.288 "name": "Existed_Raid", 00:13:50.288 "aliases": [ 00:13:50.288 "b8fc1536-09db-470d-95cc-603ef2bc772a" 00:13:50.288 ], 00:13:50.288 "product_name": "Raid Volume", 
00:13:50.288 "block_size": 512, 00:13:50.288 "num_blocks": 196608, 00:13:50.288 "uuid": "b8fc1536-09db-470d-95cc-603ef2bc772a", 00:13:50.288 "assigned_rate_limits": { 00:13:50.288 "rw_ios_per_sec": 0, 00:13:50.288 "rw_mbytes_per_sec": 0, 00:13:50.288 "r_mbytes_per_sec": 0, 00:13:50.288 "w_mbytes_per_sec": 0 00:13:50.288 }, 00:13:50.288 "claimed": false, 00:13:50.288 "zoned": false, 00:13:50.288 "supported_io_types": { 00:13:50.288 "read": true, 00:13:50.288 "write": true, 00:13:50.288 "unmap": true, 00:13:50.288 "flush": true, 00:13:50.288 "reset": true, 00:13:50.288 "nvme_admin": false, 00:13:50.288 "nvme_io": false, 00:13:50.288 "nvme_io_md": false, 00:13:50.288 "write_zeroes": true, 00:13:50.288 "zcopy": false, 00:13:50.288 "get_zone_info": false, 00:13:50.288 "zone_management": false, 00:13:50.288 "zone_append": false, 00:13:50.288 "compare": false, 00:13:50.288 "compare_and_write": false, 00:13:50.288 "abort": false, 00:13:50.288 "seek_hole": false, 00:13:50.288 "seek_data": false, 00:13:50.288 "copy": false, 00:13:50.288 "nvme_iov_md": false 00:13:50.288 }, 00:13:50.288 "memory_domains": [ 00:13:50.288 { 00:13:50.288 "dma_device_id": "system", 00:13:50.288 "dma_device_type": 1 00:13:50.288 }, 00:13:50.288 { 00:13:50.288 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:50.288 "dma_device_type": 2 00:13:50.288 }, 00:13:50.288 { 00:13:50.288 "dma_device_id": "system", 00:13:50.288 "dma_device_type": 1 00:13:50.288 }, 00:13:50.288 { 00:13:50.288 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:50.288 "dma_device_type": 2 00:13:50.288 }, 00:13:50.288 { 00:13:50.288 "dma_device_id": "system", 00:13:50.288 "dma_device_type": 1 00:13:50.288 }, 00:13:50.288 { 00:13:50.288 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:50.288 "dma_device_type": 2 00:13:50.288 } 00:13:50.288 ], 00:13:50.288 "driver_specific": { 00:13:50.288 "raid": { 00:13:50.288 "uuid": "b8fc1536-09db-470d-95cc-603ef2bc772a", 00:13:50.288 "strip_size_kb": 64, 00:13:50.288 "state": "online", 00:13:50.288 
"raid_level": "raid0", 00:13:50.288 "superblock": false, 00:13:50.288 "num_base_bdevs": 3, 00:13:50.288 "num_base_bdevs_discovered": 3, 00:13:50.288 "num_base_bdevs_operational": 3, 00:13:50.288 "base_bdevs_list": [ 00:13:50.288 { 00:13:50.288 "name": "BaseBdev1", 00:13:50.288 "uuid": "e6b86829-e668-4677-a733-b995185c739a", 00:13:50.288 "is_configured": true, 00:13:50.288 "data_offset": 0, 00:13:50.288 "data_size": 65536 00:13:50.288 }, 00:13:50.288 { 00:13:50.288 "name": "BaseBdev2", 00:13:50.288 "uuid": "8954ac73-f99d-4324-b32f-ddc30cd6206b", 00:13:50.288 "is_configured": true, 00:13:50.288 "data_offset": 0, 00:13:50.288 "data_size": 65536 00:13:50.288 }, 00:13:50.288 { 00:13:50.288 "name": "BaseBdev3", 00:13:50.288 "uuid": "b6d5de76-0de5-4d6a-a992-ef2618ffc967", 00:13:50.288 "is_configured": true, 00:13:50.288 "data_offset": 0, 00:13:50.288 "data_size": 65536 00:13:50.288 } 00:13:50.288 ] 00:13:50.288 } 00:13:50.288 } 00:13:50.288 }' 00:13:50.288 08:27:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:50.288 08:27:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:50.288 BaseBdev2 00:13:50.288 BaseBdev3' 00:13:50.288 08:27:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:50.288 08:27:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:50.288 08:27:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:50.571 08:27:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:50.571 "name": "BaseBdev1", 00:13:50.571 "aliases": [ 00:13:50.571 "e6b86829-e668-4677-a733-b995185c739a" 00:13:50.571 ], 00:13:50.571 "product_name": "Malloc disk", 00:13:50.571 
"block_size": 512, 00:13:50.571 "num_blocks": 65536, 00:13:50.571 "uuid": "e6b86829-e668-4677-a733-b995185c739a", 00:13:50.571 "assigned_rate_limits": { 00:13:50.571 "rw_ios_per_sec": 0, 00:13:50.571 "rw_mbytes_per_sec": 0, 00:13:50.571 "r_mbytes_per_sec": 0, 00:13:50.571 "w_mbytes_per_sec": 0 00:13:50.571 }, 00:13:50.571 "claimed": true, 00:13:50.571 "claim_type": "exclusive_write", 00:13:50.571 "zoned": false, 00:13:50.571 "supported_io_types": { 00:13:50.571 "read": true, 00:13:50.571 "write": true, 00:13:50.571 "unmap": true, 00:13:50.571 "flush": true, 00:13:50.571 "reset": true, 00:13:50.571 "nvme_admin": false, 00:13:50.571 "nvme_io": false, 00:13:50.571 "nvme_io_md": false, 00:13:50.571 "write_zeroes": true, 00:13:50.571 "zcopy": true, 00:13:50.571 "get_zone_info": false, 00:13:50.571 "zone_management": false, 00:13:50.571 "zone_append": false, 00:13:50.571 "compare": false, 00:13:50.571 "compare_and_write": false, 00:13:50.571 "abort": true, 00:13:50.571 "seek_hole": false, 00:13:50.571 "seek_data": false, 00:13:50.571 "copy": true, 00:13:50.571 "nvme_iov_md": false 00:13:50.571 }, 00:13:50.571 "memory_domains": [ 00:13:50.571 { 00:13:50.571 "dma_device_id": "system", 00:13:50.571 "dma_device_type": 1 00:13:50.571 }, 00:13:50.571 { 00:13:50.571 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:50.571 "dma_device_type": 2 00:13:50.571 } 00:13:50.571 ], 00:13:50.571 "driver_specific": {} 00:13:50.571 }' 00:13:50.571 08:27:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:50.571 08:27:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:50.571 08:27:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:50.571 08:27:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:50.571 08:27:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:50.571 08:27:03 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:50.571 08:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:50.571 08:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:50.830 08:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:50.830 08:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:50.830 08:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:50.830 08:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:50.830 08:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:50.830 08:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:50.830 08:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:50.830 08:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:50.830 "name": "BaseBdev2", 00:13:50.830 "aliases": [ 00:13:50.830 "8954ac73-f99d-4324-b32f-ddc30cd6206b" 00:13:50.830 ], 00:13:50.830 "product_name": "Malloc disk", 00:13:50.830 "block_size": 512, 00:13:50.830 "num_blocks": 65536, 00:13:50.830 "uuid": "8954ac73-f99d-4324-b32f-ddc30cd6206b", 00:13:50.830 "assigned_rate_limits": { 00:13:50.830 "rw_ios_per_sec": 0, 00:13:50.830 "rw_mbytes_per_sec": 0, 00:13:50.830 "r_mbytes_per_sec": 0, 00:13:50.830 "w_mbytes_per_sec": 0 00:13:50.830 }, 00:13:50.830 "claimed": true, 00:13:50.830 "claim_type": "exclusive_write", 00:13:50.830 "zoned": false, 00:13:50.830 "supported_io_types": { 00:13:50.830 "read": true, 00:13:50.830 "write": true, 00:13:50.830 "unmap": true, 00:13:50.830 "flush": true, 00:13:50.830 "reset": true, 00:13:50.830 "nvme_admin": 
false, 00:13:50.830 "nvme_io": false, 00:13:50.830 "nvme_io_md": false, 00:13:50.830 "write_zeroes": true, 00:13:50.830 "zcopy": true, 00:13:50.830 "get_zone_info": false, 00:13:50.830 "zone_management": false, 00:13:50.830 "zone_append": false, 00:13:50.830 "compare": false, 00:13:50.830 "compare_and_write": false, 00:13:50.830 "abort": true, 00:13:50.830 "seek_hole": false, 00:13:50.830 "seek_data": false, 00:13:50.830 "copy": true, 00:13:50.830 "nvme_iov_md": false 00:13:50.830 }, 00:13:50.830 "memory_domains": [ 00:13:50.830 { 00:13:50.830 "dma_device_id": "system", 00:13:50.830 "dma_device_type": 1 00:13:50.830 }, 00:13:50.830 { 00:13:50.830 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:50.830 "dma_device_type": 2 00:13:50.830 } 00:13:50.830 ], 00:13:50.830 "driver_specific": {} 00:13:50.830 }' 00:13:50.830 08:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:51.089 08:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:51.089 08:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:51.089 08:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:51.089 08:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:51.089 08:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:51.089 08:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:51.089 08:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:51.089 08:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:51.089 08:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:51.089 08:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:51.349 08:27:03 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:51.350 08:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:51.350 08:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:51.350 08:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:51.350 08:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:51.350 "name": "BaseBdev3", 00:13:51.350 "aliases": [ 00:13:51.350 "b6d5de76-0de5-4d6a-a992-ef2618ffc967" 00:13:51.350 ], 00:13:51.350 "product_name": "Malloc disk", 00:13:51.350 "block_size": 512, 00:13:51.350 "num_blocks": 65536, 00:13:51.350 "uuid": "b6d5de76-0de5-4d6a-a992-ef2618ffc967", 00:13:51.350 "assigned_rate_limits": { 00:13:51.350 "rw_ios_per_sec": 0, 00:13:51.350 "rw_mbytes_per_sec": 0, 00:13:51.350 "r_mbytes_per_sec": 0, 00:13:51.350 "w_mbytes_per_sec": 0 00:13:51.350 }, 00:13:51.350 "claimed": true, 00:13:51.350 "claim_type": "exclusive_write", 00:13:51.350 "zoned": false, 00:13:51.350 "supported_io_types": { 00:13:51.350 "read": true, 00:13:51.350 "write": true, 00:13:51.350 "unmap": true, 00:13:51.350 "flush": true, 00:13:51.350 "reset": true, 00:13:51.350 "nvme_admin": false, 00:13:51.350 "nvme_io": false, 00:13:51.350 "nvme_io_md": false, 00:13:51.350 "write_zeroes": true, 00:13:51.350 "zcopy": true, 00:13:51.350 "get_zone_info": false, 00:13:51.350 "zone_management": false, 00:13:51.350 "zone_append": false, 00:13:51.350 "compare": false, 00:13:51.350 "compare_and_write": false, 00:13:51.350 "abort": true, 00:13:51.350 "seek_hole": false, 00:13:51.350 "seek_data": false, 00:13:51.350 "copy": true, 00:13:51.350 "nvme_iov_md": false 00:13:51.350 }, 00:13:51.350 "memory_domains": [ 00:13:51.350 { 00:13:51.350 "dma_device_id": "system", 00:13:51.350 "dma_device_type": 1 00:13:51.350 
}, 00:13:51.350 { 00:13:51.350 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:51.350 "dma_device_type": 2 00:13:51.350 } 00:13:51.350 ], 00:13:51.350 "driver_specific": {} 00:13:51.350 }' 00:13:51.350 08:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:51.350 08:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:51.609 08:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:51.609 08:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:51.609 08:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:51.609 08:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:51.609 08:27:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:51.609 08:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:51.609 08:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:51.609 08:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:51.609 08:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:51.868 08:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:51.868 08:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:51.868 [2024-07-23 08:27:04.284596] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:51.868 [2024-07-23 08:27:04.284632] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:51.868 [2024-07-23 08:27:04.284684] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:51.868 
08:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:51.868 08:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:13:51.868 08:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:51.868 08:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:51.868 08:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:13:51.868 08:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:13:51.868 08:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:51.868 08:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:13:51.868 08:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:51.868 08:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:51.868 08:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:51.868 08:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:51.868 08:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:51.868 08:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:51.868 08:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:51.868 08:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:51.868 08:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:13:52.127 08:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:52.127 "name": "Existed_Raid", 00:13:52.127 "uuid": "b8fc1536-09db-470d-95cc-603ef2bc772a", 00:13:52.127 "strip_size_kb": 64, 00:13:52.127 "state": "offline", 00:13:52.127 "raid_level": "raid0", 00:13:52.127 "superblock": false, 00:13:52.127 "num_base_bdevs": 3, 00:13:52.127 "num_base_bdevs_discovered": 2, 00:13:52.127 "num_base_bdevs_operational": 2, 00:13:52.127 "base_bdevs_list": [ 00:13:52.127 { 00:13:52.127 "name": null, 00:13:52.127 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:52.127 "is_configured": false, 00:13:52.127 "data_offset": 0, 00:13:52.127 "data_size": 65536 00:13:52.127 }, 00:13:52.127 { 00:13:52.127 "name": "BaseBdev2", 00:13:52.127 "uuid": "8954ac73-f99d-4324-b32f-ddc30cd6206b", 00:13:52.127 "is_configured": true, 00:13:52.127 "data_offset": 0, 00:13:52.127 "data_size": 65536 00:13:52.127 }, 00:13:52.127 { 00:13:52.127 "name": "BaseBdev3", 00:13:52.127 "uuid": "b6d5de76-0de5-4d6a-a992-ef2618ffc967", 00:13:52.127 "is_configured": true, 00:13:52.127 "data_offset": 0, 00:13:52.127 "data_size": 65536 00:13:52.127 } 00:13:52.127 ] 00:13:52.127 }' 00:13:52.127 08:27:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:52.127 08:27:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:52.695 08:27:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:52.695 08:27:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:52.695 08:27:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:52.695 08:27:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:52.695 08:27:05 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:52.695 08:27:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:52.695 08:27:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:52.954 [2024-07-23 08:27:05.324690] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:52.954 08:27:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:52.954 08:27:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:52.954 08:27:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:52.954 08:27:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:53.213 08:27:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:53.213 08:27:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:53.213 08:27:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:13:53.471 [2024-07-23 08:27:05.736021] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:53.471 [2024-07-23 08:27:05.736067] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000035180 name Existed_Raid, state offline 00:13:53.471 08:27:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:53.471 08:27:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:53.471 08:27:05 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:53.471 08:27:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:53.729 08:27:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:53.729 08:27:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:53.729 08:27:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:13:53.729 08:27:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:13:53.729 08:27:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:53.729 08:27:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:53.729 BaseBdev2 00:13:53.729 08:27:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:13:53.729 08:27:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:53.729 08:27:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:53.729 08:27:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:53.729 08:27:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:53.729 08:27:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:53.729 08:27:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:53.988 08:27:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:54.247 [ 00:13:54.247 { 00:13:54.247 "name": "BaseBdev2", 00:13:54.247 "aliases": [ 00:13:54.247 "ef2beaa4-0e0f-4dca-b95f-1401a2defab5" 00:13:54.247 ], 00:13:54.247 "product_name": "Malloc disk", 00:13:54.247 "block_size": 512, 00:13:54.247 "num_blocks": 65536, 00:13:54.247 "uuid": "ef2beaa4-0e0f-4dca-b95f-1401a2defab5", 00:13:54.247 "assigned_rate_limits": { 00:13:54.247 "rw_ios_per_sec": 0, 00:13:54.247 "rw_mbytes_per_sec": 0, 00:13:54.247 "r_mbytes_per_sec": 0, 00:13:54.247 "w_mbytes_per_sec": 0 00:13:54.247 }, 00:13:54.247 "claimed": false, 00:13:54.247 "zoned": false, 00:13:54.247 "supported_io_types": { 00:13:54.247 "read": true, 00:13:54.247 "write": true, 00:13:54.247 "unmap": true, 00:13:54.247 "flush": true, 00:13:54.247 "reset": true, 00:13:54.247 "nvme_admin": false, 00:13:54.247 "nvme_io": false, 00:13:54.247 "nvme_io_md": false, 00:13:54.247 "write_zeroes": true, 00:13:54.247 "zcopy": true, 00:13:54.247 "get_zone_info": false, 00:13:54.247 "zone_management": false, 00:13:54.247 "zone_append": false, 00:13:54.247 "compare": false, 00:13:54.247 "compare_and_write": false, 00:13:54.247 "abort": true, 00:13:54.247 "seek_hole": false, 00:13:54.247 "seek_data": false, 00:13:54.247 "copy": true, 00:13:54.247 "nvme_iov_md": false 00:13:54.247 }, 00:13:54.247 "memory_domains": [ 00:13:54.247 { 00:13:54.247 "dma_device_id": "system", 00:13:54.247 "dma_device_type": 1 00:13:54.247 }, 00:13:54.247 { 00:13:54.247 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:54.247 "dma_device_type": 2 00:13:54.247 } 00:13:54.247 ], 00:13:54.247 "driver_specific": {} 00:13:54.247 } 00:13:54.247 ] 00:13:54.247 08:27:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:54.247 08:27:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:54.247 08:27:06 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:54.247 08:27:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:54.506 BaseBdev3 00:13:54.506 08:27:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:13:54.506 08:27:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:54.506 08:27:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:54.506 08:27:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:54.506 08:27:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:54.506 08:27:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:54.506 08:27:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:54.506 08:27:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:54.769 [ 00:13:54.769 { 00:13:54.769 "name": "BaseBdev3", 00:13:54.769 "aliases": [ 00:13:54.769 "6be2326c-ace2-401b-95ec-6d2296082346" 00:13:54.769 ], 00:13:54.769 "product_name": "Malloc disk", 00:13:54.769 "block_size": 512, 00:13:54.769 "num_blocks": 65536, 00:13:54.769 "uuid": "6be2326c-ace2-401b-95ec-6d2296082346", 00:13:54.769 "assigned_rate_limits": { 00:13:54.769 "rw_ios_per_sec": 0, 00:13:54.769 "rw_mbytes_per_sec": 0, 00:13:54.769 "r_mbytes_per_sec": 0, 00:13:54.769 "w_mbytes_per_sec": 0 00:13:54.769 }, 00:13:54.769 "claimed": false, 00:13:54.769 "zoned": false, 00:13:54.769 
"supported_io_types": { 00:13:54.769 "read": true, 00:13:54.769 "write": true, 00:13:54.769 "unmap": true, 00:13:54.769 "flush": true, 00:13:54.769 "reset": true, 00:13:54.769 "nvme_admin": false, 00:13:54.769 "nvme_io": false, 00:13:54.769 "nvme_io_md": false, 00:13:54.769 "write_zeroes": true, 00:13:54.769 "zcopy": true, 00:13:54.769 "get_zone_info": false, 00:13:54.769 "zone_management": false, 00:13:54.769 "zone_append": false, 00:13:54.769 "compare": false, 00:13:54.769 "compare_and_write": false, 00:13:54.769 "abort": true, 00:13:54.769 "seek_hole": false, 00:13:54.769 "seek_data": false, 00:13:54.769 "copy": true, 00:13:54.769 "nvme_iov_md": false 00:13:54.769 }, 00:13:54.769 "memory_domains": [ 00:13:54.769 { 00:13:54.769 "dma_device_id": "system", 00:13:54.769 "dma_device_type": 1 00:13:54.769 }, 00:13:54.769 { 00:13:54.769 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:54.769 "dma_device_type": 2 00:13:54.769 } 00:13:54.769 ], 00:13:54.769 "driver_specific": {} 00:13:54.769 } 00:13:54.769 ] 00:13:54.769 08:27:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:54.769 08:27:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:54.769 08:27:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:54.769 08:27:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:54.770 [2024-07-23 08:27:07.273802] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:54.770 [2024-07-23 08:27:07.273841] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:54.770 [2024-07-23 08:27:07.273864] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:54.770 
[2024-07-23 08:27:07.275463] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:55.028 08:27:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:55.028 08:27:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:55.028 08:27:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:55.028 08:27:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:55.028 08:27:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:55.028 08:27:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:55.028 08:27:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:55.028 08:27:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:55.028 08:27:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:55.028 08:27:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:55.028 08:27:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:55.028 08:27:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:55.028 08:27:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:55.028 "name": "Existed_Raid", 00:13:55.028 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:55.028 "strip_size_kb": 64, 00:13:55.028 "state": "configuring", 00:13:55.028 "raid_level": "raid0", 00:13:55.028 "superblock": false, 00:13:55.028 "num_base_bdevs": 3, 00:13:55.028 
"num_base_bdevs_discovered": 2, 00:13:55.028 "num_base_bdevs_operational": 3, 00:13:55.028 "base_bdevs_list": [ 00:13:55.028 { 00:13:55.028 "name": "BaseBdev1", 00:13:55.028 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:55.028 "is_configured": false, 00:13:55.028 "data_offset": 0, 00:13:55.028 "data_size": 0 00:13:55.028 }, 00:13:55.028 { 00:13:55.028 "name": "BaseBdev2", 00:13:55.028 "uuid": "ef2beaa4-0e0f-4dca-b95f-1401a2defab5", 00:13:55.028 "is_configured": true, 00:13:55.028 "data_offset": 0, 00:13:55.028 "data_size": 65536 00:13:55.028 }, 00:13:55.028 { 00:13:55.028 "name": "BaseBdev3", 00:13:55.028 "uuid": "6be2326c-ace2-401b-95ec-6d2296082346", 00:13:55.028 "is_configured": true, 00:13:55.028 "data_offset": 0, 00:13:55.028 "data_size": 65536 00:13:55.028 } 00:13:55.028 ] 00:13:55.028 }' 00:13:55.028 08:27:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:55.028 08:27:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:55.594 08:27:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:13:55.594 [2024-07-23 08:27:08.112055] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:55.854 08:27:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:55.854 08:27:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:55.854 08:27:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:55.854 08:27:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:55.854 08:27:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:55.854 08:27:08 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:55.854 08:27:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:55.854 08:27:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:55.854 08:27:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:55.854 08:27:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:55.854 08:27:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:55.854 08:27:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:55.854 08:27:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:55.854 "name": "Existed_Raid", 00:13:55.854 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:55.854 "strip_size_kb": 64, 00:13:55.854 "state": "configuring", 00:13:55.854 "raid_level": "raid0", 00:13:55.854 "superblock": false, 00:13:55.854 "num_base_bdevs": 3, 00:13:55.854 "num_base_bdevs_discovered": 1, 00:13:55.854 "num_base_bdevs_operational": 3, 00:13:55.854 "base_bdevs_list": [ 00:13:55.854 { 00:13:55.854 "name": "BaseBdev1", 00:13:55.854 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:55.854 "is_configured": false, 00:13:55.854 "data_offset": 0, 00:13:55.854 "data_size": 0 00:13:55.854 }, 00:13:55.854 { 00:13:55.854 "name": null, 00:13:55.854 "uuid": "ef2beaa4-0e0f-4dca-b95f-1401a2defab5", 00:13:55.854 "is_configured": false, 00:13:55.854 "data_offset": 0, 00:13:55.854 "data_size": 65536 00:13:55.854 }, 00:13:55.854 { 00:13:55.854 "name": "BaseBdev3", 00:13:55.854 "uuid": "6be2326c-ace2-401b-95ec-6d2296082346", 00:13:55.854 "is_configured": true, 00:13:55.854 "data_offset": 0, 00:13:55.854 "data_size": 65536 00:13:55.854 } 
00:13:55.854 ] 00:13:55.854 }' 00:13:55.854 08:27:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:55.854 08:27:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:56.420 08:27:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:56.420 08:27:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:56.420 08:27:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:13:56.420 08:27:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:56.679 [2024-07-23 08:27:09.119592] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:56.679 BaseBdev1 00:13:56.679 08:27:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:13:56.679 08:27:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:56.679 08:27:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:56.679 08:27:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:56.679 08:27:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:56.679 08:27:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:56.679 08:27:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:56.938 08:27:09 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:56.938 [ 00:13:56.938 { 00:13:56.938 "name": "BaseBdev1", 00:13:56.938 "aliases": [ 00:13:56.938 "f482061a-2861-4fa4-b355-3a32cdd3211f" 00:13:56.938 ], 00:13:56.938 "product_name": "Malloc disk", 00:13:56.938 "block_size": 512, 00:13:56.938 "num_blocks": 65536, 00:13:56.938 "uuid": "f482061a-2861-4fa4-b355-3a32cdd3211f", 00:13:56.938 "assigned_rate_limits": { 00:13:56.938 "rw_ios_per_sec": 0, 00:13:56.938 "rw_mbytes_per_sec": 0, 00:13:56.938 "r_mbytes_per_sec": 0, 00:13:56.938 "w_mbytes_per_sec": 0 00:13:56.938 }, 00:13:56.938 "claimed": true, 00:13:56.938 "claim_type": "exclusive_write", 00:13:56.938 "zoned": false, 00:13:56.938 "supported_io_types": { 00:13:56.938 "read": true, 00:13:56.938 "write": true, 00:13:56.938 "unmap": true, 00:13:56.938 "flush": true, 00:13:56.938 "reset": true, 00:13:56.938 "nvme_admin": false, 00:13:56.938 "nvme_io": false, 00:13:56.938 "nvme_io_md": false, 00:13:56.938 "write_zeroes": true, 00:13:56.938 "zcopy": true, 00:13:56.938 "get_zone_info": false, 00:13:56.938 "zone_management": false, 00:13:56.938 "zone_append": false, 00:13:56.938 "compare": false, 00:13:56.938 "compare_and_write": false, 00:13:56.938 "abort": true, 00:13:56.938 "seek_hole": false, 00:13:56.938 "seek_data": false, 00:13:56.938 "copy": true, 00:13:56.938 "nvme_iov_md": false 00:13:56.938 }, 00:13:56.938 "memory_domains": [ 00:13:56.938 { 00:13:56.938 "dma_device_id": "system", 00:13:56.938 "dma_device_type": 1 00:13:56.938 }, 00:13:56.938 { 00:13:56.938 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:56.938 "dma_device_type": 2 00:13:56.938 } 00:13:56.938 ], 00:13:56.938 "driver_specific": {} 00:13:56.938 } 00:13:56.938 ] 00:13:57.197 08:27:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:57.197 08:27:09 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:57.197 08:27:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:57.197 08:27:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:57.197 08:27:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:57.197 08:27:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:57.197 08:27:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:57.197 08:27:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:57.197 08:27:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:57.197 08:27:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:57.197 08:27:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:57.197 08:27:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:57.197 08:27:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:57.197 08:27:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:57.197 "name": "Existed_Raid", 00:13:57.197 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:57.197 "strip_size_kb": 64, 00:13:57.197 "state": "configuring", 00:13:57.197 "raid_level": "raid0", 00:13:57.197 "superblock": false, 00:13:57.197 "num_base_bdevs": 3, 00:13:57.197 "num_base_bdevs_discovered": 2, 00:13:57.197 "num_base_bdevs_operational": 3, 00:13:57.197 "base_bdevs_list": [ 00:13:57.197 { 00:13:57.197 "name": "BaseBdev1", 00:13:57.197 
"uuid": "f482061a-2861-4fa4-b355-3a32cdd3211f", 00:13:57.197 "is_configured": true, 00:13:57.197 "data_offset": 0, 00:13:57.197 "data_size": 65536 00:13:57.197 }, 00:13:57.197 { 00:13:57.197 "name": null, 00:13:57.197 "uuid": "ef2beaa4-0e0f-4dca-b95f-1401a2defab5", 00:13:57.197 "is_configured": false, 00:13:57.197 "data_offset": 0, 00:13:57.197 "data_size": 65536 00:13:57.197 }, 00:13:57.197 { 00:13:57.197 "name": "BaseBdev3", 00:13:57.197 "uuid": "6be2326c-ace2-401b-95ec-6d2296082346", 00:13:57.197 "is_configured": true, 00:13:57.197 "data_offset": 0, 00:13:57.197 "data_size": 65536 00:13:57.197 } 00:13:57.197 ] 00:13:57.197 }' 00:13:57.197 08:27:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:57.197 08:27:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:57.764 08:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:57.764 08:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:58.023 08:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:13:58.023 08:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:13:58.023 [2024-07-23 08:27:10.463243] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:58.023 08:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:58.023 08:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:58.023 08:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 
00:13:58.023 08:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:58.023 08:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:58.023 08:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:58.023 08:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:58.023 08:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:58.023 08:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:58.023 08:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:58.023 08:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:58.023 08:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:58.282 08:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:58.282 "name": "Existed_Raid", 00:13:58.282 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:58.282 "strip_size_kb": 64, 00:13:58.282 "state": "configuring", 00:13:58.282 "raid_level": "raid0", 00:13:58.282 "superblock": false, 00:13:58.282 "num_base_bdevs": 3, 00:13:58.282 "num_base_bdevs_discovered": 1, 00:13:58.282 "num_base_bdevs_operational": 3, 00:13:58.282 "base_bdevs_list": [ 00:13:58.282 { 00:13:58.282 "name": "BaseBdev1", 00:13:58.282 "uuid": "f482061a-2861-4fa4-b355-3a32cdd3211f", 00:13:58.282 "is_configured": true, 00:13:58.282 "data_offset": 0, 00:13:58.282 "data_size": 65536 00:13:58.282 }, 00:13:58.282 { 00:13:58.282 "name": null, 00:13:58.282 "uuid": "ef2beaa4-0e0f-4dca-b95f-1401a2defab5", 00:13:58.282 "is_configured": false, 00:13:58.282 
"data_offset": 0, 00:13:58.282 "data_size": 65536 00:13:58.282 }, 00:13:58.282 { 00:13:58.282 "name": null, 00:13:58.282 "uuid": "6be2326c-ace2-401b-95ec-6d2296082346", 00:13:58.282 "is_configured": false, 00:13:58.282 "data_offset": 0, 00:13:58.282 "data_size": 65536 00:13:58.282 } 00:13:58.282 ] 00:13:58.282 }' 00:13:58.282 08:27:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:58.282 08:27:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:58.857 08:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:58.858 08:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:58.858 08:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:13:58.858 08:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:13:59.116 [2024-07-23 08:27:11.449821] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:59.116 08:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:59.116 08:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:59.116 08:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:59.116 08:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:59.116 08:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:59.116 08:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- 
# local num_base_bdevs_operational=3 00:13:59.116 08:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:59.116 08:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:59.116 08:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:59.116 08:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:59.116 08:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:59.116 08:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:59.374 08:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:59.374 "name": "Existed_Raid", 00:13:59.374 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:59.374 "strip_size_kb": 64, 00:13:59.374 "state": "configuring", 00:13:59.375 "raid_level": "raid0", 00:13:59.375 "superblock": false, 00:13:59.375 "num_base_bdevs": 3, 00:13:59.375 "num_base_bdevs_discovered": 2, 00:13:59.375 "num_base_bdevs_operational": 3, 00:13:59.375 "base_bdevs_list": [ 00:13:59.375 { 00:13:59.375 "name": "BaseBdev1", 00:13:59.375 "uuid": "f482061a-2861-4fa4-b355-3a32cdd3211f", 00:13:59.375 "is_configured": true, 00:13:59.375 "data_offset": 0, 00:13:59.375 "data_size": 65536 00:13:59.375 }, 00:13:59.375 { 00:13:59.375 "name": null, 00:13:59.375 "uuid": "ef2beaa4-0e0f-4dca-b95f-1401a2defab5", 00:13:59.375 "is_configured": false, 00:13:59.375 "data_offset": 0, 00:13:59.375 "data_size": 65536 00:13:59.375 }, 00:13:59.375 { 00:13:59.375 "name": "BaseBdev3", 00:13:59.375 "uuid": "6be2326c-ace2-401b-95ec-6d2296082346", 00:13:59.375 "is_configured": true, 00:13:59.375 "data_offset": 0, 00:13:59.375 "data_size": 65536 00:13:59.375 } 00:13:59.375 ] 
00:13:59.375 }' 00:13:59.375 08:27:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:59.375 08:27:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:59.633 08:27:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:59.633 08:27:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:59.891 08:27:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:13:59.891 08:27:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:00.150 [2024-07-23 08:27:12.440504] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:00.150 08:27:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:00.150 08:27:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:00.150 08:27:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:00.150 08:27:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:00.150 08:27:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:00.150 08:27:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:00.150 08:27:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:00.150 08:27:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:00.150 08:27:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- 
# local num_base_bdevs_discovered 00:14:00.150 08:27:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:00.150 08:27:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:00.150 08:27:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:00.409 08:27:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:00.409 "name": "Existed_Raid", 00:14:00.409 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:00.409 "strip_size_kb": 64, 00:14:00.409 "state": "configuring", 00:14:00.409 "raid_level": "raid0", 00:14:00.409 "superblock": false, 00:14:00.409 "num_base_bdevs": 3, 00:14:00.409 "num_base_bdevs_discovered": 1, 00:14:00.409 "num_base_bdevs_operational": 3, 00:14:00.409 "base_bdevs_list": [ 00:14:00.409 { 00:14:00.409 "name": null, 00:14:00.409 "uuid": "f482061a-2861-4fa4-b355-3a32cdd3211f", 00:14:00.409 "is_configured": false, 00:14:00.409 "data_offset": 0, 00:14:00.409 "data_size": 65536 00:14:00.409 }, 00:14:00.409 { 00:14:00.409 "name": null, 00:14:00.409 "uuid": "ef2beaa4-0e0f-4dca-b95f-1401a2defab5", 00:14:00.409 "is_configured": false, 00:14:00.409 "data_offset": 0, 00:14:00.409 "data_size": 65536 00:14:00.409 }, 00:14:00.409 { 00:14:00.409 "name": "BaseBdev3", 00:14:00.409 "uuid": "6be2326c-ace2-401b-95ec-6d2296082346", 00:14:00.409 "is_configured": true, 00:14:00.409 "data_offset": 0, 00:14:00.409 "data_size": 65536 00:14:00.409 } 00:14:00.409 ] 00:14:00.409 }' 00:14:00.409 08:27:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:00.409 08:27:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:00.976 08:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:00.976 08:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:00.976 08:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:14:00.976 08:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:01.234 [2024-07-23 08:27:13.532502] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:01.234 08:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:01.234 08:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:01.234 08:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:01.234 08:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:01.234 08:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:01.234 08:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:01.234 08:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:01.234 08:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:01.234 08:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:01.234 08:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:01.234 08:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:01.234 08:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:01.234 08:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:01.234 "name": "Existed_Raid", 00:14:01.234 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:01.234 "strip_size_kb": 64, 00:14:01.234 "state": "configuring", 00:14:01.234 "raid_level": "raid0", 00:14:01.234 "superblock": false, 00:14:01.234 "num_base_bdevs": 3, 00:14:01.234 "num_base_bdevs_discovered": 2, 00:14:01.234 "num_base_bdevs_operational": 3, 00:14:01.234 "base_bdevs_list": [ 00:14:01.234 { 00:14:01.234 "name": null, 00:14:01.234 "uuid": "f482061a-2861-4fa4-b355-3a32cdd3211f", 00:14:01.234 "is_configured": false, 00:14:01.234 "data_offset": 0, 00:14:01.234 "data_size": 65536 00:14:01.234 }, 00:14:01.234 { 00:14:01.234 "name": "BaseBdev2", 00:14:01.234 "uuid": "ef2beaa4-0e0f-4dca-b95f-1401a2defab5", 00:14:01.234 "is_configured": true, 00:14:01.234 "data_offset": 0, 00:14:01.234 "data_size": 65536 00:14:01.234 }, 00:14:01.234 { 00:14:01.234 "name": "BaseBdev3", 00:14:01.234 "uuid": "6be2326c-ace2-401b-95ec-6d2296082346", 00:14:01.234 "is_configured": true, 00:14:01.234 "data_offset": 0, 00:14:01.234 "data_size": 65536 00:14:01.234 } 00:14:01.234 ] 00:14:01.234 }' 00:14:01.234 08:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:01.234 08:27:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:01.800 08:27:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:01.800 08:27:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:02.058 
08:27:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:14:02.058 08:27:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:02.058 08:27:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:02.058 08:27:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u f482061a-2861-4fa4-b355-3a32cdd3211f 00:14:02.316 [2024-07-23 08:27:14.740074] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:02.316 [2024-07-23 08:27:14.740114] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000036980 00:14:02.316 [2024-07-23 08:27:14.740123] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:14:02.316 [2024-07-23 08:27:14.740345] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c200 00:14:02.316 [2024-07-23 08:27:14.740503] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000036980 00:14:02.316 [2024-07-23 08:27:14.740512] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000036980 00:14:02.316 [2024-07-23 08:27:14.740790] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:02.316 NewBaseBdev 00:14:02.316 08:27:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:02.316 08:27:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:14:02.316 08:27:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:02.316 08:27:14 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:02.316 08:27:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:02.316 08:27:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:02.316 08:27:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:02.575 08:27:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:02.575 [ 00:14:02.575 { 00:14:02.575 "name": "NewBaseBdev", 00:14:02.575 "aliases": [ 00:14:02.575 "f482061a-2861-4fa4-b355-3a32cdd3211f" 00:14:02.575 ], 00:14:02.575 "product_name": "Malloc disk", 00:14:02.575 "block_size": 512, 00:14:02.575 "num_blocks": 65536, 00:14:02.575 "uuid": "f482061a-2861-4fa4-b355-3a32cdd3211f", 00:14:02.575 "assigned_rate_limits": { 00:14:02.575 "rw_ios_per_sec": 0, 00:14:02.575 "rw_mbytes_per_sec": 0, 00:14:02.575 "r_mbytes_per_sec": 0, 00:14:02.575 "w_mbytes_per_sec": 0 00:14:02.575 }, 00:14:02.575 "claimed": true, 00:14:02.575 "claim_type": "exclusive_write", 00:14:02.575 "zoned": false, 00:14:02.575 "supported_io_types": { 00:14:02.575 "read": true, 00:14:02.575 "write": true, 00:14:02.575 "unmap": true, 00:14:02.575 "flush": true, 00:14:02.575 "reset": true, 00:14:02.575 "nvme_admin": false, 00:14:02.575 "nvme_io": false, 00:14:02.575 "nvme_io_md": false, 00:14:02.575 "write_zeroes": true, 00:14:02.575 "zcopy": true, 00:14:02.575 "get_zone_info": false, 00:14:02.575 "zone_management": false, 00:14:02.575 "zone_append": false, 00:14:02.575 "compare": false, 00:14:02.575 "compare_and_write": false, 00:14:02.575 "abort": true, 00:14:02.575 "seek_hole": false, 00:14:02.575 "seek_data": false, 00:14:02.575 "copy": 
true, 00:14:02.575 "nvme_iov_md": false 00:14:02.575 }, 00:14:02.575 "memory_domains": [ 00:14:02.575 { 00:14:02.575 "dma_device_id": "system", 00:14:02.575 "dma_device_type": 1 00:14:02.575 }, 00:14:02.575 { 00:14:02.575 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:02.575 "dma_device_type": 2 00:14:02.575 } 00:14:02.575 ], 00:14:02.575 "driver_specific": {} 00:14:02.575 } 00:14:02.575 ] 00:14:02.575 08:27:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:02.575 08:27:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:14:02.575 08:27:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:02.575 08:27:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:02.575 08:27:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:02.576 08:27:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:02.576 08:27:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:02.576 08:27:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:02.576 08:27:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:02.576 08:27:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:02.576 08:27:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:02.576 08:27:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:02.576 08:27:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:02.835 
08:27:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:02.835 "name": "Existed_Raid", 00:14:02.835 "uuid": "39d7b1eb-decd-4820-9f0d-da23334aaac4", 00:14:02.835 "strip_size_kb": 64, 00:14:02.835 "state": "online", 00:14:02.835 "raid_level": "raid0", 00:14:02.835 "superblock": false, 00:14:02.835 "num_base_bdevs": 3, 00:14:02.835 "num_base_bdevs_discovered": 3, 00:14:02.835 "num_base_bdevs_operational": 3, 00:14:02.835 "base_bdevs_list": [ 00:14:02.835 { 00:14:02.835 "name": "NewBaseBdev", 00:14:02.835 "uuid": "f482061a-2861-4fa4-b355-3a32cdd3211f", 00:14:02.835 "is_configured": true, 00:14:02.835 "data_offset": 0, 00:14:02.835 "data_size": 65536 00:14:02.835 }, 00:14:02.835 { 00:14:02.835 "name": "BaseBdev2", 00:14:02.835 "uuid": "ef2beaa4-0e0f-4dca-b95f-1401a2defab5", 00:14:02.835 "is_configured": true, 00:14:02.835 "data_offset": 0, 00:14:02.835 "data_size": 65536 00:14:02.835 }, 00:14:02.835 { 00:14:02.835 "name": "BaseBdev3", 00:14:02.835 "uuid": "6be2326c-ace2-401b-95ec-6d2296082346", 00:14:02.835 "is_configured": true, 00:14:02.835 "data_offset": 0, 00:14:02.835 "data_size": 65536 00:14:02.835 } 00:14:02.835 ] 00:14:02.835 }' 00:14:02.835 08:27:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:02.835 08:27:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:03.402 08:27:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:03.402 08:27:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:03.402 08:27:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:03.402 08:27:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:03.402 08:27:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:03.402 08:27:15 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:03.402 08:27:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:03.402 08:27:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:03.402 [2024-07-23 08:27:15.911488] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:03.724 08:27:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:03.724 "name": "Existed_Raid", 00:14:03.724 "aliases": [ 00:14:03.724 "39d7b1eb-decd-4820-9f0d-da23334aaac4" 00:14:03.724 ], 00:14:03.724 "product_name": "Raid Volume", 00:14:03.724 "block_size": 512, 00:14:03.724 "num_blocks": 196608, 00:14:03.724 "uuid": "39d7b1eb-decd-4820-9f0d-da23334aaac4", 00:14:03.724 "assigned_rate_limits": { 00:14:03.724 "rw_ios_per_sec": 0, 00:14:03.724 "rw_mbytes_per_sec": 0, 00:14:03.724 "r_mbytes_per_sec": 0, 00:14:03.724 "w_mbytes_per_sec": 0 00:14:03.724 }, 00:14:03.724 "claimed": false, 00:14:03.724 "zoned": false, 00:14:03.724 "supported_io_types": { 00:14:03.724 "read": true, 00:14:03.724 "write": true, 00:14:03.724 "unmap": true, 00:14:03.724 "flush": true, 00:14:03.724 "reset": true, 00:14:03.724 "nvme_admin": false, 00:14:03.724 "nvme_io": false, 00:14:03.724 "nvme_io_md": false, 00:14:03.724 "write_zeroes": true, 00:14:03.724 "zcopy": false, 00:14:03.724 "get_zone_info": false, 00:14:03.724 "zone_management": false, 00:14:03.724 "zone_append": false, 00:14:03.724 "compare": false, 00:14:03.724 "compare_and_write": false, 00:14:03.724 "abort": false, 00:14:03.724 "seek_hole": false, 00:14:03.724 "seek_data": false, 00:14:03.724 "copy": false, 00:14:03.724 "nvme_iov_md": false 00:14:03.724 }, 00:14:03.724 "memory_domains": [ 00:14:03.724 { 00:14:03.724 "dma_device_id": "system", 00:14:03.724 "dma_device_type": 1 00:14:03.724 }, 
00:14:03.724 { 00:14:03.724 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:03.724 "dma_device_type": 2 00:14:03.724 }, 00:14:03.724 { 00:14:03.724 "dma_device_id": "system", 00:14:03.724 "dma_device_type": 1 00:14:03.724 }, 00:14:03.724 { 00:14:03.724 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:03.724 "dma_device_type": 2 00:14:03.724 }, 00:14:03.724 { 00:14:03.724 "dma_device_id": "system", 00:14:03.724 "dma_device_type": 1 00:14:03.724 }, 00:14:03.724 { 00:14:03.724 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:03.724 "dma_device_type": 2 00:14:03.724 } 00:14:03.724 ], 00:14:03.724 "driver_specific": { 00:14:03.724 "raid": { 00:14:03.724 "uuid": "39d7b1eb-decd-4820-9f0d-da23334aaac4", 00:14:03.724 "strip_size_kb": 64, 00:14:03.724 "state": "online", 00:14:03.724 "raid_level": "raid0", 00:14:03.724 "superblock": false, 00:14:03.724 "num_base_bdevs": 3, 00:14:03.724 "num_base_bdevs_discovered": 3, 00:14:03.724 "num_base_bdevs_operational": 3, 00:14:03.724 "base_bdevs_list": [ 00:14:03.724 { 00:14:03.724 "name": "NewBaseBdev", 00:14:03.724 "uuid": "f482061a-2861-4fa4-b355-3a32cdd3211f", 00:14:03.724 "is_configured": true, 00:14:03.724 "data_offset": 0, 00:14:03.724 "data_size": 65536 00:14:03.724 }, 00:14:03.724 { 00:14:03.724 "name": "BaseBdev2", 00:14:03.724 "uuid": "ef2beaa4-0e0f-4dca-b95f-1401a2defab5", 00:14:03.724 "is_configured": true, 00:14:03.724 "data_offset": 0, 00:14:03.724 "data_size": 65536 00:14:03.724 }, 00:14:03.724 { 00:14:03.724 "name": "BaseBdev3", 00:14:03.724 "uuid": "6be2326c-ace2-401b-95ec-6d2296082346", 00:14:03.724 "is_configured": true, 00:14:03.724 "data_offset": 0, 00:14:03.724 "data_size": 65536 00:14:03.725 } 00:14:03.725 ] 00:14:03.725 } 00:14:03.725 } 00:14:03.725 }' 00:14:03.725 08:27:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:03.725 08:27:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='NewBaseBdev 00:14:03.725 BaseBdev2 00:14:03.725 BaseBdev3' 00:14:03.725 08:27:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:03.725 08:27:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:03.725 08:27:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:03.725 08:27:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:03.725 "name": "NewBaseBdev", 00:14:03.725 "aliases": [ 00:14:03.725 "f482061a-2861-4fa4-b355-3a32cdd3211f" 00:14:03.725 ], 00:14:03.725 "product_name": "Malloc disk", 00:14:03.725 "block_size": 512, 00:14:03.725 "num_blocks": 65536, 00:14:03.725 "uuid": "f482061a-2861-4fa4-b355-3a32cdd3211f", 00:14:03.725 "assigned_rate_limits": { 00:14:03.725 "rw_ios_per_sec": 0, 00:14:03.725 "rw_mbytes_per_sec": 0, 00:14:03.725 "r_mbytes_per_sec": 0, 00:14:03.725 "w_mbytes_per_sec": 0 00:14:03.725 }, 00:14:03.725 "claimed": true, 00:14:03.725 "claim_type": "exclusive_write", 00:14:03.725 "zoned": false, 00:14:03.725 "supported_io_types": { 00:14:03.725 "read": true, 00:14:03.725 "write": true, 00:14:03.725 "unmap": true, 00:14:03.725 "flush": true, 00:14:03.725 "reset": true, 00:14:03.725 "nvme_admin": false, 00:14:03.725 "nvme_io": false, 00:14:03.725 "nvme_io_md": false, 00:14:03.725 "write_zeroes": true, 00:14:03.725 "zcopy": true, 00:14:03.725 "get_zone_info": false, 00:14:03.725 "zone_management": false, 00:14:03.725 "zone_append": false, 00:14:03.725 "compare": false, 00:14:03.725 "compare_and_write": false, 00:14:03.725 "abort": true, 00:14:03.725 "seek_hole": false, 00:14:03.725 "seek_data": false, 00:14:03.725 "copy": true, 00:14:03.725 "nvme_iov_md": false 00:14:03.725 }, 00:14:03.725 "memory_domains": [ 00:14:03.725 { 00:14:03.725 "dma_device_id": "system", 00:14:03.725 
"dma_device_type": 1 00:14:03.725 }, 00:14:03.725 { 00:14:03.725 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:03.725 "dma_device_type": 2 00:14:03.725 } 00:14:03.725 ], 00:14:03.725 "driver_specific": {} 00:14:03.725 }' 00:14:03.725 08:27:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:03.725 08:27:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:03.725 08:27:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:03.725 08:27:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:03.984 08:27:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:03.984 08:27:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:03.984 08:27:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:03.984 08:27:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:03.984 08:27:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:03.984 08:27:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:03.984 08:27:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:03.984 08:27:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:03.984 08:27:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:03.984 08:27:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:03.984 08:27:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:04.242 08:27:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:04.242 "name": 
"BaseBdev2", 00:14:04.242 "aliases": [ 00:14:04.242 "ef2beaa4-0e0f-4dca-b95f-1401a2defab5" 00:14:04.242 ], 00:14:04.242 "product_name": "Malloc disk", 00:14:04.242 "block_size": 512, 00:14:04.242 "num_blocks": 65536, 00:14:04.242 "uuid": "ef2beaa4-0e0f-4dca-b95f-1401a2defab5", 00:14:04.242 "assigned_rate_limits": { 00:14:04.242 "rw_ios_per_sec": 0, 00:14:04.242 "rw_mbytes_per_sec": 0, 00:14:04.242 "r_mbytes_per_sec": 0, 00:14:04.242 "w_mbytes_per_sec": 0 00:14:04.242 }, 00:14:04.242 "claimed": true, 00:14:04.242 "claim_type": "exclusive_write", 00:14:04.242 "zoned": false, 00:14:04.242 "supported_io_types": { 00:14:04.242 "read": true, 00:14:04.242 "write": true, 00:14:04.242 "unmap": true, 00:14:04.242 "flush": true, 00:14:04.242 "reset": true, 00:14:04.242 "nvme_admin": false, 00:14:04.242 "nvme_io": false, 00:14:04.242 "nvme_io_md": false, 00:14:04.242 "write_zeroes": true, 00:14:04.242 "zcopy": true, 00:14:04.242 "get_zone_info": false, 00:14:04.242 "zone_management": false, 00:14:04.242 "zone_append": false, 00:14:04.242 "compare": false, 00:14:04.242 "compare_and_write": false, 00:14:04.242 "abort": true, 00:14:04.242 "seek_hole": false, 00:14:04.242 "seek_data": false, 00:14:04.242 "copy": true, 00:14:04.242 "nvme_iov_md": false 00:14:04.242 }, 00:14:04.242 "memory_domains": [ 00:14:04.242 { 00:14:04.242 "dma_device_id": "system", 00:14:04.242 "dma_device_type": 1 00:14:04.242 }, 00:14:04.242 { 00:14:04.242 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:04.242 "dma_device_type": 2 00:14:04.242 } 00:14:04.242 ], 00:14:04.242 "driver_specific": {} 00:14:04.242 }' 00:14:04.242 08:27:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:04.242 08:27:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:04.242 08:27:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:04.242 08:27:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:14:04.500 08:27:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:04.500 08:27:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:04.500 08:27:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:04.500 08:27:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:04.500 08:27:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:04.500 08:27:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:04.500 08:27:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:04.500 08:27:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:04.500 08:27:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:04.500 08:27:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:04.500 08:27:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:04.759 08:27:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:04.759 "name": "BaseBdev3", 00:14:04.759 "aliases": [ 00:14:04.759 "6be2326c-ace2-401b-95ec-6d2296082346" 00:14:04.759 ], 00:14:04.759 "product_name": "Malloc disk", 00:14:04.759 "block_size": 512, 00:14:04.759 "num_blocks": 65536, 00:14:04.759 "uuid": "6be2326c-ace2-401b-95ec-6d2296082346", 00:14:04.759 "assigned_rate_limits": { 00:14:04.759 "rw_ios_per_sec": 0, 00:14:04.759 "rw_mbytes_per_sec": 0, 00:14:04.759 "r_mbytes_per_sec": 0, 00:14:04.759 "w_mbytes_per_sec": 0 00:14:04.759 }, 00:14:04.759 "claimed": true, 00:14:04.759 "claim_type": "exclusive_write", 00:14:04.759 "zoned": false, 00:14:04.759 "supported_io_types": { 
00:14:04.759 "read": true, 00:14:04.759 "write": true, 00:14:04.759 "unmap": true, 00:14:04.759 "flush": true, 00:14:04.759 "reset": true, 00:14:04.759 "nvme_admin": false, 00:14:04.759 "nvme_io": false, 00:14:04.759 "nvme_io_md": false, 00:14:04.759 "write_zeroes": true, 00:14:04.759 "zcopy": true, 00:14:04.759 "get_zone_info": false, 00:14:04.759 "zone_management": false, 00:14:04.759 "zone_append": false, 00:14:04.759 "compare": false, 00:14:04.759 "compare_and_write": false, 00:14:04.759 "abort": true, 00:14:04.759 "seek_hole": false, 00:14:04.759 "seek_data": false, 00:14:04.759 "copy": true, 00:14:04.759 "nvme_iov_md": false 00:14:04.759 }, 00:14:04.759 "memory_domains": [ 00:14:04.759 { 00:14:04.759 "dma_device_id": "system", 00:14:04.759 "dma_device_type": 1 00:14:04.759 }, 00:14:04.759 { 00:14:04.759 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:04.759 "dma_device_type": 2 00:14:04.759 } 00:14:04.759 ], 00:14:04.759 "driver_specific": {} 00:14:04.759 }' 00:14:04.759 08:27:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:04.759 08:27:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:04.759 08:27:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:04.759 08:27:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:04.759 08:27:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:05.018 08:27:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:05.018 08:27:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:05.018 08:27:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:05.018 08:27:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:05.018 08:27:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:14:05.018 08:27:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:05.018 08:27:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:05.018 08:27:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:05.277 [2024-07-23 08:27:17.611692] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:05.277 [2024-07-23 08:27:17.611717] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:05.277 [2024-07-23 08:27:17.611793] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:05.277 [2024-07-23 08:27:17.611847] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:05.277 [2024-07-23 08:27:17.611865] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036980 name Existed_Raid, state offline 00:14:05.277 08:27:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1424010 00:14:05.277 08:27:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1424010 ']' 00:14:05.277 08:27:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1424010 00:14:05.277 08:27:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:14:05.277 08:27:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:05.277 08:27:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1424010 00:14:05.277 08:27:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:05.277 08:27:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:14:05.278 08:27:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1424010' 00:14:05.278 killing process with pid 1424010 00:14:05.278 08:27:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1424010 00:14:05.278 [2024-07-23 08:27:17.667867] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:05.278 08:27:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1424010 00:14:05.536 [2024-07-23 08:27:17.903479] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:14:06.913 00:14:06.913 real 0m23.318s 00:14:06.913 user 0m41.646s 00:14:06.913 sys 0m3.450s 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:06.913 ************************************ 00:14:06.913 END TEST raid_state_function_test 00:14:06.913 ************************************ 00:14:06.913 08:27:19 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:06.913 08:27:19 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 3 true 00:14:06.913 08:27:19 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:06.913 08:27:19 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:06.913 08:27:19 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:06.913 ************************************ 00:14:06.913 START TEST raid_state_function_test_sb 00:14:06.913 ************************************ 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 true 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@226 -- # local strip_size 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1428905 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1428905' 00:14:06.913 Process raid pid: 1428905 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1428905 /var/tmp/spdk-raid.sock 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1428905 ']' 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain 
socket /var/tmp/spdk-raid.sock...' 00:14:06.913 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:06.913 08:27:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:06.913 [2024-07-23 08:27:19.328054] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:14:06.913 [2024-07-23 08:27:19.328155] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:07.172 [2024-07-23 08:27:19.452641] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:07.172 [2024-07-23 08:27:19.668172] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:07.739 [2024-07-23 08:27:19.961333] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:07.739 [2024-07-23 08:27:19.961361] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:07.739 08:27:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:07.739 08:27:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:14:07.739 08:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:07.739 [2024-07-23 08:27:20.254937] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:07.739 [2024-07-23 08:27:20.254979] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:07.739 [2024-07-23 08:27:20.254989] 
bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:07.739 [2024-07-23 08:27:20.255001] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:07.739 [2024-07-23 08:27:20.255008] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:07.739 [2024-07-23 08:27:20.255017] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:07.998 08:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:07.998 08:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:07.998 08:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:07.998 08:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:07.998 08:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:07.998 08:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:07.998 08:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:07.998 08:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:07.998 08:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:07.998 08:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:07.998 08:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:07.998 08:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"Existed_Raid")' 00:14:07.998 08:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:07.998 "name": "Existed_Raid", 00:14:07.998 "uuid": "ee448d83-3d59-4d96-b68e-4d7f091dfaba", 00:14:07.998 "strip_size_kb": 64, 00:14:07.998 "state": "configuring", 00:14:07.998 "raid_level": "raid0", 00:14:07.998 "superblock": true, 00:14:07.998 "num_base_bdevs": 3, 00:14:07.998 "num_base_bdevs_discovered": 0, 00:14:07.998 "num_base_bdevs_operational": 3, 00:14:07.998 "base_bdevs_list": [ 00:14:07.998 { 00:14:07.998 "name": "BaseBdev1", 00:14:07.998 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:07.998 "is_configured": false, 00:14:07.998 "data_offset": 0, 00:14:07.998 "data_size": 0 00:14:07.998 }, 00:14:07.998 { 00:14:07.998 "name": "BaseBdev2", 00:14:07.998 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:07.998 "is_configured": false, 00:14:07.998 "data_offset": 0, 00:14:07.998 "data_size": 0 00:14:07.998 }, 00:14:07.998 { 00:14:07.998 "name": "BaseBdev3", 00:14:07.998 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:07.998 "is_configured": false, 00:14:07.998 "data_offset": 0, 00:14:07.998 "data_size": 0 00:14:07.998 } 00:14:07.998 ] 00:14:07.998 }' 00:14:07.998 08:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:07.998 08:27:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:08.565 08:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:08.565 [2024-07-23 08:27:21.044895] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:08.565 [2024-07-23 08:27:21.044932] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034580 name Existed_Raid, state configuring 00:14:08.565 08:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:08.824 [2024-07-23 08:27:21.233422] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:08.824 [2024-07-23 08:27:21.233461] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:08.824 [2024-07-23 08:27:21.233473] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:08.824 [2024-07-23 08:27:21.233501] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:08.824 [2024-07-23 08:27:21.233509] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:08.824 [2024-07-23 08:27:21.233521] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:08.824 08:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:09.083 [2024-07-23 08:27:21.431249] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:09.083 BaseBdev1 00:14:09.083 08:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:09.083 08:27:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:09.083 08:27:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:09.083 08:27:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:09.083 08:27:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:09.083 08:27:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:14:09.083 08:27:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:09.341 08:27:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:09.341 [ 00:14:09.341 { 00:14:09.341 "name": "BaseBdev1", 00:14:09.341 "aliases": [ 00:14:09.341 "1de3cfbb-1032-49ab-936b-4f7395731c6c" 00:14:09.341 ], 00:14:09.341 "product_name": "Malloc disk", 00:14:09.341 "block_size": 512, 00:14:09.341 "num_blocks": 65536, 00:14:09.341 "uuid": "1de3cfbb-1032-49ab-936b-4f7395731c6c", 00:14:09.341 "assigned_rate_limits": { 00:14:09.341 "rw_ios_per_sec": 0, 00:14:09.341 "rw_mbytes_per_sec": 0, 00:14:09.341 "r_mbytes_per_sec": 0, 00:14:09.341 "w_mbytes_per_sec": 0 00:14:09.341 }, 00:14:09.341 "claimed": true, 00:14:09.341 "claim_type": "exclusive_write", 00:14:09.341 "zoned": false, 00:14:09.341 "supported_io_types": { 00:14:09.341 "read": true, 00:14:09.341 "write": true, 00:14:09.342 "unmap": true, 00:14:09.342 "flush": true, 00:14:09.342 "reset": true, 00:14:09.342 "nvme_admin": false, 00:14:09.342 "nvme_io": false, 00:14:09.342 "nvme_io_md": false, 00:14:09.342 "write_zeroes": true, 00:14:09.342 "zcopy": true, 00:14:09.342 "get_zone_info": false, 00:14:09.342 "zone_management": false, 00:14:09.342 "zone_append": false, 00:14:09.342 "compare": false, 00:14:09.342 "compare_and_write": false, 00:14:09.342 "abort": true, 00:14:09.342 "seek_hole": false, 00:14:09.342 "seek_data": false, 00:14:09.342 "copy": true, 00:14:09.342 "nvme_iov_md": false 00:14:09.342 }, 00:14:09.342 "memory_domains": [ 00:14:09.342 { 00:14:09.342 "dma_device_id": "system", 00:14:09.342 "dma_device_type": 1 00:14:09.342 }, 00:14:09.342 { 00:14:09.342 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:09.342 
"dma_device_type": 2 00:14:09.342 } 00:14:09.342 ], 00:14:09.342 "driver_specific": {} 00:14:09.342 } 00:14:09.342 ] 00:14:09.342 08:27:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:09.342 08:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:09.342 08:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:09.342 08:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:09.342 08:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:09.342 08:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:09.342 08:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:09.342 08:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:09.342 08:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:09.342 08:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:09.342 08:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:09.342 08:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:09.342 08:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:09.600 08:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:09.600 "name": "Existed_Raid", 00:14:09.600 "uuid": "5c09933a-2a7d-449b-9d44-a4f20f3e6a54", 00:14:09.600 "strip_size_kb": 64, 
00:14:09.600 "state": "configuring", 00:14:09.600 "raid_level": "raid0", 00:14:09.600 "superblock": true, 00:14:09.600 "num_base_bdevs": 3, 00:14:09.600 "num_base_bdevs_discovered": 1, 00:14:09.600 "num_base_bdevs_operational": 3, 00:14:09.600 "base_bdevs_list": [ 00:14:09.600 { 00:14:09.600 "name": "BaseBdev1", 00:14:09.600 "uuid": "1de3cfbb-1032-49ab-936b-4f7395731c6c", 00:14:09.600 "is_configured": true, 00:14:09.600 "data_offset": 2048, 00:14:09.600 "data_size": 63488 00:14:09.600 }, 00:14:09.600 { 00:14:09.600 "name": "BaseBdev2", 00:14:09.600 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:09.600 "is_configured": false, 00:14:09.600 "data_offset": 0, 00:14:09.600 "data_size": 0 00:14:09.600 }, 00:14:09.600 { 00:14:09.600 "name": "BaseBdev3", 00:14:09.600 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:09.600 "is_configured": false, 00:14:09.600 "data_offset": 0, 00:14:09.601 "data_size": 0 00:14:09.601 } 00:14:09.601 ] 00:14:09.601 }' 00:14:09.601 08:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:09.601 08:27:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:10.167 08:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:10.167 [2024-07-23 08:27:22.534211] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:10.167 [2024-07-23 08:27:22.534259] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034880 name Existed_Raid, state configuring 00:14:10.167 08:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:10.426 [2024-07-23 08:27:22.714716] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:10.426 [2024-07-23 08:27:22.716230] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:10.426 [2024-07-23 08:27:22.716264] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:10.426 [2024-07-23 08:27:22.716273] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:10.426 [2024-07-23 08:27:22.716298] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:10.426 08:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:10.426 08:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:10.426 08:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:10.426 08:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:10.426 08:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:10.426 08:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:10.426 08:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:10.426 08:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:10.426 08:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:10.426 08:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:10.426 08:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:10.426 08:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- 
# local tmp 00:14:10.426 08:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:10.426 08:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:10.426 08:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:10.426 "name": "Existed_Raid", 00:14:10.426 "uuid": "b0abe2fc-f9fd-4189-a909-8fc1e7f7ee63", 00:14:10.426 "strip_size_kb": 64, 00:14:10.426 "state": "configuring", 00:14:10.426 "raid_level": "raid0", 00:14:10.426 "superblock": true, 00:14:10.426 "num_base_bdevs": 3, 00:14:10.426 "num_base_bdevs_discovered": 1, 00:14:10.426 "num_base_bdevs_operational": 3, 00:14:10.426 "base_bdevs_list": [ 00:14:10.426 { 00:14:10.426 "name": "BaseBdev1", 00:14:10.426 "uuid": "1de3cfbb-1032-49ab-936b-4f7395731c6c", 00:14:10.426 "is_configured": true, 00:14:10.426 "data_offset": 2048, 00:14:10.426 "data_size": 63488 00:14:10.426 }, 00:14:10.426 { 00:14:10.426 "name": "BaseBdev2", 00:14:10.426 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:10.426 "is_configured": false, 00:14:10.426 "data_offset": 0, 00:14:10.426 "data_size": 0 00:14:10.426 }, 00:14:10.426 { 00:14:10.426 "name": "BaseBdev3", 00:14:10.426 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:10.426 "is_configured": false, 00:14:10.426 "data_offset": 0, 00:14:10.426 "data_size": 0 00:14:10.426 } 00:14:10.426 ] 00:14:10.426 }' 00:14:10.426 08:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:10.426 08:27:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:10.993 08:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:11.252 
[2024-07-23 08:27:23.588781] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:11.252 BaseBdev2 00:14:11.252 08:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:11.252 08:27:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:11.252 08:27:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:11.252 08:27:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:11.252 08:27:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:11.252 08:27:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:11.252 08:27:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:11.511 08:27:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:11.511 [ 00:14:11.511 { 00:14:11.511 "name": "BaseBdev2", 00:14:11.511 "aliases": [ 00:14:11.511 "d037babd-969d-4cfd-a82e-7c5aa921afd8" 00:14:11.511 ], 00:14:11.511 "product_name": "Malloc disk", 00:14:11.511 "block_size": 512, 00:14:11.511 "num_blocks": 65536, 00:14:11.511 "uuid": "d037babd-969d-4cfd-a82e-7c5aa921afd8", 00:14:11.511 "assigned_rate_limits": { 00:14:11.511 "rw_ios_per_sec": 0, 00:14:11.511 "rw_mbytes_per_sec": 0, 00:14:11.511 "r_mbytes_per_sec": 0, 00:14:11.511 "w_mbytes_per_sec": 0 00:14:11.511 }, 00:14:11.511 "claimed": true, 00:14:11.511 "claim_type": "exclusive_write", 00:14:11.511 "zoned": false, 00:14:11.511 "supported_io_types": { 00:14:11.511 "read": true, 00:14:11.511 "write": true, 00:14:11.511 "unmap": 
true, 00:14:11.511 "flush": true, 00:14:11.511 "reset": true, 00:14:11.511 "nvme_admin": false, 00:14:11.511 "nvme_io": false, 00:14:11.511 "nvme_io_md": false, 00:14:11.511 "write_zeroes": true, 00:14:11.511 "zcopy": true, 00:14:11.511 "get_zone_info": false, 00:14:11.511 "zone_management": false, 00:14:11.511 "zone_append": false, 00:14:11.511 "compare": false, 00:14:11.511 "compare_and_write": false, 00:14:11.511 "abort": true, 00:14:11.511 "seek_hole": false, 00:14:11.511 "seek_data": false, 00:14:11.511 "copy": true, 00:14:11.511 "nvme_iov_md": false 00:14:11.511 }, 00:14:11.511 "memory_domains": [ 00:14:11.511 { 00:14:11.512 "dma_device_id": "system", 00:14:11.512 "dma_device_type": 1 00:14:11.512 }, 00:14:11.512 { 00:14:11.512 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:11.512 "dma_device_type": 2 00:14:11.512 } 00:14:11.512 ], 00:14:11.512 "driver_specific": {} 00:14:11.512 } 00:14:11.512 ] 00:14:11.512 08:27:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:11.512 08:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:11.512 08:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:11.512 08:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:11.512 08:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:11.512 08:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:11.512 08:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:11.512 08:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:11.512 08:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:11.512 
08:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:11.512 08:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:11.512 08:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:11.512 08:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:11.512 08:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:11.512 08:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:11.771 08:27:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:11.771 "name": "Existed_Raid", 00:14:11.771 "uuid": "b0abe2fc-f9fd-4189-a909-8fc1e7f7ee63", 00:14:11.771 "strip_size_kb": 64, 00:14:11.771 "state": "configuring", 00:14:11.771 "raid_level": "raid0", 00:14:11.771 "superblock": true, 00:14:11.771 "num_base_bdevs": 3, 00:14:11.771 "num_base_bdevs_discovered": 2, 00:14:11.771 "num_base_bdevs_operational": 3, 00:14:11.771 "base_bdevs_list": [ 00:14:11.771 { 00:14:11.771 "name": "BaseBdev1", 00:14:11.771 "uuid": "1de3cfbb-1032-49ab-936b-4f7395731c6c", 00:14:11.771 "is_configured": true, 00:14:11.771 "data_offset": 2048, 00:14:11.771 "data_size": 63488 00:14:11.771 }, 00:14:11.771 { 00:14:11.771 "name": "BaseBdev2", 00:14:11.771 "uuid": "d037babd-969d-4cfd-a82e-7c5aa921afd8", 00:14:11.771 "is_configured": true, 00:14:11.771 "data_offset": 2048, 00:14:11.771 "data_size": 63488 00:14:11.771 }, 00:14:11.771 { 00:14:11.771 "name": "BaseBdev3", 00:14:11.771 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:11.771 "is_configured": false, 00:14:11.771 "data_offset": 0, 00:14:11.771 "data_size": 0 00:14:11.771 } 00:14:11.771 ] 00:14:11.771 }' 00:14:11.771 
08:27:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:11.771 08:27:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:12.339 08:27:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:12.339 [2024-07-23 08:27:24.784761] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:12.339 [2024-07-23 08:27:24.784975] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000035180 00:14:12.339 [2024-07-23 08:27:24.784992] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:12.339 [2024-07-23 08:27:24.785236] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bec0 00:14:12.339 [2024-07-23 08:27:24.785413] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000035180 00:14:12.339 [2024-07-23 08:27:24.785423] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000035180 00:14:12.339 [2024-07-23 08:27:24.785568] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:12.339 BaseBdev3 00:14:12.339 08:27:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:14:12.339 08:27:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:12.339 08:27:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:12.339 08:27:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:12.339 08:27:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:12.339 08:27:24 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:12.339 08:27:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:12.598 08:27:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:12.857 [ 00:14:12.857 { 00:14:12.857 "name": "BaseBdev3", 00:14:12.857 "aliases": [ 00:14:12.857 "c4f57763-9fd4-4c2d-ab08-8af1be1bd71f" 00:14:12.857 ], 00:14:12.857 "product_name": "Malloc disk", 00:14:12.857 "block_size": 512, 00:14:12.857 "num_blocks": 65536, 00:14:12.857 "uuid": "c4f57763-9fd4-4c2d-ab08-8af1be1bd71f", 00:14:12.857 "assigned_rate_limits": { 00:14:12.857 "rw_ios_per_sec": 0, 00:14:12.857 "rw_mbytes_per_sec": 0, 00:14:12.857 "r_mbytes_per_sec": 0, 00:14:12.858 "w_mbytes_per_sec": 0 00:14:12.858 }, 00:14:12.858 "claimed": true, 00:14:12.858 "claim_type": "exclusive_write", 00:14:12.858 "zoned": false, 00:14:12.858 "supported_io_types": { 00:14:12.858 "read": true, 00:14:12.858 "write": true, 00:14:12.858 "unmap": true, 00:14:12.858 "flush": true, 00:14:12.858 "reset": true, 00:14:12.858 "nvme_admin": false, 00:14:12.858 "nvme_io": false, 00:14:12.858 "nvme_io_md": false, 00:14:12.858 "write_zeroes": true, 00:14:12.858 "zcopy": true, 00:14:12.858 "get_zone_info": false, 00:14:12.858 "zone_management": false, 00:14:12.858 "zone_append": false, 00:14:12.858 "compare": false, 00:14:12.858 "compare_and_write": false, 00:14:12.858 "abort": true, 00:14:12.858 "seek_hole": false, 00:14:12.858 "seek_data": false, 00:14:12.858 "copy": true, 00:14:12.858 "nvme_iov_md": false 00:14:12.858 }, 00:14:12.858 "memory_domains": [ 00:14:12.858 { 00:14:12.858 "dma_device_id": "system", 00:14:12.858 "dma_device_type": 1 00:14:12.858 }, 00:14:12.858 { 00:14:12.858 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:14:12.858 "dma_device_type": 2 00:14:12.858 } 00:14:12.858 ], 00:14:12.858 "driver_specific": {} 00:14:12.858 } 00:14:12.858 ] 00:14:12.858 08:27:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:12.858 08:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:12.858 08:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:12.858 08:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:14:12.858 08:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:12.858 08:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:12.858 08:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:12.858 08:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:12.858 08:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:12.858 08:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:12.858 08:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:12.858 08:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:12.858 08:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:12.858 08:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:12.858 08:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:14:12.858 08:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:12.858 "name": "Existed_Raid", 00:14:12.858 "uuid": "b0abe2fc-f9fd-4189-a909-8fc1e7f7ee63", 00:14:12.858 "strip_size_kb": 64, 00:14:12.858 "state": "online", 00:14:12.858 "raid_level": "raid0", 00:14:12.858 "superblock": true, 00:14:12.858 "num_base_bdevs": 3, 00:14:12.858 "num_base_bdevs_discovered": 3, 00:14:12.858 "num_base_bdevs_operational": 3, 00:14:12.858 "base_bdevs_list": [ 00:14:12.858 { 00:14:12.858 "name": "BaseBdev1", 00:14:12.858 "uuid": "1de3cfbb-1032-49ab-936b-4f7395731c6c", 00:14:12.858 "is_configured": true, 00:14:12.858 "data_offset": 2048, 00:14:12.858 "data_size": 63488 00:14:12.858 }, 00:14:12.858 { 00:14:12.858 "name": "BaseBdev2", 00:14:12.858 "uuid": "d037babd-969d-4cfd-a82e-7c5aa921afd8", 00:14:12.858 "is_configured": true, 00:14:12.858 "data_offset": 2048, 00:14:12.858 "data_size": 63488 00:14:12.858 }, 00:14:12.858 { 00:14:12.858 "name": "BaseBdev3", 00:14:12.858 "uuid": "c4f57763-9fd4-4c2d-ab08-8af1be1bd71f", 00:14:12.858 "is_configured": true, 00:14:12.858 "data_offset": 2048, 00:14:12.858 "data_size": 63488 00:14:12.858 } 00:14:12.858 ] 00:14:12.858 }' 00:14:12.858 08:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:12.858 08:27:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:13.425 08:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:13.425 08:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:13.425 08:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:13.425 08:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:13.425 08:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_names 00:14:13.425 08:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:13.425 08:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:13.425 08:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:13.684 [2024-07-23 08:27:25.968190] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:13.684 08:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:13.684 "name": "Existed_Raid", 00:14:13.684 "aliases": [ 00:14:13.684 "b0abe2fc-f9fd-4189-a909-8fc1e7f7ee63" 00:14:13.684 ], 00:14:13.684 "product_name": "Raid Volume", 00:14:13.684 "block_size": 512, 00:14:13.684 "num_blocks": 190464, 00:14:13.684 "uuid": "b0abe2fc-f9fd-4189-a909-8fc1e7f7ee63", 00:14:13.684 "assigned_rate_limits": { 00:14:13.684 "rw_ios_per_sec": 0, 00:14:13.684 "rw_mbytes_per_sec": 0, 00:14:13.684 "r_mbytes_per_sec": 0, 00:14:13.684 "w_mbytes_per_sec": 0 00:14:13.684 }, 00:14:13.684 "claimed": false, 00:14:13.684 "zoned": false, 00:14:13.684 "supported_io_types": { 00:14:13.684 "read": true, 00:14:13.684 "write": true, 00:14:13.684 "unmap": true, 00:14:13.684 "flush": true, 00:14:13.684 "reset": true, 00:14:13.684 "nvme_admin": false, 00:14:13.684 "nvme_io": false, 00:14:13.684 "nvme_io_md": false, 00:14:13.684 "write_zeroes": true, 00:14:13.684 "zcopy": false, 00:14:13.684 "get_zone_info": false, 00:14:13.684 "zone_management": false, 00:14:13.684 "zone_append": false, 00:14:13.684 "compare": false, 00:14:13.684 "compare_and_write": false, 00:14:13.684 "abort": false, 00:14:13.684 "seek_hole": false, 00:14:13.684 "seek_data": false, 00:14:13.684 "copy": false, 00:14:13.684 "nvme_iov_md": false 00:14:13.684 }, 00:14:13.684 "memory_domains": [ 00:14:13.684 { 00:14:13.684 "dma_device_id": "system", 
00:14:13.684 "dma_device_type": 1 00:14:13.684 }, 00:14:13.684 { 00:14:13.684 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:13.684 "dma_device_type": 2 00:14:13.684 }, 00:14:13.684 { 00:14:13.684 "dma_device_id": "system", 00:14:13.684 "dma_device_type": 1 00:14:13.684 }, 00:14:13.684 { 00:14:13.684 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:13.684 "dma_device_type": 2 00:14:13.684 }, 00:14:13.684 { 00:14:13.684 "dma_device_id": "system", 00:14:13.684 "dma_device_type": 1 00:14:13.684 }, 00:14:13.684 { 00:14:13.684 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:13.684 "dma_device_type": 2 00:14:13.684 } 00:14:13.684 ], 00:14:13.684 "driver_specific": { 00:14:13.684 "raid": { 00:14:13.684 "uuid": "b0abe2fc-f9fd-4189-a909-8fc1e7f7ee63", 00:14:13.684 "strip_size_kb": 64, 00:14:13.684 "state": "online", 00:14:13.684 "raid_level": "raid0", 00:14:13.684 "superblock": true, 00:14:13.685 "num_base_bdevs": 3, 00:14:13.685 "num_base_bdevs_discovered": 3, 00:14:13.685 "num_base_bdevs_operational": 3, 00:14:13.685 "base_bdevs_list": [ 00:14:13.685 { 00:14:13.685 "name": "BaseBdev1", 00:14:13.685 "uuid": "1de3cfbb-1032-49ab-936b-4f7395731c6c", 00:14:13.685 "is_configured": true, 00:14:13.685 "data_offset": 2048, 00:14:13.685 "data_size": 63488 00:14:13.685 }, 00:14:13.685 { 00:14:13.685 "name": "BaseBdev2", 00:14:13.685 "uuid": "d037babd-969d-4cfd-a82e-7c5aa921afd8", 00:14:13.685 "is_configured": true, 00:14:13.685 "data_offset": 2048, 00:14:13.685 "data_size": 63488 00:14:13.685 }, 00:14:13.685 { 00:14:13.685 "name": "BaseBdev3", 00:14:13.685 "uuid": "c4f57763-9fd4-4c2d-ab08-8af1be1bd71f", 00:14:13.685 "is_configured": true, 00:14:13.685 "data_offset": 2048, 00:14:13.685 "data_size": 63488 00:14:13.685 } 00:14:13.685 ] 00:14:13.685 } 00:14:13.685 } 00:14:13.685 }' 00:14:13.685 08:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:13.685 08:27:26 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:13.685 BaseBdev2 00:14:13.685 BaseBdev3' 00:14:13.685 08:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:13.685 08:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:13.685 08:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:13.685 08:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:13.685 "name": "BaseBdev1", 00:14:13.685 "aliases": [ 00:14:13.685 "1de3cfbb-1032-49ab-936b-4f7395731c6c" 00:14:13.685 ], 00:14:13.685 "product_name": "Malloc disk", 00:14:13.685 "block_size": 512, 00:14:13.685 "num_blocks": 65536, 00:14:13.685 "uuid": "1de3cfbb-1032-49ab-936b-4f7395731c6c", 00:14:13.685 "assigned_rate_limits": { 00:14:13.685 "rw_ios_per_sec": 0, 00:14:13.685 "rw_mbytes_per_sec": 0, 00:14:13.685 "r_mbytes_per_sec": 0, 00:14:13.685 "w_mbytes_per_sec": 0 00:14:13.685 }, 00:14:13.685 "claimed": true, 00:14:13.685 "claim_type": "exclusive_write", 00:14:13.685 "zoned": false, 00:14:13.685 "supported_io_types": { 00:14:13.685 "read": true, 00:14:13.685 "write": true, 00:14:13.685 "unmap": true, 00:14:13.685 "flush": true, 00:14:13.685 "reset": true, 00:14:13.685 "nvme_admin": false, 00:14:13.685 "nvme_io": false, 00:14:13.685 "nvme_io_md": false, 00:14:13.685 "write_zeroes": true, 00:14:13.685 "zcopy": true, 00:14:13.685 "get_zone_info": false, 00:14:13.685 "zone_management": false, 00:14:13.685 "zone_append": false, 00:14:13.685 "compare": false, 00:14:13.685 "compare_and_write": false, 00:14:13.685 "abort": true, 00:14:13.685 "seek_hole": false, 00:14:13.685 "seek_data": false, 00:14:13.685 "copy": true, 00:14:13.685 "nvme_iov_md": false 00:14:13.685 }, 00:14:13.685 "memory_domains": 
[ 00:14:13.685 { 00:14:13.685 "dma_device_id": "system", 00:14:13.685 "dma_device_type": 1 00:14:13.685 }, 00:14:13.685 { 00:14:13.685 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:13.685 "dma_device_type": 2 00:14:13.685 } 00:14:13.685 ], 00:14:13.685 "driver_specific": {} 00:14:13.685 }' 00:14:13.685 08:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:13.944 08:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:13.944 08:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:13.944 08:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:13.944 08:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:13.944 08:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:13.944 08:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:13.944 08:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:13.944 08:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:13.944 08:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:13.944 08:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:13.944 08:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:13.944 08:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:13.944 08:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:13.944 08:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:14:14.204 08:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:14.204 "name": "BaseBdev2", 00:14:14.204 "aliases": [ 00:14:14.204 "d037babd-969d-4cfd-a82e-7c5aa921afd8" 00:14:14.204 ], 00:14:14.204 "product_name": "Malloc disk", 00:14:14.204 "block_size": 512, 00:14:14.204 "num_blocks": 65536, 00:14:14.204 "uuid": "d037babd-969d-4cfd-a82e-7c5aa921afd8", 00:14:14.204 "assigned_rate_limits": { 00:14:14.204 "rw_ios_per_sec": 0, 00:14:14.204 "rw_mbytes_per_sec": 0, 00:14:14.204 "r_mbytes_per_sec": 0, 00:14:14.204 "w_mbytes_per_sec": 0 00:14:14.204 }, 00:14:14.204 "claimed": true, 00:14:14.204 "claim_type": "exclusive_write", 00:14:14.204 "zoned": false, 00:14:14.204 "supported_io_types": { 00:14:14.204 "read": true, 00:14:14.204 "write": true, 00:14:14.204 "unmap": true, 00:14:14.204 "flush": true, 00:14:14.204 "reset": true, 00:14:14.204 "nvme_admin": false, 00:14:14.204 "nvme_io": false, 00:14:14.204 "nvme_io_md": false, 00:14:14.204 "write_zeroes": true, 00:14:14.204 "zcopy": true, 00:14:14.204 "get_zone_info": false, 00:14:14.204 "zone_management": false, 00:14:14.204 "zone_append": false, 00:14:14.204 "compare": false, 00:14:14.204 "compare_and_write": false, 00:14:14.204 "abort": true, 00:14:14.204 "seek_hole": false, 00:14:14.204 "seek_data": false, 00:14:14.204 "copy": true, 00:14:14.204 "nvme_iov_md": false 00:14:14.204 }, 00:14:14.204 "memory_domains": [ 00:14:14.204 { 00:14:14.204 "dma_device_id": "system", 00:14:14.204 "dma_device_type": 1 00:14:14.204 }, 00:14:14.204 { 00:14:14.204 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:14.204 "dma_device_type": 2 00:14:14.204 } 00:14:14.204 ], 00:14:14.204 "driver_specific": {} 00:14:14.204 }' 00:14:14.204 08:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:14.204 08:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:14.204 08:27:26 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:14.204 08:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:14.204 08:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:14.463 08:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:14.463 08:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:14.463 08:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:14.463 08:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:14.463 08:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:14.463 08:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:14.463 08:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:14.463 08:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:14.463 08:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:14.463 08:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:14.722 08:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:14.722 "name": "BaseBdev3", 00:14:14.722 "aliases": [ 00:14:14.722 "c4f57763-9fd4-4c2d-ab08-8af1be1bd71f" 00:14:14.722 ], 00:14:14.722 "product_name": "Malloc disk", 00:14:14.722 "block_size": 512, 00:14:14.722 "num_blocks": 65536, 00:14:14.722 "uuid": "c4f57763-9fd4-4c2d-ab08-8af1be1bd71f", 00:14:14.722 "assigned_rate_limits": { 00:14:14.722 "rw_ios_per_sec": 0, 00:14:14.722 "rw_mbytes_per_sec": 0, 00:14:14.722 "r_mbytes_per_sec": 0, 00:14:14.722 
"w_mbytes_per_sec": 0 00:14:14.722 }, 00:14:14.722 "claimed": true, 00:14:14.722 "claim_type": "exclusive_write", 00:14:14.722 "zoned": false, 00:14:14.722 "supported_io_types": { 00:14:14.722 "read": true, 00:14:14.722 "write": true, 00:14:14.722 "unmap": true, 00:14:14.722 "flush": true, 00:14:14.722 "reset": true, 00:14:14.722 "nvme_admin": false, 00:14:14.722 "nvme_io": false, 00:14:14.722 "nvme_io_md": false, 00:14:14.722 "write_zeroes": true, 00:14:14.722 "zcopy": true, 00:14:14.722 "get_zone_info": false, 00:14:14.722 "zone_management": false, 00:14:14.722 "zone_append": false, 00:14:14.722 "compare": false, 00:14:14.722 "compare_and_write": false, 00:14:14.722 "abort": true, 00:14:14.722 "seek_hole": false, 00:14:14.722 "seek_data": false, 00:14:14.722 "copy": true, 00:14:14.722 "nvme_iov_md": false 00:14:14.722 }, 00:14:14.722 "memory_domains": [ 00:14:14.722 { 00:14:14.722 "dma_device_id": "system", 00:14:14.722 "dma_device_type": 1 00:14:14.722 }, 00:14:14.722 { 00:14:14.722 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:14.722 "dma_device_type": 2 00:14:14.722 } 00:14:14.722 ], 00:14:14.722 "driver_specific": {} 00:14:14.722 }' 00:14:14.722 08:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:14.722 08:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:14.722 08:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:14.722 08:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:14.722 08:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:14.722 08:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:14.722 08:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:14.722 08:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:14:14.722 08:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:14.722 08:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:14.982 08:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:14.982 08:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:14.982 08:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:14.982 [2024-07-23 08:27:27.443826] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:14.982 [2024-07-23 08:27:27.443853] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:14.982 [2024-07-23 08:27:27.443904] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:14.982 08:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:14.982 08:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:14:14.982 08:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:14.982 08:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:14:14.982 08:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:14:14.982 08:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:14:14.982 08:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:14.982 08:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:14:14.982 08:27:27 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:14.982 08:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:14.982 08:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:14.982 08:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:14.982 08:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:14.982 08:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:14.982 08:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:14.982 08:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:14.982 08:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:15.241 08:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:15.241 "name": "Existed_Raid", 00:14:15.241 "uuid": "b0abe2fc-f9fd-4189-a909-8fc1e7f7ee63", 00:14:15.241 "strip_size_kb": 64, 00:14:15.241 "state": "offline", 00:14:15.241 "raid_level": "raid0", 00:14:15.241 "superblock": true, 00:14:15.241 "num_base_bdevs": 3, 00:14:15.241 "num_base_bdevs_discovered": 2, 00:14:15.241 "num_base_bdevs_operational": 2, 00:14:15.241 "base_bdevs_list": [ 00:14:15.241 { 00:14:15.241 "name": null, 00:14:15.241 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:15.241 "is_configured": false, 00:14:15.241 "data_offset": 2048, 00:14:15.241 "data_size": 63488 00:14:15.241 }, 00:14:15.241 { 00:14:15.241 "name": "BaseBdev2", 00:14:15.241 "uuid": "d037babd-969d-4cfd-a82e-7c5aa921afd8", 00:14:15.241 "is_configured": true, 00:14:15.241 "data_offset": 2048, 00:14:15.241 "data_size": 
63488 00:14:15.241 }, 00:14:15.241 { 00:14:15.241 "name": "BaseBdev3", 00:14:15.241 "uuid": "c4f57763-9fd4-4c2d-ab08-8af1be1bd71f", 00:14:15.241 "is_configured": true, 00:14:15.241 "data_offset": 2048, 00:14:15.241 "data_size": 63488 00:14:15.241 } 00:14:15.241 ] 00:14:15.241 }' 00:14:15.241 08:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:15.241 08:27:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:15.809 08:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:15.809 08:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:15.809 08:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:15.809 08:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:15.809 08:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:15.809 08:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:15.809 08:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:16.068 [2024-07-23 08:27:28.427736] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:16.068 08:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:16.068 08:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:16.068 08:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:16.068 08:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:16.326 08:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:16.326 08:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:16.326 08:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:16.585 [2024-07-23 08:27:28.850565] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:16.585 [2024-07-23 08:27:28.850623] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000035180 name Existed_Raid, state offline 00:14:16.585 08:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:16.585 08:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:16.585 08:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:16.585 08:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:16.844 08:27:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:16.844 08:27:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:16.844 08:27:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:14:16.844 08:27:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:14:16.844 08:27:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:16.844 08:27:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:16.844 BaseBdev2 00:14:16.844 08:27:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:14:16.844 08:27:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:16.844 08:27:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:16.844 08:27:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:16.844 08:27:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:16.844 08:27:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:16.844 08:27:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:17.103 08:27:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:17.361 [ 00:14:17.361 { 00:14:17.361 "name": "BaseBdev2", 00:14:17.361 "aliases": [ 00:14:17.361 "40b2b37f-e410-4abc-847a-11be2f7fc9f7" 00:14:17.361 ], 00:14:17.361 "product_name": "Malloc disk", 00:14:17.361 "block_size": 512, 00:14:17.361 "num_blocks": 65536, 00:14:17.361 "uuid": "40b2b37f-e410-4abc-847a-11be2f7fc9f7", 00:14:17.361 "assigned_rate_limits": { 00:14:17.361 "rw_ios_per_sec": 0, 00:14:17.361 "rw_mbytes_per_sec": 0, 00:14:17.361 "r_mbytes_per_sec": 0, 00:14:17.361 "w_mbytes_per_sec": 0 00:14:17.361 }, 00:14:17.361 "claimed": false, 00:14:17.361 "zoned": false, 00:14:17.361 "supported_io_types": { 00:14:17.361 "read": true, 00:14:17.361 "write": true, 00:14:17.361 "unmap": true, 00:14:17.361 "flush": 
true, 00:14:17.361 "reset": true, 00:14:17.361 "nvme_admin": false, 00:14:17.361 "nvme_io": false, 00:14:17.361 "nvme_io_md": false, 00:14:17.361 "write_zeroes": true, 00:14:17.361 "zcopy": true, 00:14:17.361 "get_zone_info": false, 00:14:17.361 "zone_management": false, 00:14:17.361 "zone_append": false, 00:14:17.362 "compare": false, 00:14:17.362 "compare_and_write": false, 00:14:17.362 "abort": true, 00:14:17.362 "seek_hole": false, 00:14:17.362 "seek_data": false, 00:14:17.362 "copy": true, 00:14:17.362 "nvme_iov_md": false 00:14:17.362 }, 00:14:17.362 "memory_domains": [ 00:14:17.362 { 00:14:17.362 "dma_device_id": "system", 00:14:17.362 "dma_device_type": 1 00:14:17.362 }, 00:14:17.362 { 00:14:17.362 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:17.362 "dma_device_type": 2 00:14:17.362 } 00:14:17.362 ], 00:14:17.362 "driver_specific": {} 00:14:17.362 } 00:14:17.362 ] 00:14:17.362 08:27:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:17.362 08:27:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:17.362 08:27:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:17.362 08:27:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:17.362 BaseBdev3 00:14:17.362 08:27:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:14:17.362 08:27:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:17.362 08:27:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:17.362 08:27:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:17.362 08:27:29 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:17.362 08:27:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:17.362 08:27:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:17.620 08:27:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:17.879 [ 00:14:17.879 { 00:14:17.879 "name": "BaseBdev3", 00:14:17.879 "aliases": [ 00:14:17.879 "4298a9a5-2952-481a-8888-69e360dafc92" 00:14:17.879 ], 00:14:17.879 "product_name": "Malloc disk", 00:14:17.879 "block_size": 512, 00:14:17.879 "num_blocks": 65536, 00:14:17.879 "uuid": "4298a9a5-2952-481a-8888-69e360dafc92", 00:14:17.879 "assigned_rate_limits": { 00:14:17.879 "rw_ios_per_sec": 0, 00:14:17.879 "rw_mbytes_per_sec": 0, 00:14:17.879 "r_mbytes_per_sec": 0, 00:14:17.879 "w_mbytes_per_sec": 0 00:14:17.879 }, 00:14:17.879 "claimed": false, 00:14:17.879 "zoned": false, 00:14:17.879 "supported_io_types": { 00:14:17.879 "read": true, 00:14:17.879 "write": true, 00:14:17.879 "unmap": true, 00:14:17.879 "flush": true, 00:14:17.879 "reset": true, 00:14:17.879 "nvme_admin": false, 00:14:17.879 "nvme_io": false, 00:14:17.879 "nvme_io_md": false, 00:14:17.879 "write_zeroes": true, 00:14:17.879 "zcopy": true, 00:14:17.879 "get_zone_info": false, 00:14:17.879 "zone_management": false, 00:14:17.879 "zone_append": false, 00:14:17.879 "compare": false, 00:14:17.879 "compare_and_write": false, 00:14:17.879 "abort": true, 00:14:17.879 "seek_hole": false, 00:14:17.879 "seek_data": false, 00:14:17.879 "copy": true, 00:14:17.879 "nvme_iov_md": false 00:14:17.879 }, 00:14:17.879 "memory_domains": [ 00:14:17.879 { 00:14:17.879 "dma_device_id": "system", 00:14:17.879 "dma_device_type": 1 
00:14:17.879 }, 00:14:17.879 { 00:14:17.879 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:17.879 "dma_device_type": 2 00:14:17.879 } 00:14:17.879 ], 00:14:17.879 "driver_specific": {} 00:14:17.879 } 00:14:17.879 ] 00:14:17.879 08:27:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:17.879 08:27:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:17.879 08:27:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:17.879 08:27:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:17.879 [2024-07-23 08:27:30.378946] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:17.879 [2024-07-23 08:27:30.378983] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:17.879 [2024-07-23 08:27:30.379024] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:17.879 [2024-07-23 08:27:30.380643] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:17.879 08:27:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:17.879 08:27:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:17.879 08:27:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:17.879 08:27:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:17.879 08:27:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:17.879 08:27:30 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:17.879 08:27:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:17.879 08:27:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:17.879 08:27:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:17.879 08:27:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:18.174 08:27:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:18.174 08:27:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:18.174 08:27:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:18.174 "name": "Existed_Raid", 00:14:18.174 "uuid": "64c907a1-9991-4d76-93b5-4ce0ff1e9a6c", 00:14:18.174 "strip_size_kb": 64, 00:14:18.174 "state": "configuring", 00:14:18.174 "raid_level": "raid0", 00:14:18.174 "superblock": true, 00:14:18.174 "num_base_bdevs": 3, 00:14:18.174 "num_base_bdevs_discovered": 2, 00:14:18.174 "num_base_bdevs_operational": 3, 00:14:18.174 "base_bdevs_list": [ 00:14:18.174 { 00:14:18.174 "name": "BaseBdev1", 00:14:18.174 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:18.175 "is_configured": false, 00:14:18.175 "data_offset": 0, 00:14:18.175 "data_size": 0 00:14:18.175 }, 00:14:18.175 { 00:14:18.175 "name": "BaseBdev2", 00:14:18.175 "uuid": "40b2b37f-e410-4abc-847a-11be2f7fc9f7", 00:14:18.175 "is_configured": true, 00:14:18.175 "data_offset": 2048, 00:14:18.175 "data_size": 63488 00:14:18.175 }, 00:14:18.175 { 00:14:18.175 "name": "BaseBdev3", 00:14:18.175 "uuid": "4298a9a5-2952-481a-8888-69e360dafc92", 00:14:18.175 "is_configured": true, 00:14:18.175 "data_offset": 2048, 00:14:18.175 
"data_size": 63488 00:14:18.175 } 00:14:18.175 ] 00:14:18.175 }' 00:14:18.175 08:27:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:18.175 08:27:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:18.762 08:27:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:18.762 [2024-07-23 08:27:31.185034] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:18.762 08:27:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:18.762 08:27:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:18.762 08:27:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:18.762 08:27:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:18.762 08:27:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:18.762 08:27:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:18.762 08:27:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:18.762 08:27:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:18.762 08:27:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:18.762 08:27:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:18.762 08:27:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:14:18.762 08:27:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:19.021 08:27:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:19.021 "name": "Existed_Raid", 00:14:19.021 "uuid": "64c907a1-9991-4d76-93b5-4ce0ff1e9a6c", 00:14:19.021 "strip_size_kb": 64, 00:14:19.021 "state": "configuring", 00:14:19.021 "raid_level": "raid0", 00:14:19.021 "superblock": true, 00:14:19.021 "num_base_bdevs": 3, 00:14:19.021 "num_base_bdevs_discovered": 1, 00:14:19.021 "num_base_bdevs_operational": 3, 00:14:19.021 "base_bdevs_list": [ 00:14:19.021 { 00:14:19.021 "name": "BaseBdev1", 00:14:19.021 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:19.021 "is_configured": false, 00:14:19.021 "data_offset": 0, 00:14:19.021 "data_size": 0 00:14:19.021 }, 00:14:19.021 { 00:14:19.021 "name": null, 00:14:19.021 "uuid": "40b2b37f-e410-4abc-847a-11be2f7fc9f7", 00:14:19.021 "is_configured": false, 00:14:19.021 "data_offset": 2048, 00:14:19.021 "data_size": 63488 00:14:19.021 }, 00:14:19.021 { 00:14:19.021 "name": "BaseBdev3", 00:14:19.021 "uuid": "4298a9a5-2952-481a-8888-69e360dafc92", 00:14:19.021 "is_configured": true, 00:14:19.021 "data_offset": 2048, 00:14:19.021 "data_size": 63488 00:14:19.021 } 00:14:19.021 ] 00:14:19.021 }' 00:14:19.021 08:27:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:19.021 08:27:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:19.591 08:27:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:19.591 08:27:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:19.591 08:27:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == 
\f\a\l\s\e ]] 00:14:19.591 08:27:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:19.849 [2024-07-23 08:27:32.173980] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:19.849 BaseBdev1 00:14:19.849 08:27:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:14:19.849 08:27:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:19.849 08:27:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:19.849 08:27:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:19.849 08:27:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:19.849 08:27:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:19.850 08:27:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:19.850 08:27:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:20.108 [ 00:14:20.108 { 00:14:20.108 "name": "BaseBdev1", 00:14:20.108 "aliases": [ 00:14:20.108 "a0a7e145-8f16-4393-b81c-c55dea9d55db" 00:14:20.108 ], 00:14:20.108 "product_name": "Malloc disk", 00:14:20.108 "block_size": 512, 00:14:20.108 "num_blocks": 65536, 00:14:20.108 "uuid": "a0a7e145-8f16-4393-b81c-c55dea9d55db", 00:14:20.108 "assigned_rate_limits": { 00:14:20.108 "rw_ios_per_sec": 0, 00:14:20.108 "rw_mbytes_per_sec": 0, 00:14:20.108 "r_mbytes_per_sec": 0, 00:14:20.108 
"w_mbytes_per_sec": 0 00:14:20.108 }, 00:14:20.108 "claimed": true, 00:14:20.108 "claim_type": "exclusive_write", 00:14:20.108 "zoned": false, 00:14:20.108 "supported_io_types": { 00:14:20.108 "read": true, 00:14:20.108 "write": true, 00:14:20.108 "unmap": true, 00:14:20.108 "flush": true, 00:14:20.108 "reset": true, 00:14:20.108 "nvme_admin": false, 00:14:20.108 "nvme_io": false, 00:14:20.108 "nvme_io_md": false, 00:14:20.108 "write_zeroes": true, 00:14:20.108 "zcopy": true, 00:14:20.108 "get_zone_info": false, 00:14:20.108 "zone_management": false, 00:14:20.108 "zone_append": false, 00:14:20.108 "compare": false, 00:14:20.108 "compare_and_write": false, 00:14:20.108 "abort": true, 00:14:20.108 "seek_hole": false, 00:14:20.108 "seek_data": false, 00:14:20.108 "copy": true, 00:14:20.108 "nvme_iov_md": false 00:14:20.108 }, 00:14:20.108 "memory_domains": [ 00:14:20.108 { 00:14:20.108 "dma_device_id": "system", 00:14:20.108 "dma_device_type": 1 00:14:20.108 }, 00:14:20.108 { 00:14:20.108 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:20.108 "dma_device_type": 2 00:14:20.108 } 00:14:20.108 ], 00:14:20.108 "driver_specific": {} 00:14:20.108 } 00:14:20.108 ] 00:14:20.108 08:27:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:20.108 08:27:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:20.108 08:27:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:20.108 08:27:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:20.108 08:27:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:20.108 08:27:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:20.108 08:27:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:14:20.108 08:27:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:20.108 08:27:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:20.108 08:27:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:20.108 08:27:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:20.108 08:27:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:20.108 08:27:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:20.367 08:27:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:20.367 "name": "Existed_Raid", 00:14:20.367 "uuid": "64c907a1-9991-4d76-93b5-4ce0ff1e9a6c", 00:14:20.367 "strip_size_kb": 64, 00:14:20.367 "state": "configuring", 00:14:20.367 "raid_level": "raid0", 00:14:20.367 "superblock": true, 00:14:20.367 "num_base_bdevs": 3, 00:14:20.367 "num_base_bdevs_discovered": 2, 00:14:20.367 "num_base_bdevs_operational": 3, 00:14:20.367 "base_bdevs_list": [ 00:14:20.367 { 00:14:20.367 "name": "BaseBdev1", 00:14:20.367 "uuid": "a0a7e145-8f16-4393-b81c-c55dea9d55db", 00:14:20.367 "is_configured": true, 00:14:20.367 "data_offset": 2048, 00:14:20.367 "data_size": 63488 00:14:20.367 }, 00:14:20.367 { 00:14:20.367 "name": null, 00:14:20.367 "uuid": "40b2b37f-e410-4abc-847a-11be2f7fc9f7", 00:14:20.367 "is_configured": false, 00:14:20.367 "data_offset": 2048, 00:14:20.367 "data_size": 63488 00:14:20.367 }, 00:14:20.367 { 00:14:20.367 "name": "BaseBdev3", 00:14:20.367 "uuid": "4298a9a5-2952-481a-8888-69e360dafc92", 00:14:20.367 "is_configured": true, 00:14:20.367 "data_offset": 2048, 00:14:20.367 "data_size": 63488 00:14:20.367 } 
00:14:20.367 ] 00:14:20.367 }' 00:14:20.367 08:27:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:20.367 08:27:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:20.935 08:27:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:20.935 08:27:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:20.935 08:27:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:14:20.935 08:27:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:20.935 [2024-07-23 08:27:33.453431] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:21.195 08:27:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:21.195 08:27:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:21.195 08:27:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:21.195 08:27:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:21.195 08:27:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:21.195 08:27:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:21.195 08:27:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:21.195 08:27:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:21.195 
08:27:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:21.195 08:27:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:21.195 08:27:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:21.195 08:27:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:21.195 08:27:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:21.195 "name": "Existed_Raid", 00:14:21.195 "uuid": "64c907a1-9991-4d76-93b5-4ce0ff1e9a6c", 00:14:21.195 "strip_size_kb": 64, 00:14:21.195 "state": "configuring", 00:14:21.195 "raid_level": "raid0", 00:14:21.195 "superblock": true, 00:14:21.195 "num_base_bdevs": 3, 00:14:21.195 "num_base_bdevs_discovered": 1, 00:14:21.195 "num_base_bdevs_operational": 3, 00:14:21.195 "base_bdevs_list": [ 00:14:21.195 { 00:14:21.195 "name": "BaseBdev1", 00:14:21.195 "uuid": "a0a7e145-8f16-4393-b81c-c55dea9d55db", 00:14:21.195 "is_configured": true, 00:14:21.195 "data_offset": 2048, 00:14:21.195 "data_size": 63488 00:14:21.195 }, 00:14:21.195 { 00:14:21.195 "name": null, 00:14:21.195 "uuid": "40b2b37f-e410-4abc-847a-11be2f7fc9f7", 00:14:21.195 "is_configured": false, 00:14:21.195 "data_offset": 2048, 00:14:21.195 "data_size": 63488 00:14:21.195 }, 00:14:21.195 { 00:14:21.195 "name": null, 00:14:21.195 "uuid": "4298a9a5-2952-481a-8888-69e360dafc92", 00:14:21.195 "is_configured": false, 00:14:21.195 "data_offset": 2048, 00:14:21.195 "data_size": 63488 00:14:21.195 } 00:14:21.195 ] 00:14:21.195 }' 00:14:21.195 08:27:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:21.195 08:27:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:21.763 08:27:34 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:21.763 08:27:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:21.763 08:27:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:14:21.763 08:27:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:22.022 [2024-07-23 08:27:34.359813] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:22.022 08:27:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:22.022 08:27:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:22.022 08:27:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:22.022 08:27:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:22.022 08:27:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:22.022 08:27:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:22.022 08:27:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:22.022 08:27:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:22.022 08:27:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:22.022 08:27:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:22.022 08:27:34 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:22.022 08:27:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:22.022 08:27:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:22.022 "name": "Existed_Raid", 00:14:22.022 "uuid": "64c907a1-9991-4d76-93b5-4ce0ff1e9a6c", 00:14:22.022 "strip_size_kb": 64, 00:14:22.022 "state": "configuring", 00:14:22.022 "raid_level": "raid0", 00:14:22.022 "superblock": true, 00:14:22.022 "num_base_bdevs": 3, 00:14:22.022 "num_base_bdevs_discovered": 2, 00:14:22.022 "num_base_bdevs_operational": 3, 00:14:22.022 "base_bdevs_list": [ 00:14:22.022 { 00:14:22.022 "name": "BaseBdev1", 00:14:22.022 "uuid": "a0a7e145-8f16-4393-b81c-c55dea9d55db", 00:14:22.022 "is_configured": true, 00:14:22.022 "data_offset": 2048, 00:14:22.022 "data_size": 63488 00:14:22.022 }, 00:14:22.022 { 00:14:22.022 "name": null, 00:14:22.022 "uuid": "40b2b37f-e410-4abc-847a-11be2f7fc9f7", 00:14:22.022 "is_configured": false, 00:14:22.022 "data_offset": 2048, 00:14:22.022 "data_size": 63488 00:14:22.022 }, 00:14:22.022 { 00:14:22.022 "name": "BaseBdev3", 00:14:22.022 "uuid": "4298a9a5-2952-481a-8888-69e360dafc92", 00:14:22.022 "is_configured": true, 00:14:22.022 "data_offset": 2048, 00:14:22.022 "data_size": 63488 00:14:22.022 } 00:14:22.022 ] 00:14:22.022 }' 00:14:22.022 08:27:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:22.022 08:27:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:22.588 08:27:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:22.588 08:27:35 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:22.847 08:27:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:14:22.847 08:27:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:22.847 [2024-07-23 08:27:35.326462] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:23.106 08:27:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:23.106 08:27:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:23.106 08:27:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:23.106 08:27:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:23.106 08:27:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:23.106 08:27:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:23.106 08:27:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:23.106 08:27:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:23.106 08:27:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:23.106 08:27:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:23.106 08:27:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:23.106 08:27:35 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:23.106 08:27:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:23.106 "name": "Existed_Raid", 00:14:23.106 "uuid": "64c907a1-9991-4d76-93b5-4ce0ff1e9a6c", 00:14:23.106 "strip_size_kb": 64, 00:14:23.106 "state": "configuring", 00:14:23.106 "raid_level": "raid0", 00:14:23.106 "superblock": true, 00:14:23.106 "num_base_bdevs": 3, 00:14:23.106 "num_base_bdevs_discovered": 1, 00:14:23.106 "num_base_bdevs_operational": 3, 00:14:23.106 "base_bdevs_list": [ 00:14:23.106 { 00:14:23.106 "name": null, 00:14:23.106 "uuid": "a0a7e145-8f16-4393-b81c-c55dea9d55db", 00:14:23.106 "is_configured": false, 00:14:23.106 "data_offset": 2048, 00:14:23.106 "data_size": 63488 00:14:23.106 }, 00:14:23.106 { 00:14:23.106 "name": null, 00:14:23.106 "uuid": "40b2b37f-e410-4abc-847a-11be2f7fc9f7", 00:14:23.106 "is_configured": false, 00:14:23.106 "data_offset": 2048, 00:14:23.106 "data_size": 63488 00:14:23.106 }, 00:14:23.106 { 00:14:23.106 "name": "BaseBdev3", 00:14:23.106 "uuid": "4298a9a5-2952-481a-8888-69e360dafc92", 00:14:23.106 "is_configured": true, 00:14:23.106 "data_offset": 2048, 00:14:23.106 "data_size": 63488 00:14:23.106 } 00:14:23.106 ] 00:14:23.106 }' 00:14:23.106 08:27:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:23.106 08:27:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:23.674 08:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:23.674 08:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:23.933 08:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:14:23.933 08:27:36 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:23.933 [2024-07-23 08:27:36.394505] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:23.933 08:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:23.933 08:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:23.933 08:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:23.933 08:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:23.934 08:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:23.934 08:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:23.934 08:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:23.934 08:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:23.934 08:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:23.934 08:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:23.934 08:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:23.934 08:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:24.193 08:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:24.193 "name": 
"Existed_Raid", 00:14:24.193 "uuid": "64c907a1-9991-4d76-93b5-4ce0ff1e9a6c", 00:14:24.193 "strip_size_kb": 64, 00:14:24.193 "state": "configuring", 00:14:24.193 "raid_level": "raid0", 00:14:24.193 "superblock": true, 00:14:24.193 "num_base_bdevs": 3, 00:14:24.193 "num_base_bdevs_discovered": 2, 00:14:24.193 "num_base_bdevs_operational": 3, 00:14:24.193 "base_bdevs_list": [ 00:14:24.193 { 00:14:24.193 "name": null, 00:14:24.193 "uuid": "a0a7e145-8f16-4393-b81c-c55dea9d55db", 00:14:24.193 "is_configured": false, 00:14:24.193 "data_offset": 2048, 00:14:24.193 "data_size": 63488 00:14:24.193 }, 00:14:24.193 { 00:14:24.193 "name": "BaseBdev2", 00:14:24.193 "uuid": "40b2b37f-e410-4abc-847a-11be2f7fc9f7", 00:14:24.193 "is_configured": true, 00:14:24.193 "data_offset": 2048, 00:14:24.193 "data_size": 63488 00:14:24.193 }, 00:14:24.193 { 00:14:24.193 "name": "BaseBdev3", 00:14:24.193 "uuid": "4298a9a5-2952-481a-8888-69e360dafc92", 00:14:24.193 "is_configured": true, 00:14:24.193 "data_offset": 2048, 00:14:24.193 "data_size": 63488 00:14:24.193 } 00:14:24.193 ] 00:14:24.193 }' 00:14:24.193 08:27:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:24.193 08:27:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:24.761 08:27:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:24.761 08:27:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:24.761 08:27:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:14:24.761 08:27:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:24.761 08:27:37 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:25.020 08:27:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u a0a7e145-8f16-4393-b81c-c55dea9d55db 00:14:25.279 [2024-07-23 08:27:37.603175] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:25.279 [2024-07-23 08:27:37.603391] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000036980 00:14:25.279 [2024-07-23 08:27:37.603407] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:25.279 [2024-07-23 08:27:37.603692] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c200 00:14:25.280 [2024-07-23 08:27:37.603857] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000036980 00:14:25.280 [2024-07-23 08:27:37.603866] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000036980 00:14:25.280 [2024-07-23 08:27:37.604005] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:25.280 NewBaseBdev 00:14:25.280 08:27:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:25.280 08:27:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:14:25.280 08:27:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:25.280 08:27:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:25.280 08:27:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:25.280 08:27:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:25.280 
08:27:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:25.280 08:27:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:25.539 [ 00:14:25.539 { 00:14:25.539 "name": "NewBaseBdev", 00:14:25.539 "aliases": [ 00:14:25.539 "a0a7e145-8f16-4393-b81c-c55dea9d55db" 00:14:25.539 ], 00:14:25.539 "product_name": "Malloc disk", 00:14:25.539 "block_size": 512, 00:14:25.539 "num_blocks": 65536, 00:14:25.539 "uuid": "a0a7e145-8f16-4393-b81c-c55dea9d55db", 00:14:25.539 "assigned_rate_limits": { 00:14:25.539 "rw_ios_per_sec": 0, 00:14:25.539 "rw_mbytes_per_sec": 0, 00:14:25.539 "r_mbytes_per_sec": 0, 00:14:25.539 "w_mbytes_per_sec": 0 00:14:25.539 }, 00:14:25.539 "claimed": true, 00:14:25.539 "claim_type": "exclusive_write", 00:14:25.539 "zoned": false, 00:14:25.539 "supported_io_types": { 00:14:25.539 "read": true, 00:14:25.539 "write": true, 00:14:25.539 "unmap": true, 00:14:25.539 "flush": true, 00:14:25.539 "reset": true, 00:14:25.539 "nvme_admin": false, 00:14:25.539 "nvme_io": false, 00:14:25.539 "nvme_io_md": false, 00:14:25.539 "write_zeroes": true, 00:14:25.539 "zcopy": true, 00:14:25.539 "get_zone_info": false, 00:14:25.539 "zone_management": false, 00:14:25.539 "zone_append": false, 00:14:25.539 "compare": false, 00:14:25.539 "compare_and_write": false, 00:14:25.539 "abort": true, 00:14:25.539 "seek_hole": false, 00:14:25.539 "seek_data": false, 00:14:25.539 "copy": true, 00:14:25.539 "nvme_iov_md": false 00:14:25.539 }, 00:14:25.539 "memory_domains": [ 00:14:25.539 { 00:14:25.540 "dma_device_id": "system", 00:14:25.540 "dma_device_type": 1 00:14:25.540 }, 00:14:25.540 { 00:14:25.540 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:25.540 "dma_device_type": 2 
00:14:25.540 } 00:14:25.540 ], 00:14:25.540 "driver_specific": {} 00:14:25.540 } 00:14:25.540 ] 00:14:25.540 08:27:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:25.540 08:27:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:14:25.540 08:27:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:25.540 08:27:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:25.540 08:27:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:25.540 08:27:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:25.540 08:27:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:25.540 08:27:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:25.540 08:27:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:25.540 08:27:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:25.540 08:27:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:25.540 08:27:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:25.540 08:27:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:25.851 08:27:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:25.851 "name": "Existed_Raid", 00:14:25.851 "uuid": "64c907a1-9991-4d76-93b5-4ce0ff1e9a6c", 00:14:25.851 "strip_size_kb": 64, 00:14:25.851 "state": "online", 
00:14:25.851 "raid_level": "raid0", 00:14:25.851 "superblock": true, 00:14:25.851 "num_base_bdevs": 3, 00:14:25.852 "num_base_bdevs_discovered": 3, 00:14:25.852 "num_base_bdevs_operational": 3, 00:14:25.852 "base_bdevs_list": [ 00:14:25.852 { 00:14:25.852 "name": "NewBaseBdev", 00:14:25.852 "uuid": "a0a7e145-8f16-4393-b81c-c55dea9d55db", 00:14:25.852 "is_configured": true, 00:14:25.852 "data_offset": 2048, 00:14:25.852 "data_size": 63488 00:14:25.852 }, 00:14:25.852 { 00:14:25.852 "name": "BaseBdev2", 00:14:25.852 "uuid": "40b2b37f-e410-4abc-847a-11be2f7fc9f7", 00:14:25.852 "is_configured": true, 00:14:25.852 "data_offset": 2048, 00:14:25.852 "data_size": 63488 00:14:25.852 }, 00:14:25.852 { 00:14:25.852 "name": "BaseBdev3", 00:14:25.852 "uuid": "4298a9a5-2952-481a-8888-69e360dafc92", 00:14:25.852 "is_configured": true, 00:14:25.852 "data_offset": 2048, 00:14:25.852 "data_size": 63488 00:14:25.852 } 00:14:25.852 ] 00:14:25.852 }' 00:14:25.852 08:27:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:25.852 08:27:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:26.110 08:27:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:26.110 08:27:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:26.110 08:27:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:26.110 08:27:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:26.110 08:27:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:26.110 08:27:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:26.110 08:27:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:26.110 08:27:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:26.368 [2024-07-23 08:27:38.746524] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:26.368 08:27:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:26.368 "name": "Existed_Raid", 00:14:26.368 "aliases": [ 00:14:26.368 "64c907a1-9991-4d76-93b5-4ce0ff1e9a6c" 00:14:26.368 ], 00:14:26.368 "product_name": "Raid Volume", 00:14:26.368 "block_size": 512, 00:14:26.368 "num_blocks": 190464, 00:14:26.368 "uuid": "64c907a1-9991-4d76-93b5-4ce0ff1e9a6c", 00:14:26.368 "assigned_rate_limits": { 00:14:26.368 "rw_ios_per_sec": 0, 00:14:26.368 "rw_mbytes_per_sec": 0, 00:14:26.368 "r_mbytes_per_sec": 0, 00:14:26.368 "w_mbytes_per_sec": 0 00:14:26.368 }, 00:14:26.368 "claimed": false, 00:14:26.368 "zoned": false, 00:14:26.368 "supported_io_types": { 00:14:26.368 "read": true, 00:14:26.368 "write": true, 00:14:26.368 "unmap": true, 00:14:26.368 "flush": true, 00:14:26.368 "reset": true, 00:14:26.368 "nvme_admin": false, 00:14:26.368 "nvme_io": false, 00:14:26.368 "nvme_io_md": false, 00:14:26.368 "write_zeroes": true, 00:14:26.368 "zcopy": false, 00:14:26.368 "get_zone_info": false, 00:14:26.368 "zone_management": false, 00:14:26.368 "zone_append": false, 00:14:26.368 "compare": false, 00:14:26.368 "compare_and_write": false, 00:14:26.368 "abort": false, 00:14:26.368 "seek_hole": false, 00:14:26.368 "seek_data": false, 00:14:26.368 "copy": false, 00:14:26.368 "nvme_iov_md": false 00:14:26.368 }, 00:14:26.368 "memory_domains": [ 00:14:26.368 { 00:14:26.368 "dma_device_id": "system", 00:14:26.368 "dma_device_type": 1 00:14:26.368 }, 00:14:26.368 { 00:14:26.368 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:26.368 "dma_device_type": 2 00:14:26.368 }, 00:14:26.368 { 00:14:26.368 "dma_device_id": "system", 00:14:26.368 
"dma_device_type": 1 00:14:26.368 }, 00:14:26.368 { 00:14:26.368 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:26.368 "dma_device_type": 2 00:14:26.368 }, 00:14:26.368 { 00:14:26.368 "dma_device_id": "system", 00:14:26.368 "dma_device_type": 1 00:14:26.368 }, 00:14:26.368 { 00:14:26.368 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:26.368 "dma_device_type": 2 00:14:26.368 } 00:14:26.368 ], 00:14:26.368 "driver_specific": { 00:14:26.368 "raid": { 00:14:26.368 "uuid": "64c907a1-9991-4d76-93b5-4ce0ff1e9a6c", 00:14:26.368 "strip_size_kb": 64, 00:14:26.368 "state": "online", 00:14:26.368 "raid_level": "raid0", 00:14:26.368 "superblock": true, 00:14:26.368 "num_base_bdevs": 3, 00:14:26.368 "num_base_bdevs_discovered": 3, 00:14:26.368 "num_base_bdevs_operational": 3, 00:14:26.368 "base_bdevs_list": [ 00:14:26.368 { 00:14:26.368 "name": "NewBaseBdev", 00:14:26.368 "uuid": "a0a7e145-8f16-4393-b81c-c55dea9d55db", 00:14:26.368 "is_configured": true, 00:14:26.368 "data_offset": 2048, 00:14:26.368 "data_size": 63488 00:14:26.368 }, 00:14:26.368 { 00:14:26.368 "name": "BaseBdev2", 00:14:26.368 "uuid": "40b2b37f-e410-4abc-847a-11be2f7fc9f7", 00:14:26.368 "is_configured": true, 00:14:26.368 "data_offset": 2048, 00:14:26.368 "data_size": 63488 00:14:26.368 }, 00:14:26.368 { 00:14:26.368 "name": "BaseBdev3", 00:14:26.368 "uuid": "4298a9a5-2952-481a-8888-69e360dafc92", 00:14:26.368 "is_configured": true, 00:14:26.368 "data_offset": 2048, 00:14:26.368 "data_size": 63488 00:14:26.368 } 00:14:26.368 ] 00:14:26.368 } 00:14:26.368 } 00:14:26.368 }' 00:14:26.368 08:27:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:26.368 08:27:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:14:26.368 BaseBdev2 00:14:26.368 BaseBdev3' 00:14:26.368 08:27:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in 
$base_bdev_names 00:14:26.368 08:27:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:26.369 08:27:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:26.627 08:27:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:26.627 "name": "NewBaseBdev", 00:14:26.627 "aliases": [ 00:14:26.627 "a0a7e145-8f16-4393-b81c-c55dea9d55db" 00:14:26.627 ], 00:14:26.627 "product_name": "Malloc disk", 00:14:26.627 "block_size": 512, 00:14:26.627 "num_blocks": 65536, 00:14:26.627 "uuid": "a0a7e145-8f16-4393-b81c-c55dea9d55db", 00:14:26.627 "assigned_rate_limits": { 00:14:26.627 "rw_ios_per_sec": 0, 00:14:26.627 "rw_mbytes_per_sec": 0, 00:14:26.627 "r_mbytes_per_sec": 0, 00:14:26.627 "w_mbytes_per_sec": 0 00:14:26.627 }, 00:14:26.627 "claimed": true, 00:14:26.627 "claim_type": "exclusive_write", 00:14:26.627 "zoned": false, 00:14:26.627 "supported_io_types": { 00:14:26.627 "read": true, 00:14:26.627 "write": true, 00:14:26.627 "unmap": true, 00:14:26.627 "flush": true, 00:14:26.627 "reset": true, 00:14:26.627 "nvme_admin": false, 00:14:26.627 "nvme_io": false, 00:14:26.627 "nvme_io_md": false, 00:14:26.627 "write_zeroes": true, 00:14:26.627 "zcopy": true, 00:14:26.627 "get_zone_info": false, 00:14:26.627 "zone_management": false, 00:14:26.627 "zone_append": false, 00:14:26.627 "compare": false, 00:14:26.627 "compare_and_write": false, 00:14:26.627 "abort": true, 00:14:26.627 "seek_hole": false, 00:14:26.627 "seek_data": false, 00:14:26.627 "copy": true, 00:14:26.627 "nvme_iov_md": false 00:14:26.627 }, 00:14:26.627 "memory_domains": [ 00:14:26.627 { 00:14:26.627 "dma_device_id": "system", 00:14:26.627 "dma_device_type": 1 00:14:26.627 }, 00:14:26.627 { 00:14:26.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:26.627 "dma_device_type": 2 00:14:26.627 } 00:14:26.627 
], 00:14:26.627 "driver_specific": {} 00:14:26.627 }' 00:14:26.627 08:27:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:26.627 08:27:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:26.627 08:27:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:26.627 08:27:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:26.627 08:27:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:26.627 08:27:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:26.627 08:27:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:26.886 08:27:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:26.886 08:27:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:26.886 08:27:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:26.886 08:27:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:26.886 08:27:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:26.886 08:27:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:26.886 08:27:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:26.886 08:27:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:27.144 08:27:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:27.144 "name": "BaseBdev2", 00:14:27.144 "aliases": [ 00:14:27.144 "40b2b37f-e410-4abc-847a-11be2f7fc9f7" 00:14:27.144 ], 00:14:27.144 
"product_name": "Malloc disk", 00:14:27.144 "block_size": 512, 00:14:27.144 "num_blocks": 65536, 00:14:27.144 "uuid": "40b2b37f-e410-4abc-847a-11be2f7fc9f7", 00:14:27.144 "assigned_rate_limits": { 00:14:27.144 "rw_ios_per_sec": 0, 00:14:27.144 "rw_mbytes_per_sec": 0, 00:14:27.144 "r_mbytes_per_sec": 0, 00:14:27.144 "w_mbytes_per_sec": 0 00:14:27.144 }, 00:14:27.144 "claimed": true, 00:14:27.144 "claim_type": "exclusive_write", 00:14:27.144 "zoned": false, 00:14:27.144 "supported_io_types": { 00:14:27.144 "read": true, 00:14:27.144 "write": true, 00:14:27.144 "unmap": true, 00:14:27.144 "flush": true, 00:14:27.144 "reset": true, 00:14:27.144 "nvme_admin": false, 00:14:27.144 "nvme_io": false, 00:14:27.144 "nvme_io_md": false, 00:14:27.144 "write_zeroes": true, 00:14:27.144 "zcopy": true, 00:14:27.144 "get_zone_info": false, 00:14:27.144 "zone_management": false, 00:14:27.144 "zone_append": false, 00:14:27.144 "compare": false, 00:14:27.144 "compare_and_write": false, 00:14:27.144 "abort": true, 00:14:27.144 "seek_hole": false, 00:14:27.144 "seek_data": false, 00:14:27.144 "copy": true, 00:14:27.144 "nvme_iov_md": false 00:14:27.144 }, 00:14:27.144 "memory_domains": [ 00:14:27.144 { 00:14:27.144 "dma_device_id": "system", 00:14:27.144 "dma_device_type": 1 00:14:27.144 }, 00:14:27.144 { 00:14:27.144 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:27.144 "dma_device_type": 2 00:14:27.144 } 00:14:27.144 ], 00:14:27.144 "driver_specific": {} 00:14:27.144 }' 00:14:27.144 08:27:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:27.144 08:27:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:27.144 08:27:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:27.144 08:27:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:27.144 08:27:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:14:27.144 08:27:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:27.144 08:27:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:27.144 08:27:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:27.144 08:27:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:27.144 08:27:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:27.144 08:27:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:27.402 08:27:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:27.402 08:27:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:27.402 08:27:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:27.403 08:27:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:27.403 08:27:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:27.403 "name": "BaseBdev3", 00:14:27.403 "aliases": [ 00:14:27.403 "4298a9a5-2952-481a-8888-69e360dafc92" 00:14:27.403 ], 00:14:27.403 "product_name": "Malloc disk", 00:14:27.403 "block_size": 512, 00:14:27.403 "num_blocks": 65536, 00:14:27.403 "uuid": "4298a9a5-2952-481a-8888-69e360dafc92", 00:14:27.403 "assigned_rate_limits": { 00:14:27.403 "rw_ios_per_sec": 0, 00:14:27.403 "rw_mbytes_per_sec": 0, 00:14:27.403 "r_mbytes_per_sec": 0, 00:14:27.403 "w_mbytes_per_sec": 0 00:14:27.403 }, 00:14:27.403 "claimed": true, 00:14:27.403 "claim_type": "exclusive_write", 00:14:27.403 "zoned": false, 00:14:27.403 "supported_io_types": { 00:14:27.403 "read": true, 00:14:27.403 "write": true, 00:14:27.403 "unmap": 
true, 00:14:27.403 "flush": true, 00:14:27.403 "reset": true, 00:14:27.403 "nvme_admin": false, 00:14:27.403 "nvme_io": false, 00:14:27.403 "nvme_io_md": false, 00:14:27.403 "write_zeroes": true, 00:14:27.403 "zcopy": true, 00:14:27.403 "get_zone_info": false, 00:14:27.403 "zone_management": false, 00:14:27.403 "zone_append": false, 00:14:27.403 "compare": false, 00:14:27.403 "compare_and_write": false, 00:14:27.403 "abort": true, 00:14:27.403 "seek_hole": false, 00:14:27.403 "seek_data": false, 00:14:27.403 "copy": true, 00:14:27.403 "nvme_iov_md": false 00:14:27.403 }, 00:14:27.403 "memory_domains": [ 00:14:27.403 { 00:14:27.403 "dma_device_id": "system", 00:14:27.403 "dma_device_type": 1 00:14:27.403 }, 00:14:27.403 { 00:14:27.403 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:27.403 "dma_device_type": 2 00:14:27.403 } 00:14:27.403 ], 00:14:27.403 "driver_specific": {} 00:14:27.403 }' 00:14:27.403 08:27:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:27.403 08:27:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:27.661 08:27:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:27.661 08:27:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:27.661 08:27:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:27.661 08:27:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:27.661 08:27:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:27.661 08:27:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:27.661 08:27:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:27.661 08:27:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:27.661 08:27:40 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:27.661 08:27:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:27.661 08:27:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:27.920 [2024-07-23 08:27:40.330455] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:27.920 [2024-07-23 08:27:40.330482] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:27.920 [2024-07-23 08:27:40.330560] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:27.920 [2024-07-23 08:27:40.330621] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:27.920 [2024-07-23 08:27:40.330639] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036980 name Existed_Raid, state offline 00:14:27.920 08:27:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1428905 00:14:27.920 08:27:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1428905 ']' 00:14:27.920 08:27:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1428905 00:14:27.920 08:27:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:14:27.920 08:27:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:27.920 08:27:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1428905 00:14:27.920 08:27:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:27.920 08:27:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:14:27.920 08:27:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1428905' 00:14:27.920 killing process with pid 1428905 00:14:27.920 08:27:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1428905 00:14:27.920 [2024-07-23 08:27:40.388215] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:27.920 08:27:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1428905 00:14:28.178 [2024-07-23 08:27:40.638797] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:29.555 08:27:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:14:29.555 00:14:29.555 real 0m22.641s 00:14:29.555 user 0m40.375s 00:14:29.555 sys 0m3.233s 00:14:29.555 08:27:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:29.555 08:27:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:29.555 ************************************ 00:14:29.555 END TEST raid_state_function_test_sb 00:14:29.555 ************************************ 00:14:29.555 08:27:41 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:29.555 08:27:41 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 3 00:14:29.555 08:27:41 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:14:29.555 08:27:41 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:29.555 08:27:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:29.555 ************************************ 00:14:29.555 START TEST raid_superblock_test 00:14:29.555 ************************************ 00:14:29.555 08:27:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 3 00:14:29.555 08:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # 
local raid_level=raid0 00:14:29.555 08:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:14:29.555 08:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:14:29.555 08:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:14:29.555 08:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:14:29.555 08:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:14:29.555 08:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:14:29.555 08:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:14:29.555 08:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:14:29.555 08:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:14:29.555 08:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:14:29.555 08:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:14:29.555 08:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:14:29.555 08:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:14:29.555 08:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:14:29.555 08:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:14:29.555 08:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1433787 00:14:29.555 08:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1433787 /var/tmp/spdk-raid.sock 00:14:29.555 08:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r 
/var/tmp/spdk-raid.sock -L bdev_raid 00:14:29.555 08:27:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1433787 ']' 00:14:29.555 08:27:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:29.555 08:27:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:29.555 08:27:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:29.555 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:29.555 08:27:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:29.555 08:27:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:29.555 [2024-07-23 08:27:42.010365] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:14:29.555 [2024-07-23 08:27:42.010467] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1433787 ] 00:14:29.814 [2024-07-23 08:27:42.134518] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:30.072 [2024-07-23 08:27:42.344381] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:30.330 [2024-07-23 08:27:42.602240] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:30.330 [2024-07-23 08:27:42.602274] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:30.330 08:27:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:30.330 08:27:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:14:30.330 08:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:14:30.330 08:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:30.330 08:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:14:30.330 08:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:14:30.330 08:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:14:30.330 08:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:30.330 08:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:30.330 08:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:30.330 08:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:14:30.588 malloc1 00:14:30.588 08:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:30.847 [2024-07-23 08:27:43.133266] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:30.847 [2024-07-23 08:27:43.133316] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:30.847 [2024-07-23 08:27:43.133336] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034880 00:14:30.847 [2024-07-23 08:27:43.133364] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:30.847 [2024-07-23 08:27:43.135262] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:30.847 [2024-07-23 08:27:43.135289] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:30.847 pt1 00:14:30.847 08:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:30.847 08:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:30.847 08:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:14:30.847 08:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:14:30.847 08:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:14:30.847 08:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:30.847 08:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:30.847 08:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:30.848 08:27:43 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:14:30.848 malloc2 00:14:30.848 08:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:31.106 [2024-07-23 08:27:43.489995] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:31.106 [2024-07-23 08:27:43.490045] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:31.106 [2024-07-23 08:27:43.490080] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035480 00:14:31.106 [2024-07-23 08:27:43.490092] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:31.106 [2024-07-23 08:27:43.492096] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:31.106 [2024-07-23 08:27:43.492123] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:31.106 pt2 00:14:31.106 08:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:31.106 08:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:31.106 08:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:14:31.106 08:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:14:31.106 08:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:14:31.106 08:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:31.106 08:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:31.106 
08:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:31.106 08:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:14:31.364 malloc3 00:14:31.364 08:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:31.364 [2024-07-23 08:27:43.858340] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:31.364 [2024-07-23 08:27:43.858392] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:31.364 [2024-07-23 08:27:43.858414] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036080 00:14:31.364 [2024-07-23 08:27:43.858423] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:31.364 [2024-07-23 08:27:43.860469] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:31.364 [2024-07-23 08:27:43.860496] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:31.364 pt3 00:14:31.364 08:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:31.364 08:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:31.364 08:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:14:31.622 [2024-07-23 08:27:44.022827] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:31.622 [2024-07-23 08:27:44.024478] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 
is claimed 00:14:31.622 [2024-07-23 08:27:44.024542] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:31.622 [2024-07-23 08:27:44.024732] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000036680 00:14:31.622 [2024-07-23 08:27:44.024748] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:31.622 [2024-07-23 08:27:44.025013] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bec0 00:14:31.622 [2024-07-23 08:27:44.025210] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000036680 00:14:31.622 [2024-07-23 08:27:44.025220] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000036680 00:14:31.622 [2024-07-23 08:27:44.025398] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:31.622 08:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:31.622 08:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:31.623 08:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:31.623 08:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:31.623 08:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:31.623 08:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:31.623 08:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:31.623 08:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:31.623 08:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:31.623 08:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- 
# local tmp 00:14:31.623 08:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:31.623 08:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:31.880 08:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:31.880 "name": "raid_bdev1", 00:14:31.880 "uuid": "bf3b9e45-0975-4669-928d-1dda3561662b", 00:14:31.880 "strip_size_kb": 64, 00:14:31.880 "state": "online", 00:14:31.880 "raid_level": "raid0", 00:14:31.880 "superblock": true, 00:14:31.880 "num_base_bdevs": 3, 00:14:31.880 "num_base_bdevs_discovered": 3, 00:14:31.880 "num_base_bdevs_operational": 3, 00:14:31.880 "base_bdevs_list": [ 00:14:31.880 { 00:14:31.880 "name": "pt1", 00:14:31.880 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:31.880 "is_configured": true, 00:14:31.880 "data_offset": 2048, 00:14:31.880 "data_size": 63488 00:14:31.880 }, 00:14:31.880 { 00:14:31.880 "name": "pt2", 00:14:31.880 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:31.880 "is_configured": true, 00:14:31.880 "data_offset": 2048, 00:14:31.880 "data_size": 63488 00:14:31.880 }, 00:14:31.880 { 00:14:31.880 "name": "pt3", 00:14:31.880 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:31.880 "is_configured": true, 00:14:31.880 "data_offset": 2048, 00:14:31.880 "data_size": 63488 00:14:31.880 } 00:14:31.880 ] 00:14:31.880 }' 00:14:31.880 08:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:31.880 08:27:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:32.447 08:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:14:32.447 08:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:32.447 08:27:44 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:32.447 08:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:32.447 08:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:32.447 08:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:32.447 08:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:32.447 08:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:32.447 [2024-07-23 08:27:44.841170] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:32.447 08:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:32.447 "name": "raid_bdev1", 00:14:32.447 "aliases": [ 00:14:32.447 "bf3b9e45-0975-4669-928d-1dda3561662b" 00:14:32.447 ], 00:14:32.447 "product_name": "Raid Volume", 00:14:32.447 "block_size": 512, 00:14:32.447 "num_blocks": 190464, 00:14:32.447 "uuid": "bf3b9e45-0975-4669-928d-1dda3561662b", 00:14:32.447 "assigned_rate_limits": { 00:14:32.447 "rw_ios_per_sec": 0, 00:14:32.447 "rw_mbytes_per_sec": 0, 00:14:32.447 "r_mbytes_per_sec": 0, 00:14:32.447 "w_mbytes_per_sec": 0 00:14:32.447 }, 00:14:32.447 "claimed": false, 00:14:32.447 "zoned": false, 00:14:32.447 "supported_io_types": { 00:14:32.447 "read": true, 00:14:32.447 "write": true, 00:14:32.447 "unmap": true, 00:14:32.447 "flush": true, 00:14:32.447 "reset": true, 00:14:32.447 "nvme_admin": false, 00:14:32.447 "nvme_io": false, 00:14:32.447 "nvme_io_md": false, 00:14:32.447 "write_zeroes": true, 00:14:32.447 "zcopy": false, 00:14:32.447 "get_zone_info": false, 00:14:32.447 "zone_management": false, 00:14:32.447 "zone_append": false, 00:14:32.447 "compare": false, 00:14:32.447 "compare_and_write": false, 00:14:32.447 
"abort": false, 00:14:32.447 "seek_hole": false, 00:14:32.447 "seek_data": false, 00:14:32.447 "copy": false, 00:14:32.447 "nvme_iov_md": false 00:14:32.447 }, 00:14:32.447 "memory_domains": [ 00:14:32.447 { 00:14:32.447 "dma_device_id": "system", 00:14:32.447 "dma_device_type": 1 00:14:32.447 }, 00:14:32.447 { 00:14:32.447 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:32.447 "dma_device_type": 2 00:14:32.447 }, 00:14:32.447 { 00:14:32.447 "dma_device_id": "system", 00:14:32.447 "dma_device_type": 1 00:14:32.447 }, 00:14:32.447 { 00:14:32.447 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:32.447 "dma_device_type": 2 00:14:32.447 }, 00:14:32.447 { 00:14:32.447 "dma_device_id": "system", 00:14:32.447 "dma_device_type": 1 00:14:32.447 }, 00:14:32.447 { 00:14:32.447 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:32.447 "dma_device_type": 2 00:14:32.447 } 00:14:32.447 ], 00:14:32.447 "driver_specific": { 00:14:32.447 "raid": { 00:14:32.447 "uuid": "bf3b9e45-0975-4669-928d-1dda3561662b", 00:14:32.447 "strip_size_kb": 64, 00:14:32.447 "state": "online", 00:14:32.447 "raid_level": "raid0", 00:14:32.447 "superblock": true, 00:14:32.447 "num_base_bdevs": 3, 00:14:32.447 "num_base_bdevs_discovered": 3, 00:14:32.447 "num_base_bdevs_operational": 3, 00:14:32.447 "base_bdevs_list": [ 00:14:32.447 { 00:14:32.447 "name": "pt1", 00:14:32.447 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:32.447 "is_configured": true, 00:14:32.447 "data_offset": 2048, 00:14:32.447 "data_size": 63488 00:14:32.448 }, 00:14:32.448 { 00:14:32.448 "name": "pt2", 00:14:32.448 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:32.448 "is_configured": true, 00:14:32.448 "data_offset": 2048, 00:14:32.448 "data_size": 63488 00:14:32.448 }, 00:14:32.448 { 00:14:32.448 "name": "pt3", 00:14:32.448 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:32.448 "is_configured": true, 00:14:32.448 "data_offset": 2048, 00:14:32.448 "data_size": 63488 00:14:32.448 } 00:14:32.448 ] 00:14:32.448 } 
00:14:32.448 } 00:14:32.448 }' 00:14:32.448 08:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:32.448 08:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:32.448 pt2 00:14:32.448 pt3' 00:14:32.448 08:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:32.448 08:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:32.448 08:27:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:32.715 08:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:32.715 "name": "pt1", 00:14:32.715 "aliases": [ 00:14:32.715 "00000000-0000-0000-0000-000000000001" 00:14:32.715 ], 00:14:32.715 "product_name": "passthru", 00:14:32.715 "block_size": 512, 00:14:32.715 "num_blocks": 65536, 00:14:32.715 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:32.715 "assigned_rate_limits": { 00:14:32.715 "rw_ios_per_sec": 0, 00:14:32.715 "rw_mbytes_per_sec": 0, 00:14:32.715 "r_mbytes_per_sec": 0, 00:14:32.715 "w_mbytes_per_sec": 0 00:14:32.715 }, 00:14:32.715 "claimed": true, 00:14:32.715 "claim_type": "exclusive_write", 00:14:32.715 "zoned": false, 00:14:32.715 "supported_io_types": { 00:14:32.715 "read": true, 00:14:32.715 "write": true, 00:14:32.715 "unmap": true, 00:14:32.715 "flush": true, 00:14:32.715 "reset": true, 00:14:32.715 "nvme_admin": false, 00:14:32.715 "nvme_io": false, 00:14:32.715 "nvme_io_md": false, 00:14:32.715 "write_zeroes": true, 00:14:32.715 "zcopy": true, 00:14:32.715 "get_zone_info": false, 00:14:32.715 "zone_management": false, 00:14:32.715 "zone_append": false, 00:14:32.715 "compare": false, 00:14:32.715 "compare_and_write": false, 00:14:32.715 "abort": true, 00:14:32.715 
"seek_hole": false, 00:14:32.715 "seek_data": false, 00:14:32.715 "copy": true, 00:14:32.715 "nvme_iov_md": false 00:14:32.715 }, 00:14:32.715 "memory_domains": [ 00:14:32.715 { 00:14:32.715 "dma_device_id": "system", 00:14:32.715 "dma_device_type": 1 00:14:32.715 }, 00:14:32.715 { 00:14:32.715 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:32.715 "dma_device_type": 2 00:14:32.715 } 00:14:32.715 ], 00:14:32.715 "driver_specific": { 00:14:32.715 "passthru": { 00:14:32.715 "name": "pt1", 00:14:32.715 "base_bdev_name": "malloc1" 00:14:32.715 } 00:14:32.715 } 00:14:32.715 }' 00:14:32.715 08:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:32.715 08:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:32.715 08:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:32.715 08:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:32.715 08:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:32.715 08:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:32.715 08:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:32.973 08:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:32.973 08:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:32.973 08:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:32.973 08:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:32.973 08:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:32.973 08:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:32.973 08:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:32.973 08:27:45 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:33.231 08:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:33.231 "name": "pt2", 00:14:33.231 "aliases": [ 00:14:33.231 "00000000-0000-0000-0000-000000000002" 00:14:33.231 ], 00:14:33.231 "product_name": "passthru", 00:14:33.231 "block_size": 512, 00:14:33.231 "num_blocks": 65536, 00:14:33.231 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:33.231 "assigned_rate_limits": { 00:14:33.231 "rw_ios_per_sec": 0, 00:14:33.231 "rw_mbytes_per_sec": 0, 00:14:33.231 "r_mbytes_per_sec": 0, 00:14:33.231 "w_mbytes_per_sec": 0 00:14:33.231 }, 00:14:33.231 "claimed": true, 00:14:33.231 "claim_type": "exclusive_write", 00:14:33.231 "zoned": false, 00:14:33.231 "supported_io_types": { 00:14:33.231 "read": true, 00:14:33.231 "write": true, 00:14:33.231 "unmap": true, 00:14:33.231 "flush": true, 00:14:33.231 "reset": true, 00:14:33.231 "nvme_admin": false, 00:14:33.231 "nvme_io": false, 00:14:33.231 "nvme_io_md": false, 00:14:33.231 "write_zeroes": true, 00:14:33.231 "zcopy": true, 00:14:33.231 "get_zone_info": false, 00:14:33.231 "zone_management": false, 00:14:33.231 "zone_append": false, 00:14:33.231 "compare": false, 00:14:33.231 "compare_and_write": false, 00:14:33.231 "abort": true, 00:14:33.231 "seek_hole": false, 00:14:33.231 "seek_data": false, 00:14:33.231 "copy": true, 00:14:33.231 "nvme_iov_md": false 00:14:33.231 }, 00:14:33.231 "memory_domains": [ 00:14:33.231 { 00:14:33.231 "dma_device_id": "system", 00:14:33.231 "dma_device_type": 1 00:14:33.231 }, 00:14:33.231 { 00:14:33.231 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:33.231 "dma_device_type": 2 00:14:33.231 } 00:14:33.231 ], 00:14:33.231 "driver_specific": { 00:14:33.231 "passthru": { 00:14:33.231 "name": "pt2", 00:14:33.231 "base_bdev_name": "malloc2" 00:14:33.231 } 00:14:33.231 } 
00:14:33.231 }' 00:14:33.231 08:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:33.231 08:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:33.231 08:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:33.231 08:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:33.231 08:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:33.231 08:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:33.231 08:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:33.231 08:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:33.231 08:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:33.231 08:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:33.491 08:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:33.491 08:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:33.491 08:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:33.491 08:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:33.491 08:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:33.491 08:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:33.491 "name": "pt3", 00:14:33.491 "aliases": [ 00:14:33.491 "00000000-0000-0000-0000-000000000003" 00:14:33.491 ], 00:14:33.491 "product_name": "passthru", 00:14:33.491 "block_size": 512, 00:14:33.491 "num_blocks": 65536, 00:14:33.491 "uuid": "00000000-0000-0000-0000-000000000003", 
00:14:33.491 "assigned_rate_limits": { 00:14:33.491 "rw_ios_per_sec": 0, 00:14:33.491 "rw_mbytes_per_sec": 0, 00:14:33.491 "r_mbytes_per_sec": 0, 00:14:33.491 "w_mbytes_per_sec": 0 00:14:33.491 }, 00:14:33.491 "claimed": true, 00:14:33.491 "claim_type": "exclusive_write", 00:14:33.491 "zoned": false, 00:14:33.491 "supported_io_types": { 00:14:33.491 "read": true, 00:14:33.491 "write": true, 00:14:33.491 "unmap": true, 00:14:33.491 "flush": true, 00:14:33.491 "reset": true, 00:14:33.491 "nvme_admin": false, 00:14:33.491 "nvme_io": false, 00:14:33.491 "nvme_io_md": false, 00:14:33.491 "write_zeroes": true, 00:14:33.491 "zcopy": true, 00:14:33.491 "get_zone_info": false, 00:14:33.491 "zone_management": false, 00:14:33.491 "zone_append": false, 00:14:33.491 "compare": false, 00:14:33.491 "compare_and_write": false, 00:14:33.491 "abort": true, 00:14:33.491 "seek_hole": false, 00:14:33.491 "seek_data": false, 00:14:33.491 "copy": true, 00:14:33.491 "nvme_iov_md": false 00:14:33.491 }, 00:14:33.491 "memory_domains": [ 00:14:33.491 { 00:14:33.491 "dma_device_id": "system", 00:14:33.491 "dma_device_type": 1 00:14:33.491 }, 00:14:33.491 { 00:14:33.491 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:33.491 "dma_device_type": 2 00:14:33.491 } 00:14:33.491 ], 00:14:33.491 "driver_specific": { 00:14:33.491 "passthru": { 00:14:33.491 "name": "pt3", 00:14:33.491 "base_bdev_name": "malloc3" 00:14:33.491 } 00:14:33.491 } 00:14:33.491 }' 00:14:33.491 08:27:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:33.750 08:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:33.750 08:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:33.750 08:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:33.750 08:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:33.750 08:27:46 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:33.750 08:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:33.750 08:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:33.750 08:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:33.750 08:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:34.008 08:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:34.008 08:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:34.008 08:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:34.008 08:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:14:34.008 [2024-07-23 08:27:46.457439] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:34.008 08:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=bf3b9e45-0975-4669-928d-1dda3561662b 00:14:34.008 08:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z bf3b9e45-0975-4669-928d-1dda3561662b ']' 00:14:34.008 08:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:34.266 [2024-07-23 08:27:46.625623] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:34.266 [2024-07-23 08:27:46.625651] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:34.266 [2024-07-23 08:27:46.625724] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:34.266 [2024-07-23 08:27:46.625783] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: 
*DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:34.266 [2024-07-23 08:27:46.625794] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036680 name raid_bdev1, state offline 00:14:34.266 08:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:14:34.266 08:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:34.524 08:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:14:34.524 08:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:14:34.524 08:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:34.524 08:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:14:34.524 08:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:34.524 08:27:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:34.783 08:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:34.783 08:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:14:34.783 08:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:14:34.783 08:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:14:35.042 08:27:47 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:14:35.042 08:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:35.042 08:27:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:14:35.042 08:27:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:35.042 08:27:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:35.042 08:27:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:35.042 08:27:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:35.042 08:27:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:35.042 08:27:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:35.042 08:27:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:35.042 08:27:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:35.042 08:27:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:14:35.042 08:27:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:35.301 [2024-07-23 08:27:47.640255] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:14:35.301 [2024-07-23 08:27:47.641866] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:14:35.301 [2024-07-23 08:27:47.641915] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:14:35.301 [2024-07-23 08:27:47.641961] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:14:35.301 [2024-07-23 08:27:47.642004] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:14:35.301 [2024-07-23 08:27:47.642038] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:14:35.301 [2024-07-23 08:27:47.642053] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:35.301 [2024-07-23 08:27:47.642063] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036c80 name raid_bdev1, state configuring 00:14:35.301 request: 00:14:35.301 { 00:14:35.301 "name": "raid_bdev1", 00:14:35.301 "raid_level": "raid0", 00:14:35.301 "base_bdevs": [ 00:14:35.301 "malloc1", 00:14:35.301 "malloc2", 00:14:35.301 "malloc3" 00:14:35.301 ], 00:14:35.301 "strip_size_kb": 64, 00:14:35.301 "superblock": false, 00:14:35.301 "method": "bdev_raid_create", 00:14:35.301 "req_id": 1 00:14:35.301 } 00:14:35.301 Got JSON-RPC error response 00:14:35.301 response: 00:14:35.301 { 00:14:35.301 "code": -17, 00:14:35.301 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:14:35.301 } 00:14:35.301 08:27:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:14:35.301 
08:27:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:14:35.301 08:27:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:14:35.301 08:27:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:14:35.301 08:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:35.301 08:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:14:35.560 08:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:14:35.560 08:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:14:35.560 08:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:35.560 [2024-07-23 08:27:47.985119] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:35.560 [2024-07-23 08:27:47.985171] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:35.560 [2024-07-23 08:27:47.985190] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000037280 00:14:35.560 [2024-07-23 08:27:47.985199] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:35.560 [2024-07-23 08:27:47.987197] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:35.560 [2024-07-23 08:27:47.987223] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:35.560 [2024-07-23 08:27:47.987300] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:14:35.560 [2024-07-23 08:27:47.987365] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is 
claimed 00:14:35.560 pt1 00:14:35.560 08:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:14:35.560 08:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:35.560 08:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:35.560 08:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:35.560 08:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:35.560 08:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:35.560 08:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:35.560 08:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:35.560 08:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:35.560 08:27:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:35.560 08:27:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:35.560 08:27:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:35.819 08:27:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:35.819 "name": "raid_bdev1", 00:14:35.819 "uuid": "bf3b9e45-0975-4669-928d-1dda3561662b", 00:14:35.819 "strip_size_kb": 64, 00:14:35.819 "state": "configuring", 00:14:35.819 "raid_level": "raid0", 00:14:35.819 "superblock": true, 00:14:35.819 "num_base_bdevs": 3, 00:14:35.819 "num_base_bdevs_discovered": 1, 00:14:35.819 "num_base_bdevs_operational": 3, 00:14:35.819 "base_bdevs_list": [ 00:14:35.819 { 00:14:35.819 "name": 
"pt1", 00:14:35.819 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:35.819 "is_configured": true, 00:14:35.819 "data_offset": 2048, 00:14:35.819 "data_size": 63488 00:14:35.819 }, 00:14:35.819 { 00:14:35.819 "name": null, 00:14:35.819 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:35.819 "is_configured": false, 00:14:35.819 "data_offset": 2048, 00:14:35.819 "data_size": 63488 00:14:35.819 }, 00:14:35.819 { 00:14:35.819 "name": null, 00:14:35.819 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:35.819 "is_configured": false, 00:14:35.819 "data_offset": 2048, 00:14:35.819 "data_size": 63488 00:14:35.819 } 00:14:35.819 ] 00:14:35.819 }' 00:14:35.819 08:27:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:35.819 08:27:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:36.387 08:27:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:14:36.387 08:27:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:36.387 [2024-07-23 08:27:48.787241] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:36.387 [2024-07-23 08:27:48.787312] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:36.387 [2024-07-23 08:27:48.787333] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000037b80 00:14:36.387 [2024-07-23 08:27:48.787342] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:36.387 [2024-07-23 08:27:48.787808] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:36.387 [2024-07-23 08:27:48.787826] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:36.387 [2024-07-23 08:27:48.787903] 
bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:36.387 [2024-07-23 08:27:48.787924] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:36.387 pt2 00:14:36.387 08:27:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:36.646 [2024-07-23 08:27:48.955727] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:14:36.646 08:27:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:14:36.646 08:27:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:36.646 08:27:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:36.646 08:27:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:36.646 08:27:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:36.646 08:27:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:36.646 08:27:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:36.646 08:27:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:36.646 08:27:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:36.646 08:27:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:36.646 08:27:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:36.646 08:27:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:36.646 08:27:49 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:36.646 "name": "raid_bdev1", 00:14:36.646 "uuid": "bf3b9e45-0975-4669-928d-1dda3561662b", 00:14:36.646 "strip_size_kb": 64, 00:14:36.646 "state": "configuring", 00:14:36.646 "raid_level": "raid0", 00:14:36.646 "superblock": true, 00:14:36.646 "num_base_bdevs": 3, 00:14:36.646 "num_base_bdevs_discovered": 1, 00:14:36.646 "num_base_bdevs_operational": 3, 00:14:36.646 "base_bdevs_list": [ 00:14:36.646 { 00:14:36.646 "name": "pt1", 00:14:36.646 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:36.646 "is_configured": true, 00:14:36.646 "data_offset": 2048, 00:14:36.646 "data_size": 63488 00:14:36.646 }, 00:14:36.646 { 00:14:36.646 "name": null, 00:14:36.646 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:36.646 "is_configured": false, 00:14:36.646 "data_offset": 2048, 00:14:36.646 "data_size": 63488 00:14:36.646 }, 00:14:36.646 { 00:14:36.646 "name": null, 00:14:36.646 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:36.646 "is_configured": false, 00:14:36.646 "data_offset": 2048, 00:14:36.646 "data_size": 63488 00:14:36.646 } 00:14:36.646 ] 00:14:36.646 }' 00:14:36.646 08:27:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:36.646 08:27:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:37.213 08:27:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:14:37.213 08:27:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:37.214 08:27:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:37.472 [2024-07-23 08:27:49.777850] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:37.472 [2024-07-23 08:27:49.777913] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:37.472 [2024-07-23 08:27:49.777929] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000037e80 00:14:37.472 [2024-07-23 08:27:49.777940] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:37.472 [2024-07-23 08:27:49.778397] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:37.472 [2024-07-23 08:27:49.778418] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:37.472 [2024-07-23 08:27:49.778488] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:37.472 [2024-07-23 08:27:49.778512] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:37.472 pt2 00:14:37.472 08:27:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:14:37.472 08:27:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:37.472 08:27:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:37.472 [2024-07-23 08:27:49.934252] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:37.472 [2024-07-23 08:27:49.934299] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:37.472 [2024-07-23 08:27:49.934329] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000038180 00:14:37.472 [2024-07-23 08:27:49.934340] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:37.472 [2024-07-23 08:27:49.934791] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:37.472 [2024-07-23 08:27:49.934810] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:37.472 
[2024-07-23 08:27:49.934876] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:14:37.472 [2024-07-23 08:27:49.934902] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:37.472 [2024-07-23 08:27:49.935040] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000037880 00:14:37.472 [2024-07-23 08:27:49.935052] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:37.472 [2024-07-23 08:27:49.935313] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bf90 00:14:37.472 [2024-07-23 08:27:49.935491] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000037880 00:14:37.472 [2024-07-23 08:27:49.935500] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000037880 00:14:37.472 [2024-07-23 08:27:49.935691] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:37.472 pt3 00:14:37.472 08:27:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:14:37.472 08:27:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:37.472 08:27:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:37.472 08:27:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:37.472 08:27:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:37.472 08:27:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:37.473 08:27:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:37.473 08:27:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:37.473 08:27:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # 
local raid_bdev_info 00:14:37.473 08:27:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:37.473 08:27:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:37.473 08:27:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:37.473 08:27:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:37.473 08:27:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:37.732 08:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:37.732 "name": "raid_bdev1", 00:14:37.732 "uuid": "bf3b9e45-0975-4669-928d-1dda3561662b", 00:14:37.732 "strip_size_kb": 64, 00:14:37.732 "state": "online", 00:14:37.732 "raid_level": "raid0", 00:14:37.732 "superblock": true, 00:14:37.732 "num_base_bdevs": 3, 00:14:37.732 "num_base_bdevs_discovered": 3, 00:14:37.732 "num_base_bdevs_operational": 3, 00:14:37.732 "base_bdevs_list": [ 00:14:37.732 { 00:14:37.732 "name": "pt1", 00:14:37.732 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:37.732 "is_configured": true, 00:14:37.732 "data_offset": 2048, 00:14:37.732 "data_size": 63488 00:14:37.732 }, 00:14:37.732 { 00:14:37.732 "name": "pt2", 00:14:37.732 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:37.732 "is_configured": true, 00:14:37.732 "data_offset": 2048, 00:14:37.732 "data_size": 63488 00:14:37.732 }, 00:14:37.732 { 00:14:37.732 "name": "pt3", 00:14:37.732 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:37.732 "is_configured": true, 00:14:37.732 "data_offset": 2048, 00:14:37.732 "data_size": 63488 00:14:37.732 } 00:14:37.732 ] 00:14:37.732 }' 00:14:37.732 08:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:37.732 08:27:50 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@10 -- # set +x 00:14:38.299 08:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:14:38.299 08:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:38.299 08:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:38.299 08:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:38.299 08:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:38.299 08:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:38.299 08:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:38.299 08:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:38.299 [2024-07-23 08:27:50.776761] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:38.299 08:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:38.299 "name": "raid_bdev1", 00:14:38.299 "aliases": [ 00:14:38.299 "bf3b9e45-0975-4669-928d-1dda3561662b" 00:14:38.299 ], 00:14:38.299 "product_name": "Raid Volume", 00:14:38.299 "block_size": 512, 00:14:38.299 "num_blocks": 190464, 00:14:38.299 "uuid": "bf3b9e45-0975-4669-928d-1dda3561662b", 00:14:38.299 "assigned_rate_limits": { 00:14:38.299 "rw_ios_per_sec": 0, 00:14:38.299 "rw_mbytes_per_sec": 0, 00:14:38.299 "r_mbytes_per_sec": 0, 00:14:38.299 "w_mbytes_per_sec": 0 00:14:38.299 }, 00:14:38.299 "claimed": false, 00:14:38.299 "zoned": false, 00:14:38.299 "supported_io_types": { 00:14:38.299 "read": true, 00:14:38.299 "write": true, 00:14:38.299 "unmap": true, 00:14:38.299 "flush": true, 00:14:38.299 "reset": true, 00:14:38.299 "nvme_admin": false, 00:14:38.299 "nvme_io": false, 
00:14:38.299 "nvme_io_md": false, 00:14:38.299 "write_zeroes": true, 00:14:38.299 "zcopy": false, 00:14:38.299 "get_zone_info": false, 00:14:38.299 "zone_management": false, 00:14:38.299 "zone_append": false, 00:14:38.299 "compare": false, 00:14:38.299 "compare_and_write": false, 00:14:38.299 "abort": false, 00:14:38.299 "seek_hole": false, 00:14:38.299 "seek_data": false, 00:14:38.299 "copy": false, 00:14:38.299 "nvme_iov_md": false 00:14:38.300 }, 00:14:38.300 "memory_domains": [ 00:14:38.300 { 00:14:38.300 "dma_device_id": "system", 00:14:38.300 "dma_device_type": 1 00:14:38.300 }, 00:14:38.300 { 00:14:38.300 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:38.300 "dma_device_type": 2 00:14:38.300 }, 00:14:38.300 { 00:14:38.300 "dma_device_id": "system", 00:14:38.300 "dma_device_type": 1 00:14:38.300 }, 00:14:38.300 { 00:14:38.300 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:38.300 "dma_device_type": 2 00:14:38.300 }, 00:14:38.300 { 00:14:38.300 "dma_device_id": "system", 00:14:38.300 "dma_device_type": 1 00:14:38.300 }, 00:14:38.300 { 00:14:38.300 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:38.300 "dma_device_type": 2 00:14:38.300 } 00:14:38.300 ], 00:14:38.300 "driver_specific": { 00:14:38.300 "raid": { 00:14:38.300 "uuid": "bf3b9e45-0975-4669-928d-1dda3561662b", 00:14:38.300 "strip_size_kb": 64, 00:14:38.300 "state": "online", 00:14:38.300 "raid_level": "raid0", 00:14:38.300 "superblock": true, 00:14:38.300 "num_base_bdevs": 3, 00:14:38.300 "num_base_bdevs_discovered": 3, 00:14:38.300 "num_base_bdevs_operational": 3, 00:14:38.300 "base_bdevs_list": [ 00:14:38.300 { 00:14:38.300 "name": "pt1", 00:14:38.300 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:38.300 "is_configured": true, 00:14:38.300 "data_offset": 2048, 00:14:38.300 "data_size": 63488 00:14:38.300 }, 00:14:38.300 { 00:14:38.300 "name": "pt2", 00:14:38.300 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:38.300 "is_configured": true, 00:14:38.300 "data_offset": 2048, 00:14:38.300 
"data_size": 63488 00:14:38.300 }, 00:14:38.300 { 00:14:38.300 "name": "pt3", 00:14:38.300 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:38.300 "is_configured": true, 00:14:38.300 "data_offset": 2048, 00:14:38.300 "data_size": 63488 00:14:38.300 } 00:14:38.300 ] 00:14:38.300 } 00:14:38.300 } 00:14:38.300 }' 00:14:38.300 08:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:38.559 08:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:38.559 pt2 00:14:38.559 pt3' 00:14:38.559 08:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:38.559 08:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:38.559 08:27:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:38.559 08:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:38.559 "name": "pt1", 00:14:38.559 "aliases": [ 00:14:38.559 "00000000-0000-0000-0000-000000000001" 00:14:38.559 ], 00:14:38.559 "product_name": "passthru", 00:14:38.559 "block_size": 512, 00:14:38.559 "num_blocks": 65536, 00:14:38.559 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:38.559 "assigned_rate_limits": { 00:14:38.559 "rw_ios_per_sec": 0, 00:14:38.559 "rw_mbytes_per_sec": 0, 00:14:38.559 "r_mbytes_per_sec": 0, 00:14:38.559 "w_mbytes_per_sec": 0 00:14:38.559 }, 00:14:38.559 "claimed": true, 00:14:38.559 "claim_type": "exclusive_write", 00:14:38.559 "zoned": false, 00:14:38.559 "supported_io_types": { 00:14:38.559 "read": true, 00:14:38.559 "write": true, 00:14:38.559 "unmap": true, 00:14:38.559 "flush": true, 00:14:38.559 "reset": true, 00:14:38.559 "nvme_admin": false, 00:14:38.559 "nvme_io": false, 00:14:38.559 "nvme_io_md": false, 
00:14:38.559 "write_zeroes": true, 00:14:38.559 "zcopy": true, 00:14:38.559 "get_zone_info": false, 00:14:38.559 "zone_management": false, 00:14:38.559 "zone_append": false, 00:14:38.559 "compare": false, 00:14:38.559 "compare_and_write": false, 00:14:38.559 "abort": true, 00:14:38.559 "seek_hole": false, 00:14:38.559 "seek_data": false, 00:14:38.559 "copy": true, 00:14:38.559 "nvme_iov_md": false 00:14:38.559 }, 00:14:38.559 "memory_domains": [ 00:14:38.559 { 00:14:38.559 "dma_device_id": "system", 00:14:38.559 "dma_device_type": 1 00:14:38.559 }, 00:14:38.559 { 00:14:38.559 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:38.559 "dma_device_type": 2 00:14:38.559 } 00:14:38.559 ], 00:14:38.559 "driver_specific": { 00:14:38.559 "passthru": { 00:14:38.559 "name": "pt1", 00:14:38.559 "base_bdev_name": "malloc1" 00:14:38.559 } 00:14:38.559 } 00:14:38.559 }' 00:14:38.559 08:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:38.559 08:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:38.817 08:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:38.817 08:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:38.817 08:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:38.817 08:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:38.817 08:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:38.817 08:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:38.817 08:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:38.818 08:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:38.818 08:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:38.818 08:27:51 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:38.818 08:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:38.818 08:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:38.818 08:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:39.076 08:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:39.076 "name": "pt2", 00:14:39.076 "aliases": [ 00:14:39.076 "00000000-0000-0000-0000-000000000002" 00:14:39.076 ], 00:14:39.076 "product_name": "passthru", 00:14:39.076 "block_size": 512, 00:14:39.076 "num_blocks": 65536, 00:14:39.076 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:39.076 "assigned_rate_limits": { 00:14:39.076 "rw_ios_per_sec": 0, 00:14:39.076 "rw_mbytes_per_sec": 0, 00:14:39.076 "r_mbytes_per_sec": 0, 00:14:39.076 "w_mbytes_per_sec": 0 00:14:39.076 }, 00:14:39.076 "claimed": true, 00:14:39.076 "claim_type": "exclusive_write", 00:14:39.076 "zoned": false, 00:14:39.076 "supported_io_types": { 00:14:39.076 "read": true, 00:14:39.076 "write": true, 00:14:39.076 "unmap": true, 00:14:39.076 "flush": true, 00:14:39.076 "reset": true, 00:14:39.076 "nvme_admin": false, 00:14:39.076 "nvme_io": false, 00:14:39.076 "nvme_io_md": false, 00:14:39.076 "write_zeroes": true, 00:14:39.076 "zcopy": true, 00:14:39.076 "get_zone_info": false, 00:14:39.076 "zone_management": false, 00:14:39.076 "zone_append": false, 00:14:39.076 "compare": false, 00:14:39.076 "compare_and_write": false, 00:14:39.076 "abort": true, 00:14:39.076 "seek_hole": false, 00:14:39.076 "seek_data": false, 00:14:39.076 "copy": true, 00:14:39.076 "nvme_iov_md": false 00:14:39.076 }, 00:14:39.076 "memory_domains": [ 00:14:39.076 { 00:14:39.076 "dma_device_id": "system", 00:14:39.076 "dma_device_type": 1 00:14:39.076 
}, 00:14:39.076 { 00:14:39.076 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:39.076 "dma_device_type": 2 00:14:39.076 } 00:14:39.076 ], 00:14:39.076 "driver_specific": { 00:14:39.076 "passthru": { 00:14:39.076 "name": "pt2", 00:14:39.076 "base_bdev_name": "malloc2" 00:14:39.076 } 00:14:39.076 } 00:14:39.076 }' 00:14:39.076 08:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:39.076 08:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:39.076 08:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:39.076 08:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:39.335 08:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:39.335 08:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:39.335 08:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:39.335 08:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:39.335 08:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:39.335 08:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:39.335 08:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:39.335 08:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:39.335 08:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:39.335 08:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:39.335 08:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:39.594 08:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 
00:14:39.594 "name": "pt3", 00:14:39.594 "aliases": [ 00:14:39.594 "00000000-0000-0000-0000-000000000003" 00:14:39.594 ], 00:14:39.594 "product_name": "passthru", 00:14:39.594 "block_size": 512, 00:14:39.594 "num_blocks": 65536, 00:14:39.594 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:39.594 "assigned_rate_limits": { 00:14:39.594 "rw_ios_per_sec": 0, 00:14:39.594 "rw_mbytes_per_sec": 0, 00:14:39.594 "r_mbytes_per_sec": 0, 00:14:39.594 "w_mbytes_per_sec": 0 00:14:39.594 }, 00:14:39.594 "claimed": true, 00:14:39.594 "claim_type": "exclusive_write", 00:14:39.594 "zoned": false, 00:14:39.594 "supported_io_types": { 00:14:39.594 "read": true, 00:14:39.594 "write": true, 00:14:39.594 "unmap": true, 00:14:39.594 "flush": true, 00:14:39.594 "reset": true, 00:14:39.594 "nvme_admin": false, 00:14:39.594 "nvme_io": false, 00:14:39.594 "nvme_io_md": false, 00:14:39.594 "write_zeroes": true, 00:14:39.594 "zcopy": true, 00:14:39.594 "get_zone_info": false, 00:14:39.594 "zone_management": false, 00:14:39.594 "zone_append": false, 00:14:39.594 "compare": false, 00:14:39.594 "compare_and_write": false, 00:14:39.594 "abort": true, 00:14:39.594 "seek_hole": false, 00:14:39.594 "seek_data": false, 00:14:39.594 "copy": true, 00:14:39.594 "nvme_iov_md": false 00:14:39.594 }, 00:14:39.594 "memory_domains": [ 00:14:39.594 { 00:14:39.594 "dma_device_id": "system", 00:14:39.594 "dma_device_type": 1 00:14:39.594 }, 00:14:39.594 { 00:14:39.594 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:39.594 "dma_device_type": 2 00:14:39.594 } 00:14:39.594 ], 00:14:39.594 "driver_specific": { 00:14:39.594 "passthru": { 00:14:39.594 "name": "pt3", 00:14:39.594 "base_bdev_name": "malloc3" 00:14:39.594 } 00:14:39.594 } 00:14:39.594 }' 00:14:39.594 08:27:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:39.594 08:27:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:39.594 08:27:52 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:39.594 08:27:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:39.594 08:27:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:39.853 08:27:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:39.853 08:27:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:39.853 08:27:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:39.853 08:27:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:39.853 08:27:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:39.853 08:27:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:39.853 08:27:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:39.853 08:27:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:39.853 08:27:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:14:40.112 [2024-07-23 08:27:52.453289] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:40.112 08:27:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' bf3b9e45-0975-4669-928d-1dda3561662b '!=' bf3b9e45-0975-4669-928d-1dda3561662b ']' 00:14:40.112 08:27:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:14:40.112 08:27:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:40.112 08:27:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:40.112 08:27:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1433787 00:14:40.113 08:27:52 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@948 -- # '[' -z 1433787 ']' 00:14:40.113 08:27:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1433787 00:14:40.113 08:27:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:14:40.113 08:27:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:40.113 08:27:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1433787 00:14:40.113 08:27:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:40.113 08:27:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:40.113 08:27:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1433787' 00:14:40.113 killing process with pid 1433787 00:14:40.113 08:27:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1433787 00:14:40.113 08:27:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1433787 00:14:40.113 [2024-07-23 08:27:52.496687] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:40.113 [2024-07-23 08:27:52.496776] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:40.113 [2024-07-23 08:27:52.496833] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:40.113 [2024-07-23 08:27:52.496846] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000037880 name raid_bdev1, state offline 00:14:40.372 [2024-07-23 08:27:52.741369] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:41.751 08:27:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:14:41.751 00:14:41.751 real 0m12.040s 00:14:41.751 user 0m20.680s 00:14:41.751 sys 0m1.747s 00:14:41.751 08:27:53 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:14:41.751 08:27:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:41.751 ************************************ 00:14:41.751 END TEST raid_superblock_test 00:14:41.751 ************************************ 00:14:41.751 08:27:54 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:41.751 08:27:54 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 3 read 00:14:41.751 08:27:54 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:41.751 08:27:54 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:41.751 08:27:54 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:41.751 ************************************ 00:14:41.751 START TEST raid_read_error_test 00:14:41.751 ************************************ 00:14:41.751 08:27:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 read 00:14:41.751 08:27:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:14:41.751 08:27:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:14:41.752 08:27:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:14:41.752 08:27:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:14:41.752 08:27:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:41.752 08:27:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:14:41.752 08:27:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:41.752 08:27:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:41.752 08:27:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:14:41.752 08:27:54 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:41.752 08:27:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:41.752 08:27:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:14:41.752 08:27:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:41.752 08:27:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:41.752 08:27:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:41.752 08:27:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:14:41.752 08:27:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:14:41.752 08:27:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:14:41.752 08:27:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:14:41.752 08:27:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:14:41.752 08:27:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:14:41.752 08:27:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:14:41.752 08:27:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:14:41.752 08:27:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:14:41.752 08:27:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:14:41.752 08:27:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.3rzljPgbF5 00:14:41.752 08:27:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1436317 00:14:41.752 08:27:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1436317 /var/tmp/spdk-raid.sock 00:14:41.752 08:27:54 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1436317 ']' 00:14:41.752 08:27:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:41.752 08:27:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:41.752 08:27:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:41.752 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:41.752 08:27:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:41.752 08:27:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:41.752 08:27:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:41.752 [2024-07-23 08:27:54.128688] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:14:41.752 [2024-07-23 08:27:54.128798] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1436317 ] 00:14:41.752 [2024-07-23 08:27:54.255860] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:42.011 [2024-07-23 08:27:54.479177] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:42.270 [2024-07-23 08:27:54.751451] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:42.270 [2024-07-23 08:27:54.751480] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:42.528 08:27:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:42.528 08:27:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:14:42.528 08:27:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:42.528 08:27:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:42.787 BaseBdev1_malloc 00:14:42.787 08:27:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:42.787 true 00:14:42.787 08:27:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:43.045 [2024-07-23 08:27:55.424276] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:43.045 [2024-07-23 08:27:55.424330] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:14:43.045 [2024-07-23 08:27:55.424367] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034b80 00:14:43.045 [2024-07-23 08:27:55.424378] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:43.046 [2024-07-23 08:27:55.426407] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:43.046 [2024-07-23 08:27:55.426438] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:14:43.046 BaseBdev1 00:14:43.046 08:27:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:43.046 08:27:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:43.304 BaseBdev2_malloc 00:14:43.304 08:27:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:43.304 true 00:14:43.304 08:27:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:43.564 [2024-07-23 08:27:55.969552] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:43.564 [2024-07-23 08:27:55.969605] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:43.564 [2024-07-23 08:27:55.969633] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035a80 00:14:43.564 [2024-07-23 08:27:55.969647] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:43.564 [2024-07-23 08:27:55.971697] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:43.564 [2024-07-23 
08:27:55.971727] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:43.564 BaseBdev2 00:14:43.564 08:27:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:43.564 08:27:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:14:43.823 BaseBdev3_malloc 00:14:43.823 08:27:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:14:44.083 true 00:14:44.083 08:27:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:14:44.083 [2024-07-23 08:27:56.513120] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:14:44.083 [2024-07-23 08:27:56.513176] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:44.083 [2024-07-23 08:27:56.513216] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036980 00:14:44.083 [2024-07-23 08:27:56.513228] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:44.083 [2024-07-23 08:27:56.515294] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:44.083 [2024-07-23 08:27:56.515324] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:14:44.083 BaseBdev3 00:14:44.083 08:27:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:14:44.342 [2024-07-23 
08:27:56.685621] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:44.342 [2024-07-23 08:27:56.687283] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:44.342 [2024-07-23 08:27:56.687357] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:44.342 [2024-07-23 08:27:56.687585] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000036f80 00:14:44.342 [2024-07-23 08:27:56.687598] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:44.342 [2024-07-23 08:27:56.687898] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bec0 00:14:44.342 [2024-07-23 08:27:56.688099] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000036f80 00:14:44.342 [2024-07-23 08:27:56.688113] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000036f80 00:14:44.342 [2024-07-23 08:27:56.688284] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:44.342 08:27:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:44.342 08:27:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:44.342 08:27:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:44.342 08:27:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:44.342 08:27:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:44.342 08:27:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:44.342 08:27:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:44.342 08:27:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- 
# local num_base_bdevs 00:14:44.342 08:27:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:44.342 08:27:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:44.342 08:27:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:44.342 08:27:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:44.601 08:27:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:44.601 "name": "raid_bdev1", 00:14:44.601 "uuid": "753c7fff-5eb6-4097-8fa0-b1ad9e049949", 00:14:44.601 "strip_size_kb": 64, 00:14:44.601 "state": "online", 00:14:44.601 "raid_level": "raid0", 00:14:44.601 "superblock": true, 00:14:44.601 "num_base_bdevs": 3, 00:14:44.601 "num_base_bdevs_discovered": 3, 00:14:44.601 "num_base_bdevs_operational": 3, 00:14:44.601 "base_bdevs_list": [ 00:14:44.602 { 00:14:44.602 "name": "BaseBdev1", 00:14:44.602 "uuid": "f58a0298-8c78-50d8-8e53-2e19f06db002", 00:14:44.602 "is_configured": true, 00:14:44.602 "data_offset": 2048, 00:14:44.602 "data_size": 63488 00:14:44.602 }, 00:14:44.602 { 00:14:44.602 "name": "BaseBdev2", 00:14:44.602 "uuid": "c788eaa4-a650-59e3-bb53-4889943617f2", 00:14:44.602 "is_configured": true, 00:14:44.602 "data_offset": 2048, 00:14:44.602 "data_size": 63488 00:14:44.602 }, 00:14:44.602 { 00:14:44.602 "name": "BaseBdev3", 00:14:44.602 "uuid": "e219c2c8-d9af-5224-b392-590b7d64ae0d", 00:14:44.602 "is_configured": true, 00:14:44.602 "data_offset": 2048, 00:14:44.602 "data_size": 63488 00:14:44.602 } 00:14:44.602 ] 00:14:44.602 }' 00:14:44.602 08:27:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:44.602 08:27:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:44.861 08:27:57 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:14:44.861 08:27:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:45.120 [2024-07-23 08:27:57.420873] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c060 00:14:46.057 08:27:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:14:46.057 08:27:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:14:46.057 08:27:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:14:46.057 08:27:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:14:46.057 08:27:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:46.057 08:27:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:46.057 08:27:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:46.057 08:27:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:46.057 08:27:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:46.057 08:27:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:46.057 08:27:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:46.057 08:27:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:46.057 08:27:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:46.057 08:27:58 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:46.057 08:27:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:46.057 08:27:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:46.317 08:27:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:46.317 "name": "raid_bdev1", 00:14:46.317 "uuid": "753c7fff-5eb6-4097-8fa0-b1ad9e049949", 00:14:46.317 "strip_size_kb": 64, 00:14:46.317 "state": "online", 00:14:46.317 "raid_level": "raid0", 00:14:46.317 "superblock": true, 00:14:46.317 "num_base_bdevs": 3, 00:14:46.317 "num_base_bdevs_discovered": 3, 00:14:46.317 "num_base_bdevs_operational": 3, 00:14:46.317 "base_bdevs_list": [ 00:14:46.317 { 00:14:46.317 "name": "BaseBdev1", 00:14:46.317 "uuid": "f58a0298-8c78-50d8-8e53-2e19f06db002", 00:14:46.317 "is_configured": true, 00:14:46.317 "data_offset": 2048, 00:14:46.317 "data_size": 63488 00:14:46.317 }, 00:14:46.317 { 00:14:46.317 "name": "BaseBdev2", 00:14:46.317 "uuid": "c788eaa4-a650-59e3-bb53-4889943617f2", 00:14:46.317 "is_configured": true, 00:14:46.317 "data_offset": 2048, 00:14:46.317 "data_size": 63488 00:14:46.317 }, 00:14:46.317 { 00:14:46.317 "name": "BaseBdev3", 00:14:46.317 "uuid": "e219c2c8-d9af-5224-b392-590b7d64ae0d", 00:14:46.317 "is_configured": true, 00:14:46.317 "data_offset": 2048, 00:14:46.317 "data_size": 63488 00:14:46.317 } 00:14:46.317 ] 00:14:46.317 }' 00:14:46.317 08:27:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:46.317 08:27:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:46.885 08:27:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 
00:14:46.885 [2024-07-23 08:27:59.338495] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:46.885 [2024-07-23 08:27:59.338529] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:46.885 [2024-07-23 08:27:59.341479] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:46.885 [2024-07-23 08:27:59.341525] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:46.885 [2024-07-23 08:27:59.341572] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:46.885 [2024-07-23 08:27:59.341585] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036f80 name raid_bdev1, state offline 00:14:46.885 0 00:14:46.885 08:27:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1436317 00:14:46.885 08:27:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1436317 ']' 00:14:46.885 08:27:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1436317 00:14:46.885 08:27:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:14:46.886 08:27:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:46.886 08:27:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1436317 00:14:47.194 08:27:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:47.195 08:27:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:47.195 08:27:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1436317' 00:14:47.195 killing process with pid 1436317 00:14:47.195 08:27:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1436317 00:14:47.195 08:27:59 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1436317 00:14:47.195 [2024-07-23 08:27:59.409367] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:47.195 [2024-07-23 08:27:59.583874] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:48.572 08:28:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.3rzljPgbF5 00:14:48.572 08:28:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:14:48.572 08:28:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:14:48.572 08:28:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:14:48.572 08:28:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:14:48.572 08:28:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:48.572 08:28:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:48.572 08:28:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:14:48.572 00:14:48.572 real 0m6.888s 00:14:48.572 user 0m9.761s 00:14:48.572 sys 0m0.957s 00:14:48.572 08:28:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:48.572 08:28:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:48.572 ************************************ 00:14:48.572 END TEST raid_read_error_test 00:14:48.572 ************************************ 00:14:48.572 08:28:00 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:48.572 08:28:00 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 3 write 00:14:48.572 08:28:00 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:48.572 08:28:00 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:48.572 08:28:00 bdev_raid -- common/autotest_common.sh@10 -- # set 
+x 00:14:48.572 ************************************ 00:14:48.572 START TEST raid_write_error_test 00:14:48.572 ************************************ 00:14:48.572 08:28:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 write 00:14:48.572 08:28:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:14:48.572 08:28:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:14:48.572 08:28:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:14:48.572 08:28:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:14:48.572 08:28:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:48.572 08:28:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:14:48.572 08:28:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:48.572 08:28:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:48.572 08:28:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:14:48.572 08:28:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:48.572 08:28:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:48.572 08:28:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:14:48.572 08:28:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:48.572 08:28:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:48.572 08:28:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:48.572 08:28:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:14:48.572 08:28:00 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:14:48.572 08:28:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:14:48.572 08:28:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:14:48.572 08:28:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:14:48.572 08:28:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:14:48.572 08:28:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:14:48.572 08:28:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:14:48.572 08:28:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:14:48.572 08:28:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:14:48.573 08:28:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.DMtHKdtkQn 00:14:48.573 08:28:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1437693 00:14:48.573 08:28:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1437693 /var/tmp/spdk-raid.sock 00:14:48.573 08:28:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1437693 ']' 00:14:48.573 08:28:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:48.573 08:28:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:48.573 08:28:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:48.573 08:28:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to 
start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:48.573 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:48.573 08:28:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:48.573 08:28:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:48.573 [2024-07-23 08:28:01.064757] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:14:48.573 [2024-07-23 08:28:01.064847] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1437693 ] 00:14:48.831 [2024-07-23 08:28:01.188795] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:49.090 [2024-07-23 08:28:01.405785] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:49.349 [2024-07-23 08:28:01.670404] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:49.349 [2024-07-23 08:28:01.670433] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:49.349 08:28:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:49.349 08:28:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:14:49.349 08:28:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:49.349 08:28:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:49.608 BaseBdev1_malloc 00:14:49.608 08:28:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_error_create BaseBdev1_malloc 00:14:49.867 true 00:14:49.867 08:28:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:49.867 [2024-07-23 08:28:02.358307] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:49.867 [2024-07-23 08:28:02.358361] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:49.867 [2024-07-23 08:28:02.358398] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034b80 00:14:49.867 [2024-07-23 08:28:02.358409] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:49.867 [2024-07-23 08:28:02.360434] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:49.867 [2024-07-23 08:28:02.360473] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:14:49.867 BaseBdev1 00:14:49.867 08:28:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:49.867 08:28:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:50.126 BaseBdev2_malloc 00:14:50.126 08:28:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:50.385 true 00:14:50.385 08:28:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:50.385 [2024-07-23 08:28:02.889716] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 
00:14:50.385 [2024-07-23 08:28:02.889768] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:50.385 [2024-07-23 08:28:02.889802] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035a80 00:14:50.385 [2024-07-23 08:28:02.889816] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:50.385 [2024-07-23 08:28:02.891789] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:50.385 [2024-07-23 08:28:02.891818] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:50.385 BaseBdev2 00:14:50.385 08:28:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:50.385 08:28:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:14:50.644 BaseBdev3_malloc 00:14:50.644 08:28:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:14:50.903 true 00:14:50.903 08:28:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:14:50.903 [2024-07-23 08:28:03.411359] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:14:50.903 [2024-07-23 08:28:03.411412] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:50.903 [2024-07-23 08:28:03.411432] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036980 00:14:50.903 [2024-07-23 08:28:03.411444] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:50.903 [2024-07-23 08:28:03.413411] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:50.903 [2024-07-23 08:28:03.413441] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:14:50.903 BaseBdev3 00:14:51.162 08:28:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:14:51.162 [2024-07-23 08:28:03.579846] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:51.162 [2024-07-23 08:28:03.581457] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:51.162 [2024-07-23 08:28:03.581528] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:51.162 [2024-07-23 08:28:03.581742] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000036f80 00:14:51.162 [2024-07-23 08:28:03.581755] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:51.162 [2024-07-23 08:28:03.581997] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bec0 00:14:51.162 [2024-07-23 08:28:03.582185] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000036f80 00:14:51.162 [2024-07-23 08:28:03.582199] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000036f80 00:14:51.162 [2024-07-23 08:28:03.582352] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:51.162 08:28:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:51.162 08:28:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:51.162 08:28:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:14:51.162 08:28:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:51.162 08:28:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:51.162 08:28:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:51.162 08:28:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:51.162 08:28:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:51.162 08:28:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:51.162 08:28:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:51.162 08:28:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:51.162 08:28:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:51.421 08:28:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:51.421 "name": "raid_bdev1", 00:14:51.421 "uuid": "497ddb8b-7f79-40a8-ba2d-b2b9c808fbcd", 00:14:51.421 "strip_size_kb": 64, 00:14:51.421 "state": "online", 00:14:51.421 "raid_level": "raid0", 00:14:51.421 "superblock": true, 00:14:51.421 "num_base_bdevs": 3, 00:14:51.421 "num_base_bdevs_discovered": 3, 00:14:51.421 "num_base_bdevs_operational": 3, 00:14:51.421 "base_bdevs_list": [ 00:14:51.421 { 00:14:51.421 "name": "BaseBdev1", 00:14:51.421 "uuid": "591a1964-424b-5a21-afcf-5d9dc99a8590", 00:14:51.421 "is_configured": true, 00:14:51.421 "data_offset": 2048, 00:14:51.421 "data_size": 63488 00:14:51.421 }, 00:14:51.421 { 00:14:51.421 "name": "BaseBdev2", 00:14:51.421 "uuid": "657a5edd-850f-5039-b609-68ac8a7cc2ed", 00:14:51.421 "is_configured": true, 00:14:51.421 "data_offset": 2048, 
00:14:51.421 "data_size": 63488 00:14:51.421 }, 00:14:51.421 { 00:14:51.421 "name": "BaseBdev3", 00:14:51.421 "uuid": "ccb12ce6-48dd-5b1b-88cf-40456f83f9cb", 00:14:51.421 "is_configured": true, 00:14:51.421 "data_offset": 2048, 00:14:51.421 "data_size": 63488 00:14:51.421 } 00:14:51.421 ] 00:14:51.421 }' 00:14:51.421 08:28:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:51.421 08:28:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:51.989 08:28:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:51.989 08:28:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:14:51.989 [2024-07-23 08:28:04.307077] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c060 00:14:52.926 08:28:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:14:52.926 08:28:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:14:52.926 08:28:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:14:52.926 08:28:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:14:52.926 08:28:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:52.926 08:28:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:52.926 08:28:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:52.926 08:28:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:52.926 08:28:05 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:52.926 08:28:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:52.926 08:28:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:52.926 08:28:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:52.926 08:28:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:52.926 08:28:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:52.926 08:28:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:52.926 08:28:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:53.184 08:28:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:53.184 "name": "raid_bdev1", 00:14:53.184 "uuid": "497ddb8b-7f79-40a8-ba2d-b2b9c808fbcd", 00:14:53.184 "strip_size_kb": 64, 00:14:53.184 "state": "online", 00:14:53.184 "raid_level": "raid0", 00:14:53.184 "superblock": true, 00:14:53.184 "num_base_bdevs": 3, 00:14:53.184 "num_base_bdevs_discovered": 3, 00:14:53.184 "num_base_bdevs_operational": 3, 00:14:53.184 "base_bdevs_list": [ 00:14:53.184 { 00:14:53.184 "name": "BaseBdev1", 00:14:53.184 "uuid": "591a1964-424b-5a21-afcf-5d9dc99a8590", 00:14:53.184 "is_configured": true, 00:14:53.184 "data_offset": 2048, 00:14:53.184 "data_size": 63488 00:14:53.184 }, 00:14:53.184 { 00:14:53.184 "name": "BaseBdev2", 00:14:53.184 "uuid": "657a5edd-850f-5039-b609-68ac8a7cc2ed", 00:14:53.184 "is_configured": true, 00:14:53.184 "data_offset": 2048, 00:14:53.184 "data_size": 63488 00:14:53.184 }, 00:14:53.184 { 00:14:53.184 "name": "BaseBdev3", 00:14:53.184 "uuid": 
"ccb12ce6-48dd-5b1b-88cf-40456f83f9cb", 00:14:53.184 "is_configured": true, 00:14:53.184 "data_offset": 2048, 00:14:53.184 "data_size": 63488 00:14:53.184 } 00:14:53.184 ] 00:14:53.184 }' 00:14:53.184 08:28:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:53.184 08:28:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:53.751 08:28:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:54.009 [2024-07-23 08:28:06.318010] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:54.009 [2024-07-23 08:28:06.318044] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:54.009 [2024-07-23 08:28:06.320797] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:54.009 [2024-07-23 08:28:06.320839] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:54.009 [2024-07-23 08:28:06.320879] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:54.009 [2024-07-23 08:28:06.320890] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036f80 name raid_bdev1, state offline 00:14:54.009 0 00:14:54.009 08:28:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1437693 00:14:54.009 08:28:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1437693 ']' 00:14:54.009 08:28:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1437693 00:14:54.009 08:28:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:14:54.009 08:28:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:54.009 08:28:06 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1437693 00:14:54.009 08:28:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:54.009 08:28:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:54.009 08:28:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1437693' 00:14:54.009 killing process with pid 1437693 00:14:54.009 08:28:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1437693 00:14:54.009 [2024-07-23 08:28:06.370871] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:54.009 08:28:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1437693 00:14:54.267 [2024-07-23 08:28:06.539939] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:55.644 08:28:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.DMtHKdtkQn 00:14:55.644 08:28:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:14:55.644 08:28:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:14:55.644 08:28:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.50 00:14:55.644 08:28:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:14:55.644 08:28:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:55.644 08:28:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:55.644 08:28:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.50 != \0\.\0\0 ]] 00:14:55.644 00:14:55.644 real 0m6.898s 00:14:55.644 user 0m9.857s 00:14:55.644 sys 0m0.881s 00:14:55.644 08:28:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:55.644 08:28:07 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:14:55.644 ************************************ 00:14:55.644 END TEST raid_write_error_test 00:14:55.644 ************************************ 00:14:55.644 08:28:07 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:55.644 08:28:07 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:14:55.644 08:28:07 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 3 false 00:14:55.644 08:28:07 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:55.644 08:28:07 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:55.644 08:28:07 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:55.644 ************************************ 00:14:55.644 START TEST raid_state_function_test 00:14:55.644 ************************************ 00:14:55.644 08:28:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 false 00:14:55.644 08:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:14:55.644 08:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:14:55.644 08:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:14:55.644 08:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:14:55.644 08:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:14:55.644 08:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:55.644 08:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:14:55.644 08:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:55.644 08:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:55.644 
08:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:14:55.644 08:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:55.644 08:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:55.644 08:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:14:55.644 08:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:55.644 08:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:55.644 08:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:55.644 08:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:14:55.644 08:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:14:55.644 08:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:14:55.644 08:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:14:55.644 08:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:14:55.644 08:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:14:55.644 08:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:14:55.644 08:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:14:55.644 08:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:14:55.644 08:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:14:55.644 08:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1439059 00:14:55.644 08:28:07 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1439059' 00:14:55.644 Process raid pid: 1439059 00:14:55.644 08:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:55.644 08:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1439059 /var/tmp/spdk-raid.sock 00:14:55.644 08:28:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1439059 ']' 00:14:55.644 08:28:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:55.644 08:28:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:55.644 08:28:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:55.644 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:55.644 08:28:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:55.644 08:28:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:55.644 [2024-07-23 08:28:08.040373] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:14:55.644 [2024-07-23 08:28:08.040456] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:55.903 [2024-07-23 08:28:08.164474] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:55.903 [2024-07-23 08:28:08.381804] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:56.163 [2024-07-23 08:28:08.645380] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:56.163 [2024-07-23 08:28:08.645414] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:56.421 08:28:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:56.421 08:28:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:14:56.421 08:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:56.680 [2024-07-23 08:28:08.957280] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:56.680 [2024-07-23 08:28:08.957322] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:56.680 [2024-07-23 08:28:08.957332] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:56.680 [2024-07-23 08:28:08.957343] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:56.680 [2024-07-23 08:28:08.957350] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:56.680 [2024-07-23 08:28:08.957362] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:56.680 
08:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:56.680 08:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:56.680 08:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:56.680 08:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:56.680 08:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:56.680 08:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:56.680 08:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:56.680 08:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:56.680 08:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:56.680 08:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:56.680 08:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:56.680 08:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:56.680 08:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:56.680 "name": "Existed_Raid", 00:14:56.680 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:56.680 "strip_size_kb": 64, 00:14:56.680 "state": "configuring", 00:14:56.680 "raid_level": "concat", 00:14:56.680 "superblock": false, 00:14:56.680 "num_base_bdevs": 3, 00:14:56.680 "num_base_bdevs_discovered": 0, 00:14:56.680 "num_base_bdevs_operational": 3, 00:14:56.680 "base_bdevs_list": [ 00:14:56.680 { 
00:14:56.680 "name": "BaseBdev1", 00:14:56.680 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:56.680 "is_configured": false, 00:14:56.680 "data_offset": 0, 00:14:56.680 "data_size": 0 00:14:56.680 }, 00:14:56.680 { 00:14:56.680 "name": "BaseBdev2", 00:14:56.680 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:56.680 "is_configured": false, 00:14:56.680 "data_offset": 0, 00:14:56.680 "data_size": 0 00:14:56.680 }, 00:14:56.680 { 00:14:56.680 "name": "BaseBdev3", 00:14:56.680 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:56.680 "is_configured": false, 00:14:56.680 "data_offset": 0, 00:14:56.680 "data_size": 0 00:14:56.680 } 00:14:56.680 ] 00:14:56.680 }' 00:14:56.680 08:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:56.680 08:28:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:57.247 08:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:57.505 [2024-07-23 08:28:09.779343] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:57.506 [2024-07-23 08:28:09.779376] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034580 name Existed_Raid, state configuring 00:14:57.506 08:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:57.506 [2024-07-23 08:28:09.931752] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:57.506 [2024-07-23 08:28:09.931786] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:57.506 [2024-07-23 08:28:09.931794] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev 
with name: BaseBdev2 00:14:57.506 [2024-07-23 08:28:09.931805] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:57.506 [2024-07-23 08:28:09.931812] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:57.506 [2024-07-23 08:28:09.931823] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:57.506 08:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:57.765 [2024-07-23 08:28:10.130728] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:57.765 BaseBdev1 00:14:57.765 08:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:57.765 08:28:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:57.765 08:28:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:57.765 08:28:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:57.765 08:28:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:57.765 08:28:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:57.765 08:28:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:58.024 08:28:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:58.024 [ 00:14:58.024 { 00:14:58.024 "name": "BaseBdev1", 00:14:58.024 "aliases": [ 00:14:58.024 
"097741f5-95fd-4ff7-9898-a7a953e73272" 00:14:58.024 ], 00:14:58.024 "product_name": "Malloc disk", 00:14:58.024 "block_size": 512, 00:14:58.024 "num_blocks": 65536, 00:14:58.024 "uuid": "097741f5-95fd-4ff7-9898-a7a953e73272", 00:14:58.024 "assigned_rate_limits": { 00:14:58.024 "rw_ios_per_sec": 0, 00:14:58.024 "rw_mbytes_per_sec": 0, 00:14:58.024 "r_mbytes_per_sec": 0, 00:14:58.024 "w_mbytes_per_sec": 0 00:14:58.024 }, 00:14:58.024 "claimed": true, 00:14:58.024 "claim_type": "exclusive_write", 00:14:58.024 "zoned": false, 00:14:58.024 "supported_io_types": { 00:14:58.024 "read": true, 00:14:58.024 "write": true, 00:14:58.024 "unmap": true, 00:14:58.024 "flush": true, 00:14:58.024 "reset": true, 00:14:58.024 "nvme_admin": false, 00:14:58.024 "nvme_io": false, 00:14:58.024 "nvme_io_md": false, 00:14:58.024 "write_zeroes": true, 00:14:58.024 "zcopy": true, 00:14:58.024 "get_zone_info": false, 00:14:58.024 "zone_management": false, 00:14:58.024 "zone_append": false, 00:14:58.024 "compare": false, 00:14:58.024 "compare_and_write": false, 00:14:58.024 "abort": true, 00:14:58.024 "seek_hole": false, 00:14:58.024 "seek_data": false, 00:14:58.024 "copy": true, 00:14:58.024 "nvme_iov_md": false 00:14:58.024 }, 00:14:58.024 "memory_domains": [ 00:14:58.024 { 00:14:58.024 "dma_device_id": "system", 00:14:58.024 "dma_device_type": 1 00:14:58.024 }, 00:14:58.024 { 00:14:58.024 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:58.024 "dma_device_type": 2 00:14:58.024 } 00:14:58.024 ], 00:14:58.024 "driver_specific": {} 00:14:58.024 } 00:14:58.024 ] 00:14:58.024 08:28:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:58.024 08:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:58.024 08:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:58.024 08:28:10 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:58.024 08:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:58.024 08:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:58.024 08:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:58.024 08:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:58.024 08:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:58.024 08:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:58.024 08:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:58.024 08:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:58.024 08:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:58.283 08:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:58.283 "name": "Existed_Raid", 00:14:58.283 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:58.283 "strip_size_kb": 64, 00:14:58.283 "state": "configuring", 00:14:58.283 "raid_level": "concat", 00:14:58.283 "superblock": false, 00:14:58.283 "num_base_bdevs": 3, 00:14:58.283 "num_base_bdevs_discovered": 1, 00:14:58.283 "num_base_bdevs_operational": 3, 00:14:58.283 "base_bdevs_list": [ 00:14:58.283 { 00:14:58.283 "name": "BaseBdev1", 00:14:58.283 "uuid": "097741f5-95fd-4ff7-9898-a7a953e73272", 00:14:58.283 "is_configured": true, 00:14:58.283 "data_offset": 0, 00:14:58.283 "data_size": 65536 00:14:58.283 }, 00:14:58.283 { 00:14:58.283 "name": "BaseBdev2", 00:14:58.283 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:14:58.283 "is_configured": false, 00:14:58.283 "data_offset": 0, 00:14:58.283 "data_size": 0 00:14:58.283 }, 00:14:58.283 { 00:14:58.283 "name": "BaseBdev3", 00:14:58.283 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:58.283 "is_configured": false, 00:14:58.283 "data_offset": 0, 00:14:58.283 "data_size": 0 00:14:58.283 } 00:14:58.283 ] 00:14:58.283 }' 00:14:58.283 08:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:58.283 08:28:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:58.851 08:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:58.851 [2024-07-23 08:28:11.297847] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:58.851 [2024-07-23 08:28:11.297894] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034880 name Existed_Raid, state configuring 00:14:58.851 08:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:59.109 [2024-07-23 08:28:11.466336] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:59.109 [2024-07-23 08:28:11.468030] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:59.109 [2024-07-23 08:28:11.468063] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:59.109 [2024-07-23 08:28:11.468073] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:59.109 [2024-07-23 08:28:11.468082] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:14:59.109 08:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:59.109 08:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:59.109 08:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:59.109 08:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:59.109 08:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:59.109 08:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:59.109 08:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:59.109 08:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:59.109 08:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:59.109 08:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:59.109 08:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:59.109 08:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:59.110 08:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:59.110 08:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:59.368 08:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:59.368 "name": "Existed_Raid", 00:14:59.368 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:59.368 "strip_size_kb": 64, 00:14:59.368 "state": "configuring", 00:14:59.368 
"raid_level": "concat", 00:14:59.368 "superblock": false, 00:14:59.368 "num_base_bdevs": 3, 00:14:59.368 "num_base_bdevs_discovered": 1, 00:14:59.368 "num_base_bdevs_operational": 3, 00:14:59.368 "base_bdevs_list": [ 00:14:59.368 { 00:14:59.368 "name": "BaseBdev1", 00:14:59.368 "uuid": "097741f5-95fd-4ff7-9898-a7a953e73272", 00:14:59.368 "is_configured": true, 00:14:59.368 "data_offset": 0, 00:14:59.368 "data_size": 65536 00:14:59.368 }, 00:14:59.368 { 00:14:59.368 "name": "BaseBdev2", 00:14:59.368 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:59.368 "is_configured": false, 00:14:59.368 "data_offset": 0, 00:14:59.368 "data_size": 0 00:14:59.368 }, 00:14:59.368 { 00:14:59.368 "name": "BaseBdev3", 00:14:59.368 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:59.368 "is_configured": false, 00:14:59.368 "data_offset": 0, 00:14:59.368 "data_size": 0 00:14:59.368 } 00:14:59.368 ] 00:14:59.368 }' 00:14:59.368 08:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:59.368 08:28:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:59.626 08:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:59.885 [2024-07-23 08:28:12.323178] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:59.885 BaseBdev2 00:14:59.885 08:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:59.885 08:28:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:59.885 08:28:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:59.885 08:28:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:59.885 08:28:12 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:59.885 08:28:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:59.885 08:28:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:00.143 08:28:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:00.402 [ 00:15:00.402 { 00:15:00.402 "name": "BaseBdev2", 00:15:00.402 "aliases": [ 00:15:00.402 "90055a88-efb2-4dd3-bfad-8bb5164e871e" 00:15:00.402 ], 00:15:00.402 "product_name": "Malloc disk", 00:15:00.402 "block_size": 512, 00:15:00.402 "num_blocks": 65536, 00:15:00.402 "uuid": "90055a88-efb2-4dd3-bfad-8bb5164e871e", 00:15:00.402 "assigned_rate_limits": { 00:15:00.402 "rw_ios_per_sec": 0, 00:15:00.402 "rw_mbytes_per_sec": 0, 00:15:00.402 "r_mbytes_per_sec": 0, 00:15:00.402 "w_mbytes_per_sec": 0 00:15:00.402 }, 00:15:00.402 "claimed": true, 00:15:00.402 "claim_type": "exclusive_write", 00:15:00.402 "zoned": false, 00:15:00.402 "supported_io_types": { 00:15:00.402 "read": true, 00:15:00.402 "write": true, 00:15:00.402 "unmap": true, 00:15:00.402 "flush": true, 00:15:00.402 "reset": true, 00:15:00.402 "nvme_admin": false, 00:15:00.402 "nvme_io": false, 00:15:00.402 "nvme_io_md": false, 00:15:00.402 "write_zeroes": true, 00:15:00.402 "zcopy": true, 00:15:00.402 "get_zone_info": false, 00:15:00.402 "zone_management": false, 00:15:00.402 "zone_append": false, 00:15:00.402 "compare": false, 00:15:00.402 "compare_and_write": false, 00:15:00.402 "abort": true, 00:15:00.402 "seek_hole": false, 00:15:00.402 "seek_data": false, 00:15:00.402 "copy": true, 00:15:00.402 "nvme_iov_md": false 00:15:00.402 }, 00:15:00.402 "memory_domains": [ 00:15:00.402 { 00:15:00.402 "dma_device_id": "system", 
00:15:00.402 "dma_device_type": 1 00:15:00.402 }, 00:15:00.402 { 00:15:00.402 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:00.402 "dma_device_type": 2 00:15:00.402 } 00:15:00.402 ], 00:15:00.402 "driver_specific": {} 00:15:00.402 } 00:15:00.402 ] 00:15:00.402 08:28:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:00.402 08:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:00.402 08:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:00.402 08:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:00.402 08:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:00.402 08:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:00.402 08:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:00.402 08:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:00.402 08:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:00.402 08:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:00.402 08:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:00.402 08:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:00.402 08:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:00.402 08:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:00.402 08:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:00.402 08:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:00.402 "name": "Existed_Raid", 00:15:00.403 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:00.403 "strip_size_kb": 64, 00:15:00.403 "state": "configuring", 00:15:00.403 "raid_level": "concat", 00:15:00.403 "superblock": false, 00:15:00.403 "num_base_bdevs": 3, 00:15:00.403 "num_base_bdevs_discovered": 2, 00:15:00.403 "num_base_bdevs_operational": 3, 00:15:00.403 "base_bdevs_list": [ 00:15:00.403 { 00:15:00.403 "name": "BaseBdev1", 00:15:00.403 "uuid": "097741f5-95fd-4ff7-9898-a7a953e73272", 00:15:00.403 "is_configured": true, 00:15:00.403 "data_offset": 0, 00:15:00.403 "data_size": 65536 00:15:00.403 }, 00:15:00.403 { 00:15:00.403 "name": "BaseBdev2", 00:15:00.403 "uuid": "90055a88-efb2-4dd3-bfad-8bb5164e871e", 00:15:00.403 "is_configured": true, 00:15:00.403 "data_offset": 0, 00:15:00.403 "data_size": 65536 00:15:00.403 }, 00:15:00.403 { 00:15:00.403 "name": "BaseBdev3", 00:15:00.403 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:00.403 "is_configured": false, 00:15:00.403 "data_offset": 0, 00:15:00.403 "data_size": 0 00:15:00.403 } 00:15:00.403 ] 00:15:00.403 }' 00:15:00.403 08:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:00.403 08:28:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:00.969 08:28:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:01.227 [2024-07-23 08:28:13.493582] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:01.227 [2024-07-23 08:28:13.493628] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000035180 
00:15:01.227 [2024-07-23 08:28:13.493639] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:15:01.227 [2024-07-23 08:28:13.493879] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bec0 00:15:01.227 [2024-07-23 08:28:13.494059] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000035180 00:15:01.227 [2024-07-23 08:28:13.494069] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000035180 00:15:01.227 [2024-07-23 08:28:13.494311] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:01.227 BaseBdev3 00:15:01.227 08:28:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:15:01.227 08:28:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:01.227 08:28:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:01.227 08:28:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:01.227 08:28:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:01.227 08:28:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:01.227 08:28:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:01.227 08:28:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:01.485 [ 00:15:01.486 { 00:15:01.486 "name": "BaseBdev3", 00:15:01.486 "aliases": [ 00:15:01.486 "13d800c3-22f4-41fa-8269-21e9cd3603f5" 00:15:01.486 ], 00:15:01.486 "product_name": "Malloc disk", 00:15:01.486 "block_size": 
512, 00:15:01.486 "num_blocks": 65536, 00:15:01.486 "uuid": "13d800c3-22f4-41fa-8269-21e9cd3603f5", 00:15:01.486 "assigned_rate_limits": { 00:15:01.486 "rw_ios_per_sec": 0, 00:15:01.486 "rw_mbytes_per_sec": 0, 00:15:01.486 "r_mbytes_per_sec": 0, 00:15:01.486 "w_mbytes_per_sec": 0 00:15:01.486 }, 00:15:01.486 "claimed": true, 00:15:01.486 "claim_type": "exclusive_write", 00:15:01.486 "zoned": false, 00:15:01.486 "supported_io_types": { 00:15:01.486 "read": true, 00:15:01.486 "write": true, 00:15:01.486 "unmap": true, 00:15:01.486 "flush": true, 00:15:01.486 "reset": true, 00:15:01.486 "nvme_admin": false, 00:15:01.486 "nvme_io": false, 00:15:01.486 "nvme_io_md": false, 00:15:01.486 "write_zeroes": true, 00:15:01.486 "zcopy": true, 00:15:01.486 "get_zone_info": false, 00:15:01.486 "zone_management": false, 00:15:01.486 "zone_append": false, 00:15:01.486 "compare": false, 00:15:01.486 "compare_and_write": false, 00:15:01.486 "abort": true, 00:15:01.486 "seek_hole": false, 00:15:01.486 "seek_data": false, 00:15:01.486 "copy": true, 00:15:01.486 "nvme_iov_md": false 00:15:01.486 }, 00:15:01.486 "memory_domains": [ 00:15:01.486 { 00:15:01.486 "dma_device_id": "system", 00:15:01.486 "dma_device_type": 1 00:15:01.486 }, 00:15:01.486 { 00:15:01.486 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:01.486 "dma_device_type": 2 00:15:01.486 } 00:15:01.486 ], 00:15:01.486 "driver_specific": {} 00:15:01.486 } 00:15:01.486 ] 00:15:01.486 08:28:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:01.486 08:28:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:01.486 08:28:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:01.486 08:28:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:15:01.486 08:28:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:15:01.486 08:28:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:01.486 08:28:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:01.486 08:28:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:01.486 08:28:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:01.486 08:28:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:01.486 08:28:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:01.486 08:28:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:01.486 08:28:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:01.486 08:28:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:01.486 08:28:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:01.744 08:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:01.744 "name": "Existed_Raid", 00:15:01.744 "uuid": "f3be6fdb-c6db-49c6-8873-e157a81f97be", 00:15:01.744 "strip_size_kb": 64, 00:15:01.744 "state": "online", 00:15:01.744 "raid_level": "concat", 00:15:01.744 "superblock": false, 00:15:01.744 "num_base_bdevs": 3, 00:15:01.744 "num_base_bdevs_discovered": 3, 00:15:01.744 "num_base_bdevs_operational": 3, 00:15:01.744 "base_bdevs_list": [ 00:15:01.744 { 00:15:01.744 "name": "BaseBdev1", 00:15:01.744 "uuid": "097741f5-95fd-4ff7-9898-a7a953e73272", 00:15:01.744 "is_configured": true, 00:15:01.744 "data_offset": 0, 00:15:01.744 "data_size": 65536 00:15:01.744 }, 00:15:01.744 { 
00:15:01.744 "name": "BaseBdev2", 00:15:01.744 "uuid": "90055a88-efb2-4dd3-bfad-8bb5164e871e", 00:15:01.744 "is_configured": true, 00:15:01.744 "data_offset": 0, 00:15:01.744 "data_size": 65536 00:15:01.744 }, 00:15:01.744 { 00:15:01.744 "name": "BaseBdev3", 00:15:01.744 "uuid": "13d800c3-22f4-41fa-8269-21e9cd3603f5", 00:15:01.744 "is_configured": true, 00:15:01.744 "data_offset": 0, 00:15:01.744 "data_size": 65536 00:15:01.744 } 00:15:01.744 ] 00:15:01.744 }' 00:15:01.744 08:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:01.744 08:28:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:02.032 08:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:15:02.032 08:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:02.032 08:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:02.032 08:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:02.032 08:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:02.032 08:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:02.291 08:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:02.291 08:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:02.291 [2024-07-23 08:28:14.693051] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:02.291 08:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:02.291 "name": "Existed_Raid", 00:15:02.291 "aliases": [ 00:15:02.291 "f3be6fdb-c6db-49c6-8873-e157a81f97be" 00:15:02.291 
], 00:15:02.291 "product_name": "Raid Volume", 00:15:02.291 "block_size": 512, 00:15:02.291 "num_blocks": 196608, 00:15:02.291 "uuid": "f3be6fdb-c6db-49c6-8873-e157a81f97be", 00:15:02.291 "assigned_rate_limits": { 00:15:02.291 "rw_ios_per_sec": 0, 00:15:02.291 "rw_mbytes_per_sec": 0, 00:15:02.291 "r_mbytes_per_sec": 0, 00:15:02.291 "w_mbytes_per_sec": 0 00:15:02.291 }, 00:15:02.291 "claimed": false, 00:15:02.291 "zoned": false, 00:15:02.291 "supported_io_types": { 00:15:02.291 "read": true, 00:15:02.291 "write": true, 00:15:02.291 "unmap": true, 00:15:02.291 "flush": true, 00:15:02.292 "reset": true, 00:15:02.292 "nvme_admin": false, 00:15:02.292 "nvme_io": false, 00:15:02.292 "nvme_io_md": false, 00:15:02.292 "write_zeroes": true, 00:15:02.292 "zcopy": false, 00:15:02.292 "get_zone_info": false, 00:15:02.292 "zone_management": false, 00:15:02.292 "zone_append": false, 00:15:02.292 "compare": false, 00:15:02.292 "compare_and_write": false, 00:15:02.292 "abort": false, 00:15:02.292 "seek_hole": false, 00:15:02.292 "seek_data": false, 00:15:02.292 "copy": false, 00:15:02.292 "nvme_iov_md": false 00:15:02.292 }, 00:15:02.292 "memory_domains": [ 00:15:02.292 { 00:15:02.292 "dma_device_id": "system", 00:15:02.292 "dma_device_type": 1 00:15:02.292 }, 00:15:02.292 { 00:15:02.292 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:02.292 "dma_device_type": 2 00:15:02.292 }, 00:15:02.292 { 00:15:02.292 "dma_device_id": "system", 00:15:02.292 "dma_device_type": 1 00:15:02.292 }, 00:15:02.292 { 00:15:02.292 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:02.292 "dma_device_type": 2 00:15:02.292 }, 00:15:02.292 { 00:15:02.292 "dma_device_id": "system", 00:15:02.292 "dma_device_type": 1 00:15:02.292 }, 00:15:02.292 { 00:15:02.292 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:02.292 "dma_device_type": 2 00:15:02.292 } 00:15:02.292 ], 00:15:02.292 "driver_specific": { 00:15:02.292 "raid": { 00:15:02.292 "uuid": "f3be6fdb-c6db-49c6-8873-e157a81f97be", 00:15:02.292 "strip_size_kb": 
64, 00:15:02.292 "state": "online", 00:15:02.292 "raid_level": "concat", 00:15:02.292 "superblock": false, 00:15:02.292 "num_base_bdevs": 3, 00:15:02.292 "num_base_bdevs_discovered": 3, 00:15:02.292 "num_base_bdevs_operational": 3, 00:15:02.292 "base_bdevs_list": [ 00:15:02.292 { 00:15:02.292 "name": "BaseBdev1", 00:15:02.292 "uuid": "097741f5-95fd-4ff7-9898-a7a953e73272", 00:15:02.292 "is_configured": true, 00:15:02.292 "data_offset": 0, 00:15:02.292 "data_size": 65536 00:15:02.292 }, 00:15:02.292 { 00:15:02.292 "name": "BaseBdev2", 00:15:02.292 "uuid": "90055a88-efb2-4dd3-bfad-8bb5164e871e", 00:15:02.292 "is_configured": true, 00:15:02.292 "data_offset": 0, 00:15:02.292 "data_size": 65536 00:15:02.292 }, 00:15:02.292 { 00:15:02.292 "name": "BaseBdev3", 00:15:02.292 "uuid": "13d800c3-22f4-41fa-8269-21e9cd3603f5", 00:15:02.292 "is_configured": true, 00:15:02.292 "data_offset": 0, 00:15:02.292 "data_size": 65536 00:15:02.292 } 00:15:02.292 ] 00:15:02.292 } 00:15:02.292 } 00:15:02.292 }' 00:15:02.292 08:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:02.292 08:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:02.292 BaseBdev2 00:15:02.292 BaseBdev3' 00:15:02.292 08:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:02.292 08:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:02.292 08:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:02.551 08:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:02.551 "name": "BaseBdev1", 00:15:02.551 "aliases": [ 00:15:02.551 "097741f5-95fd-4ff7-9898-a7a953e73272" 00:15:02.551 ], 00:15:02.551 
"product_name": "Malloc disk", 00:15:02.551 "block_size": 512, 00:15:02.551 "num_blocks": 65536, 00:15:02.551 "uuid": "097741f5-95fd-4ff7-9898-a7a953e73272", 00:15:02.551 "assigned_rate_limits": { 00:15:02.551 "rw_ios_per_sec": 0, 00:15:02.551 "rw_mbytes_per_sec": 0, 00:15:02.551 "r_mbytes_per_sec": 0, 00:15:02.551 "w_mbytes_per_sec": 0 00:15:02.551 }, 00:15:02.551 "claimed": true, 00:15:02.551 "claim_type": "exclusive_write", 00:15:02.551 "zoned": false, 00:15:02.551 "supported_io_types": { 00:15:02.551 "read": true, 00:15:02.551 "write": true, 00:15:02.551 "unmap": true, 00:15:02.551 "flush": true, 00:15:02.551 "reset": true, 00:15:02.551 "nvme_admin": false, 00:15:02.551 "nvme_io": false, 00:15:02.551 "nvme_io_md": false, 00:15:02.551 "write_zeroes": true, 00:15:02.551 "zcopy": true, 00:15:02.551 "get_zone_info": false, 00:15:02.551 "zone_management": false, 00:15:02.551 "zone_append": false, 00:15:02.551 "compare": false, 00:15:02.551 "compare_and_write": false, 00:15:02.551 "abort": true, 00:15:02.551 "seek_hole": false, 00:15:02.551 "seek_data": false, 00:15:02.551 "copy": true, 00:15:02.551 "nvme_iov_md": false 00:15:02.551 }, 00:15:02.551 "memory_domains": [ 00:15:02.551 { 00:15:02.551 "dma_device_id": "system", 00:15:02.551 "dma_device_type": 1 00:15:02.551 }, 00:15:02.551 { 00:15:02.551 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:02.551 "dma_device_type": 2 00:15:02.551 } 00:15:02.551 ], 00:15:02.551 "driver_specific": {} 00:15:02.551 }' 00:15:02.551 08:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:02.551 08:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:02.551 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:02.551 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:02.551 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:02.551 
08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:02.551 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:02.809 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:02.809 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:02.809 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:02.809 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:02.809 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:02.809 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:02.809 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:02.809 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:03.068 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:03.068 "name": "BaseBdev2", 00:15:03.068 "aliases": [ 00:15:03.068 "90055a88-efb2-4dd3-bfad-8bb5164e871e" 00:15:03.068 ], 00:15:03.068 "product_name": "Malloc disk", 00:15:03.068 "block_size": 512, 00:15:03.068 "num_blocks": 65536, 00:15:03.068 "uuid": "90055a88-efb2-4dd3-bfad-8bb5164e871e", 00:15:03.068 "assigned_rate_limits": { 00:15:03.068 "rw_ios_per_sec": 0, 00:15:03.068 "rw_mbytes_per_sec": 0, 00:15:03.068 "r_mbytes_per_sec": 0, 00:15:03.068 "w_mbytes_per_sec": 0 00:15:03.068 }, 00:15:03.068 "claimed": true, 00:15:03.068 "claim_type": "exclusive_write", 00:15:03.068 "zoned": false, 00:15:03.068 "supported_io_types": { 00:15:03.068 "read": true, 00:15:03.068 "write": true, 00:15:03.068 "unmap": true, 00:15:03.068 "flush": true, 00:15:03.068 
"reset": true, 00:15:03.068 "nvme_admin": false, 00:15:03.068 "nvme_io": false, 00:15:03.068 "nvme_io_md": false, 00:15:03.068 "write_zeroes": true, 00:15:03.068 "zcopy": true, 00:15:03.068 "get_zone_info": false, 00:15:03.068 "zone_management": false, 00:15:03.068 "zone_append": false, 00:15:03.068 "compare": false, 00:15:03.068 "compare_and_write": false, 00:15:03.068 "abort": true, 00:15:03.068 "seek_hole": false, 00:15:03.068 "seek_data": false, 00:15:03.068 "copy": true, 00:15:03.068 "nvme_iov_md": false 00:15:03.068 }, 00:15:03.068 "memory_domains": [ 00:15:03.068 { 00:15:03.068 "dma_device_id": "system", 00:15:03.068 "dma_device_type": 1 00:15:03.068 }, 00:15:03.068 { 00:15:03.068 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:03.068 "dma_device_type": 2 00:15:03.068 } 00:15:03.068 ], 00:15:03.068 "driver_specific": {} 00:15:03.068 }' 00:15:03.068 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:03.068 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:03.068 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:03.068 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:03.068 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:03.068 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:03.068 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:03.068 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:03.068 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:03.068 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:03.068 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:03.327 
08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:03.327 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:03.327 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:03.327 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:03.327 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:03.327 "name": "BaseBdev3", 00:15:03.327 "aliases": [ 00:15:03.327 "13d800c3-22f4-41fa-8269-21e9cd3603f5" 00:15:03.327 ], 00:15:03.327 "product_name": "Malloc disk", 00:15:03.327 "block_size": 512, 00:15:03.327 "num_blocks": 65536, 00:15:03.327 "uuid": "13d800c3-22f4-41fa-8269-21e9cd3603f5", 00:15:03.327 "assigned_rate_limits": { 00:15:03.327 "rw_ios_per_sec": 0, 00:15:03.327 "rw_mbytes_per_sec": 0, 00:15:03.327 "r_mbytes_per_sec": 0, 00:15:03.327 "w_mbytes_per_sec": 0 00:15:03.327 }, 00:15:03.327 "claimed": true, 00:15:03.327 "claim_type": "exclusive_write", 00:15:03.327 "zoned": false, 00:15:03.327 "supported_io_types": { 00:15:03.327 "read": true, 00:15:03.327 "write": true, 00:15:03.327 "unmap": true, 00:15:03.327 "flush": true, 00:15:03.327 "reset": true, 00:15:03.327 "nvme_admin": false, 00:15:03.327 "nvme_io": false, 00:15:03.327 "nvme_io_md": false, 00:15:03.327 "write_zeroes": true, 00:15:03.327 "zcopy": true, 00:15:03.327 "get_zone_info": false, 00:15:03.327 "zone_management": false, 00:15:03.327 "zone_append": false, 00:15:03.327 "compare": false, 00:15:03.327 "compare_and_write": false, 00:15:03.327 "abort": true, 00:15:03.327 "seek_hole": false, 00:15:03.327 "seek_data": false, 00:15:03.327 "copy": true, 00:15:03.327 "nvme_iov_md": false 00:15:03.327 }, 00:15:03.327 "memory_domains": [ 00:15:03.327 { 00:15:03.327 "dma_device_id": "system", 
00:15:03.327 "dma_device_type": 1 00:15:03.327 }, 00:15:03.327 { 00:15:03.327 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:03.327 "dma_device_type": 2 00:15:03.327 } 00:15:03.327 ], 00:15:03.327 "driver_specific": {} 00:15:03.327 }' 00:15:03.327 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:03.327 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:03.327 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:03.327 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:03.586 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:03.586 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:03.586 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:03.586 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:03.586 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:03.586 08:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:03.586 08:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:03.586 08:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:03.586 08:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:03.845 [2024-07-23 08:28:16.205025] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:03.845 [2024-07-23 08:28:16.205054] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:03.845 [2024-07-23 08:28:16.205106] bdev_raid.c: 
486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:03.845 08:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:03.845 08:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:15:03.845 08:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:03.845 08:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:03.845 08:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:15:03.845 08:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:15:03.845 08:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:03.845 08:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:15:03.845 08:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:03.845 08:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:03.845 08:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:03.845 08:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:03.845 08:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:03.845 08:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:03.845 08:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:03.845 08:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:03.845 08:28:16 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:04.104 08:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:04.104 "name": "Existed_Raid", 00:15:04.104 "uuid": "f3be6fdb-c6db-49c6-8873-e157a81f97be", 00:15:04.104 "strip_size_kb": 64, 00:15:04.104 "state": "offline", 00:15:04.104 "raid_level": "concat", 00:15:04.104 "superblock": false, 00:15:04.104 "num_base_bdevs": 3, 00:15:04.104 "num_base_bdevs_discovered": 2, 00:15:04.104 "num_base_bdevs_operational": 2, 00:15:04.104 "base_bdevs_list": [ 00:15:04.104 { 00:15:04.104 "name": null, 00:15:04.104 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:04.104 "is_configured": false, 00:15:04.104 "data_offset": 0, 00:15:04.104 "data_size": 65536 00:15:04.104 }, 00:15:04.104 { 00:15:04.104 "name": "BaseBdev2", 00:15:04.104 "uuid": "90055a88-efb2-4dd3-bfad-8bb5164e871e", 00:15:04.104 "is_configured": true, 00:15:04.104 "data_offset": 0, 00:15:04.104 "data_size": 65536 00:15:04.104 }, 00:15:04.104 { 00:15:04.104 "name": "BaseBdev3", 00:15:04.104 "uuid": "13d800c3-22f4-41fa-8269-21e9cd3603f5", 00:15:04.104 "is_configured": true, 00:15:04.104 "data_offset": 0, 00:15:04.104 "data_size": 65536 00:15:04.104 } 00:15:04.104 ] 00:15:04.104 }' 00:15:04.104 08:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:04.104 08:28:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:04.363 08:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:04.363 08:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:04.363 08:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:04.363 08:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 
00:15:04.621 08:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:04.621 08:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:04.621 08:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:04.880 [2024-07-23 08:28:17.181956] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:04.880 08:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:04.880 08:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:04.880 08:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:04.880 08:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:05.140 08:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:05.140 08:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:05.140 08:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:05.140 [2024-07-23 08:28:17.628069] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:05.140 [2024-07-23 08:28:17.628121] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000035180 name Existed_Raid, state offline 00:15:05.398 08:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:05.399 08:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:05.399 
08:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:05.399 08:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:05.399 08:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:05.399 08:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:05.399 08:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:15:05.399 08:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:15:05.399 08:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:05.399 08:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:05.656 BaseBdev2 00:15:05.656 08:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:15:05.656 08:28:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:05.656 08:28:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:05.656 08:28:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:05.656 08:28:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:05.656 08:28:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:05.656 08:28:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:05.920 08:28:18 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:05.920 [ 00:15:05.920 { 00:15:05.920 "name": "BaseBdev2", 00:15:05.920 "aliases": [ 00:15:05.920 "de165996-b9e9-4cbc-85fc-67983bf76c14" 00:15:05.920 ], 00:15:05.920 "product_name": "Malloc disk", 00:15:05.920 "block_size": 512, 00:15:05.920 "num_blocks": 65536, 00:15:05.920 "uuid": "de165996-b9e9-4cbc-85fc-67983bf76c14", 00:15:05.920 "assigned_rate_limits": { 00:15:05.920 "rw_ios_per_sec": 0, 00:15:05.920 "rw_mbytes_per_sec": 0, 00:15:05.920 "r_mbytes_per_sec": 0, 00:15:05.920 "w_mbytes_per_sec": 0 00:15:05.920 }, 00:15:05.920 "claimed": false, 00:15:05.920 "zoned": false, 00:15:05.920 "supported_io_types": { 00:15:05.920 "read": true, 00:15:05.920 "write": true, 00:15:05.920 "unmap": true, 00:15:05.920 "flush": true, 00:15:05.920 "reset": true, 00:15:05.920 "nvme_admin": false, 00:15:05.920 "nvme_io": false, 00:15:05.920 "nvme_io_md": false, 00:15:05.920 "write_zeroes": true, 00:15:05.920 "zcopy": true, 00:15:05.920 "get_zone_info": false, 00:15:05.920 "zone_management": false, 00:15:05.920 "zone_append": false, 00:15:05.920 "compare": false, 00:15:05.920 "compare_and_write": false, 00:15:05.920 "abort": true, 00:15:05.920 "seek_hole": false, 00:15:05.920 "seek_data": false, 00:15:05.920 "copy": true, 00:15:05.920 "nvme_iov_md": false 00:15:05.920 }, 00:15:05.920 "memory_domains": [ 00:15:05.920 { 00:15:05.920 "dma_device_id": "system", 00:15:05.920 "dma_device_type": 1 00:15:05.920 }, 00:15:05.920 { 00:15:05.920 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:05.920 "dma_device_type": 2 00:15:05.920 } 00:15:05.920 ], 00:15:05.920 "driver_specific": {} 00:15:05.920 } 00:15:05.920 ] 00:15:05.921 08:28:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:05.921 08:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:05.921 08:28:18 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:05.921 08:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:06.181 BaseBdev3 00:15:06.181 08:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:15:06.181 08:28:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:06.181 08:28:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:06.181 08:28:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:06.181 08:28:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:06.181 08:28:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:06.181 08:28:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:06.440 08:28:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:06.698 [ 00:15:06.698 { 00:15:06.698 "name": "BaseBdev3", 00:15:06.698 "aliases": [ 00:15:06.698 "ff659a99-4a83-4d45-af44-c292466dcbae" 00:15:06.698 ], 00:15:06.698 "product_name": "Malloc disk", 00:15:06.698 "block_size": 512, 00:15:06.698 "num_blocks": 65536, 00:15:06.698 "uuid": "ff659a99-4a83-4d45-af44-c292466dcbae", 00:15:06.698 "assigned_rate_limits": { 00:15:06.698 "rw_ios_per_sec": 0, 00:15:06.698 "rw_mbytes_per_sec": 0, 00:15:06.698 "r_mbytes_per_sec": 0, 00:15:06.698 "w_mbytes_per_sec": 0 00:15:06.698 }, 00:15:06.698 "claimed": false, 00:15:06.698 
"zoned": false, 00:15:06.698 "supported_io_types": { 00:15:06.698 "read": true, 00:15:06.698 "write": true, 00:15:06.698 "unmap": true, 00:15:06.698 "flush": true, 00:15:06.698 "reset": true, 00:15:06.698 "nvme_admin": false, 00:15:06.698 "nvme_io": false, 00:15:06.698 "nvme_io_md": false, 00:15:06.698 "write_zeroes": true, 00:15:06.698 "zcopy": true, 00:15:06.698 "get_zone_info": false, 00:15:06.698 "zone_management": false, 00:15:06.698 "zone_append": false, 00:15:06.698 "compare": false, 00:15:06.698 "compare_and_write": false, 00:15:06.698 "abort": true, 00:15:06.698 "seek_hole": false, 00:15:06.698 "seek_data": false, 00:15:06.698 "copy": true, 00:15:06.698 "nvme_iov_md": false 00:15:06.698 }, 00:15:06.698 "memory_domains": [ 00:15:06.698 { 00:15:06.698 "dma_device_id": "system", 00:15:06.698 "dma_device_type": 1 00:15:06.698 }, 00:15:06.698 { 00:15:06.698 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:06.698 "dma_device_type": 2 00:15:06.698 } 00:15:06.698 ], 00:15:06.698 "driver_specific": {} 00:15:06.698 } 00:15:06.698 ] 00:15:06.698 08:28:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:06.699 08:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:06.699 08:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:06.699 08:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:06.699 [2024-07-23 08:28:19.129539] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:06.699 [2024-07-23 08:28:19.129579] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:06.699 [2024-07-23 08:28:19.129628] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is 
claimed 00:15:06.699 [2024-07-23 08:28:19.131243] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:06.699 08:28:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:06.699 08:28:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:06.699 08:28:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:06.699 08:28:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:06.699 08:28:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:06.699 08:28:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:06.699 08:28:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:06.699 08:28:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:06.699 08:28:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:06.699 08:28:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:06.699 08:28:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:06.699 08:28:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:06.958 08:28:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:06.958 "name": "Existed_Raid", 00:15:06.958 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:06.958 "strip_size_kb": 64, 00:15:06.958 "state": "configuring", 00:15:06.958 "raid_level": "concat", 00:15:06.958 "superblock": false, 00:15:06.958 
"num_base_bdevs": 3, 00:15:06.958 "num_base_bdevs_discovered": 2, 00:15:06.958 "num_base_bdevs_operational": 3, 00:15:06.958 "base_bdevs_list": [ 00:15:06.958 { 00:15:06.958 "name": "BaseBdev1", 00:15:06.958 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:06.958 "is_configured": false, 00:15:06.958 "data_offset": 0, 00:15:06.958 "data_size": 0 00:15:06.958 }, 00:15:06.958 { 00:15:06.958 "name": "BaseBdev2", 00:15:06.958 "uuid": "de165996-b9e9-4cbc-85fc-67983bf76c14", 00:15:06.958 "is_configured": true, 00:15:06.958 "data_offset": 0, 00:15:06.958 "data_size": 65536 00:15:06.959 }, 00:15:06.959 { 00:15:06.959 "name": "BaseBdev3", 00:15:06.959 "uuid": "ff659a99-4a83-4d45-af44-c292466dcbae", 00:15:06.959 "is_configured": true, 00:15:06.959 "data_offset": 0, 00:15:06.959 "data_size": 65536 00:15:06.959 } 00:15:06.959 ] 00:15:06.959 }' 00:15:06.959 08:28:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:06.959 08:28:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:07.525 08:28:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:07.525 [2024-07-23 08:28:19.971754] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:07.525 08:28:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:07.525 08:28:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:07.525 08:28:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:07.525 08:28:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:07.525 08:28:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:07.525 08:28:19 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:07.525 08:28:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:07.525 08:28:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:07.525 08:28:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:07.526 08:28:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:07.526 08:28:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:07.526 08:28:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:07.784 08:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:07.784 "name": "Existed_Raid", 00:15:07.784 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:07.784 "strip_size_kb": 64, 00:15:07.784 "state": "configuring", 00:15:07.784 "raid_level": "concat", 00:15:07.784 "superblock": false, 00:15:07.784 "num_base_bdevs": 3, 00:15:07.784 "num_base_bdevs_discovered": 1, 00:15:07.784 "num_base_bdevs_operational": 3, 00:15:07.784 "base_bdevs_list": [ 00:15:07.784 { 00:15:07.784 "name": "BaseBdev1", 00:15:07.784 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:07.784 "is_configured": false, 00:15:07.784 "data_offset": 0, 00:15:07.784 "data_size": 0 00:15:07.784 }, 00:15:07.784 { 00:15:07.784 "name": null, 00:15:07.784 "uuid": "de165996-b9e9-4cbc-85fc-67983bf76c14", 00:15:07.784 "is_configured": false, 00:15:07.784 "data_offset": 0, 00:15:07.784 "data_size": 65536 00:15:07.784 }, 00:15:07.784 { 00:15:07.784 "name": "BaseBdev3", 00:15:07.784 "uuid": "ff659a99-4a83-4d45-af44-c292466dcbae", 00:15:07.784 "is_configured": true, 00:15:07.784 "data_offset": 0, 
00:15:07.784 "data_size": 65536 00:15:07.784 } 00:15:07.784 ] 00:15:07.784 }' 00:15:07.784 08:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:07.784 08:28:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:08.351 08:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:08.351 08:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:08.351 08:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:15:08.351 08:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:08.609 [2024-07-23 08:28:21.050352] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:08.609 BaseBdev1 00:15:08.609 08:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:15:08.609 08:28:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:08.609 08:28:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:08.609 08:28:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:08.609 08:28:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:08.609 08:28:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:08.609 08:28:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:08.868 08:28:21 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:09.127 [ 00:15:09.127 { 00:15:09.127 "name": "BaseBdev1", 00:15:09.127 "aliases": [ 00:15:09.127 "a2d4430d-e035-4309-ab47-968bf97f74a9" 00:15:09.127 ], 00:15:09.127 "product_name": "Malloc disk", 00:15:09.127 "block_size": 512, 00:15:09.127 "num_blocks": 65536, 00:15:09.127 "uuid": "a2d4430d-e035-4309-ab47-968bf97f74a9", 00:15:09.127 "assigned_rate_limits": { 00:15:09.127 "rw_ios_per_sec": 0, 00:15:09.127 "rw_mbytes_per_sec": 0, 00:15:09.127 "r_mbytes_per_sec": 0, 00:15:09.127 "w_mbytes_per_sec": 0 00:15:09.127 }, 00:15:09.127 "claimed": true, 00:15:09.127 "claim_type": "exclusive_write", 00:15:09.127 "zoned": false, 00:15:09.127 "supported_io_types": { 00:15:09.127 "read": true, 00:15:09.127 "write": true, 00:15:09.127 "unmap": true, 00:15:09.127 "flush": true, 00:15:09.127 "reset": true, 00:15:09.127 "nvme_admin": false, 00:15:09.127 "nvme_io": false, 00:15:09.127 "nvme_io_md": false, 00:15:09.127 "write_zeroes": true, 00:15:09.127 "zcopy": true, 00:15:09.127 "get_zone_info": false, 00:15:09.127 "zone_management": false, 00:15:09.127 "zone_append": false, 00:15:09.127 "compare": false, 00:15:09.127 "compare_and_write": false, 00:15:09.127 "abort": true, 00:15:09.127 "seek_hole": false, 00:15:09.127 "seek_data": false, 00:15:09.127 "copy": true, 00:15:09.127 "nvme_iov_md": false 00:15:09.127 }, 00:15:09.127 "memory_domains": [ 00:15:09.127 { 00:15:09.127 "dma_device_id": "system", 00:15:09.127 "dma_device_type": 1 00:15:09.127 }, 00:15:09.127 { 00:15:09.127 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:09.127 "dma_device_type": 2 00:15:09.127 } 00:15:09.128 ], 00:15:09.128 "driver_specific": {} 00:15:09.128 } 00:15:09.128 ] 00:15:09.128 08:28:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:09.128 08:28:21 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:09.128 08:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:09.128 08:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:09.128 08:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:09.128 08:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:09.128 08:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:09.128 08:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:09.128 08:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:09.128 08:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:09.128 08:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:09.128 08:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:09.128 08:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:09.128 08:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:09.128 "name": "Existed_Raid", 00:15:09.128 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:09.128 "strip_size_kb": 64, 00:15:09.128 "state": "configuring", 00:15:09.128 "raid_level": "concat", 00:15:09.128 "superblock": false, 00:15:09.128 "num_base_bdevs": 3, 00:15:09.128 "num_base_bdevs_discovered": 2, 00:15:09.128 "num_base_bdevs_operational": 3, 00:15:09.128 "base_bdevs_list": [ 00:15:09.128 { 
00:15:09.128 "name": "BaseBdev1", 00:15:09.128 "uuid": "a2d4430d-e035-4309-ab47-968bf97f74a9", 00:15:09.128 "is_configured": true, 00:15:09.128 "data_offset": 0, 00:15:09.128 "data_size": 65536 00:15:09.128 }, 00:15:09.128 { 00:15:09.128 "name": null, 00:15:09.128 "uuid": "de165996-b9e9-4cbc-85fc-67983bf76c14", 00:15:09.128 "is_configured": false, 00:15:09.128 "data_offset": 0, 00:15:09.128 "data_size": 65536 00:15:09.128 }, 00:15:09.128 { 00:15:09.128 "name": "BaseBdev3", 00:15:09.128 "uuid": "ff659a99-4a83-4d45-af44-c292466dcbae", 00:15:09.128 "is_configured": true, 00:15:09.128 "data_offset": 0, 00:15:09.128 "data_size": 65536 00:15:09.128 } 00:15:09.128 ] 00:15:09.128 }' 00:15:09.128 08:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:09.128 08:28:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:09.695 08:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:09.695 08:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:09.695 08:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:15:09.695 08:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:15:09.954 [2024-07-23 08:28:22.361913] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:09.954 08:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:09.954 08:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:09.954 08:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 
-- # local expected_state=configuring 00:15:09.954 08:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:09.954 08:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:09.954 08:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:09.954 08:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:09.954 08:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:09.954 08:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:09.954 08:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:09.954 08:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:09.954 08:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:10.212 08:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:10.212 "name": "Existed_Raid", 00:15:10.212 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:10.212 "strip_size_kb": 64, 00:15:10.212 "state": "configuring", 00:15:10.212 "raid_level": "concat", 00:15:10.212 "superblock": false, 00:15:10.212 "num_base_bdevs": 3, 00:15:10.212 "num_base_bdevs_discovered": 1, 00:15:10.212 "num_base_bdevs_operational": 3, 00:15:10.212 "base_bdevs_list": [ 00:15:10.212 { 00:15:10.212 "name": "BaseBdev1", 00:15:10.212 "uuid": "a2d4430d-e035-4309-ab47-968bf97f74a9", 00:15:10.212 "is_configured": true, 00:15:10.212 "data_offset": 0, 00:15:10.212 "data_size": 65536 00:15:10.212 }, 00:15:10.212 { 00:15:10.212 "name": null, 00:15:10.212 "uuid": "de165996-b9e9-4cbc-85fc-67983bf76c14", 00:15:10.212 
"is_configured": false, 00:15:10.212 "data_offset": 0, 00:15:10.212 "data_size": 65536 00:15:10.212 }, 00:15:10.212 { 00:15:10.212 "name": null, 00:15:10.212 "uuid": "ff659a99-4a83-4d45-af44-c292466dcbae", 00:15:10.213 "is_configured": false, 00:15:10.213 "data_offset": 0, 00:15:10.213 "data_size": 65536 00:15:10.213 } 00:15:10.213 ] 00:15:10.213 }' 00:15:10.213 08:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:10.213 08:28:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:10.780 08:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:10.780 08:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:10.780 08:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:15:10.780 08:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:15:11.038 [2024-07-23 08:28:23.364562] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:11.038 08:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:11.038 08:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:11.038 08:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:11.038 08:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:11.038 08:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:11.038 08:28:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:11.038 08:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:11.038 08:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:11.038 08:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:11.038 08:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:11.038 08:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:11.038 08:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:11.038 08:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:11.038 "name": "Existed_Raid", 00:15:11.038 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:11.038 "strip_size_kb": 64, 00:15:11.038 "state": "configuring", 00:15:11.038 "raid_level": "concat", 00:15:11.038 "superblock": false, 00:15:11.038 "num_base_bdevs": 3, 00:15:11.038 "num_base_bdevs_discovered": 2, 00:15:11.038 "num_base_bdevs_operational": 3, 00:15:11.038 "base_bdevs_list": [ 00:15:11.038 { 00:15:11.038 "name": "BaseBdev1", 00:15:11.038 "uuid": "a2d4430d-e035-4309-ab47-968bf97f74a9", 00:15:11.038 "is_configured": true, 00:15:11.038 "data_offset": 0, 00:15:11.038 "data_size": 65536 00:15:11.038 }, 00:15:11.038 { 00:15:11.038 "name": null, 00:15:11.038 "uuid": "de165996-b9e9-4cbc-85fc-67983bf76c14", 00:15:11.038 "is_configured": false, 00:15:11.038 "data_offset": 0, 00:15:11.038 "data_size": 65536 00:15:11.038 }, 00:15:11.038 { 00:15:11.038 "name": "BaseBdev3", 00:15:11.038 "uuid": "ff659a99-4a83-4d45-af44-c292466dcbae", 00:15:11.038 "is_configured": true, 00:15:11.038 "data_offset": 0, 
00:15:11.038 "data_size": 65536 00:15:11.038 } 00:15:11.038 ] 00:15:11.038 }' 00:15:11.038 08:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:11.038 08:28:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:11.603 08:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:11.603 08:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:11.861 08:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:15:11.861 08:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:11.861 [2024-07-23 08:28:24.351201] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:12.120 08:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:12.120 08:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:12.120 08:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:12.120 08:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:12.120 08:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:12.120 08:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:12.120 08:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:12.120 08:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:12.120 
08:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:12.120 08:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:12.120 08:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:12.120 08:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:12.378 08:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:12.378 "name": "Existed_Raid", 00:15:12.378 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:12.378 "strip_size_kb": 64, 00:15:12.378 "state": "configuring", 00:15:12.378 "raid_level": "concat", 00:15:12.378 "superblock": false, 00:15:12.378 "num_base_bdevs": 3, 00:15:12.378 "num_base_bdevs_discovered": 1, 00:15:12.378 "num_base_bdevs_operational": 3, 00:15:12.378 "base_bdevs_list": [ 00:15:12.378 { 00:15:12.378 "name": null, 00:15:12.378 "uuid": "a2d4430d-e035-4309-ab47-968bf97f74a9", 00:15:12.378 "is_configured": false, 00:15:12.378 "data_offset": 0, 00:15:12.378 "data_size": 65536 00:15:12.378 }, 00:15:12.378 { 00:15:12.378 "name": null, 00:15:12.379 "uuid": "de165996-b9e9-4cbc-85fc-67983bf76c14", 00:15:12.379 "is_configured": false, 00:15:12.379 "data_offset": 0, 00:15:12.379 "data_size": 65536 00:15:12.379 }, 00:15:12.379 { 00:15:12.379 "name": "BaseBdev3", 00:15:12.379 "uuid": "ff659a99-4a83-4d45-af44-c292466dcbae", 00:15:12.379 "is_configured": true, 00:15:12.379 "data_offset": 0, 00:15:12.379 "data_size": 65536 00:15:12.379 } 00:15:12.379 ] 00:15:12.379 }' 00:15:12.379 08:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:12.379 08:28:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:12.637 08:28:25 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:12.637 08:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:12.896 08:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:15:12.896 08:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:15:13.155 [2024-07-23 08:28:25.462770] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:13.155 08:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:13.155 08:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:13.155 08:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:13.155 08:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:13.155 08:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:13.155 08:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:13.155 08:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:13.155 08:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:13.155 08:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:13.155 08:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:13.155 08:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:15:13.155 08:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:13.155 08:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:13.155 "name": "Existed_Raid", 00:15:13.155 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:13.155 "strip_size_kb": 64, 00:15:13.155 "state": "configuring", 00:15:13.155 "raid_level": "concat", 00:15:13.155 "superblock": false, 00:15:13.155 "num_base_bdevs": 3, 00:15:13.155 "num_base_bdevs_discovered": 2, 00:15:13.155 "num_base_bdevs_operational": 3, 00:15:13.155 "base_bdevs_list": [ 00:15:13.155 { 00:15:13.155 "name": null, 00:15:13.155 "uuid": "a2d4430d-e035-4309-ab47-968bf97f74a9", 00:15:13.155 "is_configured": false, 00:15:13.155 "data_offset": 0, 00:15:13.155 "data_size": 65536 00:15:13.155 }, 00:15:13.155 { 00:15:13.155 "name": "BaseBdev2", 00:15:13.155 "uuid": "de165996-b9e9-4cbc-85fc-67983bf76c14", 00:15:13.155 "is_configured": true, 00:15:13.155 "data_offset": 0, 00:15:13.155 "data_size": 65536 00:15:13.155 }, 00:15:13.155 { 00:15:13.155 "name": "BaseBdev3", 00:15:13.155 "uuid": "ff659a99-4a83-4d45-af44-c292466dcbae", 00:15:13.155 "is_configured": true, 00:15:13.155 "data_offset": 0, 00:15:13.155 "data_size": 65536 00:15:13.155 } 00:15:13.155 ] 00:15:13.155 }' 00:15:13.155 08:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:13.155 08:28:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:13.723 08:28:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:13.723 08:28:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:13.982 08:28:26 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:15:13.982 08:28:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:13.982 08:28:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:13.982 08:28:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u a2d4430d-e035-4309-ab47-968bf97f74a9 00:15:14.240 [2024-07-23 08:28:26.646508] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:14.240 [2024-07-23 08:28:26.646549] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000036980 00:15:14.240 [2024-07-23 08:28:26.646559] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:15:14.240 [2024-07-23 08:28:26.646810] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c200 00:15:14.240 [2024-07-23 08:28:26.646974] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000036980 00:15:14.240 [2024-07-23 08:28:26.646983] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000036980 00:15:14.240 [2024-07-23 08:28:26.647223] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:14.240 NewBaseBdev 00:15:14.240 08:28:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:15:14.240 08:28:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:15:14.240 08:28:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:14.240 08:28:26 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:14.240 08:28:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:14.240 08:28:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:14.240 08:28:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:14.498 08:28:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:14.498 [ 00:15:14.498 { 00:15:14.498 "name": "NewBaseBdev", 00:15:14.498 "aliases": [ 00:15:14.498 "a2d4430d-e035-4309-ab47-968bf97f74a9" 00:15:14.498 ], 00:15:14.498 "product_name": "Malloc disk", 00:15:14.498 "block_size": 512, 00:15:14.498 "num_blocks": 65536, 00:15:14.498 "uuid": "a2d4430d-e035-4309-ab47-968bf97f74a9", 00:15:14.498 "assigned_rate_limits": { 00:15:14.498 "rw_ios_per_sec": 0, 00:15:14.498 "rw_mbytes_per_sec": 0, 00:15:14.498 "r_mbytes_per_sec": 0, 00:15:14.498 "w_mbytes_per_sec": 0 00:15:14.498 }, 00:15:14.498 "claimed": true, 00:15:14.498 "claim_type": "exclusive_write", 00:15:14.498 "zoned": false, 00:15:14.498 "supported_io_types": { 00:15:14.498 "read": true, 00:15:14.498 "write": true, 00:15:14.498 "unmap": true, 00:15:14.498 "flush": true, 00:15:14.498 "reset": true, 00:15:14.498 "nvme_admin": false, 00:15:14.498 "nvme_io": false, 00:15:14.498 "nvme_io_md": false, 00:15:14.498 "write_zeroes": true, 00:15:14.498 "zcopy": true, 00:15:14.498 "get_zone_info": false, 00:15:14.498 "zone_management": false, 00:15:14.498 "zone_append": false, 00:15:14.498 "compare": false, 00:15:14.498 "compare_and_write": false, 00:15:14.498 "abort": true, 00:15:14.498 "seek_hole": false, 00:15:14.498 "seek_data": false, 00:15:14.499 "copy": 
true, 00:15:14.499 "nvme_iov_md": false 00:15:14.499 }, 00:15:14.499 "memory_domains": [ 00:15:14.499 { 00:15:14.499 "dma_device_id": "system", 00:15:14.499 "dma_device_type": 1 00:15:14.499 }, 00:15:14.499 { 00:15:14.499 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:14.499 "dma_device_type": 2 00:15:14.499 } 00:15:14.499 ], 00:15:14.499 "driver_specific": {} 00:15:14.499 } 00:15:14.499 ] 00:15:14.499 08:28:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:14.499 08:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:15:14.499 08:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:14.499 08:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:14.499 08:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:14.499 08:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:14.499 08:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:14.499 08:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:14.499 08:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:14.499 08:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:14.499 08:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:14.499 08:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:14.499 08:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:14.758 
08:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:14.758 "name": "Existed_Raid", 00:15:14.758 "uuid": "a4cee085-0b7c-4fab-84c3-e6720f23ec8f", 00:15:14.758 "strip_size_kb": 64, 00:15:14.758 "state": "online", 00:15:14.758 "raid_level": "concat", 00:15:14.758 "superblock": false, 00:15:14.758 "num_base_bdevs": 3, 00:15:14.758 "num_base_bdevs_discovered": 3, 00:15:14.758 "num_base_bdevs_operational": 3, 00:15:14.758 "base_bdevs_list": [ 00:15:14.758 { 00:15:14.758 "name": "NewBaseBdev", 00:15:14.758 "uuid": "a2d4430d-e035-4309-ab47-968bf97f74a9", 00:15:14.758 "is_configured": true, 00:15:14.758 "data_offset": 0, 00:15:14.758 "data_size": 65536 00:15:14.758 }, 00:15:14.758 { 00:15:14.758 "name": "BaseBdev2", 00:15:14.758 "uuid": "de165996-b9e9-4cbc-85fc-67983bf76c14", 00:15:14.758 "is_configured": true, 00:15:14.758 "data_offset": 0, 00:15:14.758 "data_size": 65536 00:15:14.758 }, 00:15:14.758 { 00:15:14.758 "name": "BaseBdev3", 00:15:14.758 "uuid": "ff659a99-4a83-4d45-af44-c292466dcbae", 00:15:14.758 "is_configured": true, 00:15:14.758 "data_offset": 0, 00:15:14.758 "data_size": 65536 00:15:14.758 } 00:15:14.758 ] 00:15:14.758 }' 00:15:14.758 08:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:14.758 08:28:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:15.326 08:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:15:15.326 08:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:15.326 08:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:15.326 08:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:15.326 08:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:15.326 08:28:27 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:15.326 08:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:15.326 08:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:15.326 [2024-07-23 08:28:27.817943] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:15.326 08:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:15.326 "name": "Existed_Raid", 00:15:15.326 "aliases": [ 00:15:15.326 "a4cee085-0b7c-4fab-84c3-e6720f23ec8f" 00:15:15.326 ], 00:15:15.326 "product_name": "Raid Volume", 00:15:15.326 "block_size": 512, 00:15:15.326 "num_blocks": 196608, 00:15:15.326 "uuid": "a4cee085-0b7c-4fab-84c3-e6720f23ec8f", 00:15:15.326 "assigned_rate_limits": { 00:15:15.326 "rw_ios_per_sec": 0, 00:15:15.326 "rw_mbytes_per_sec": 0, 00:15:15.326 "r_mbytes_per_sec": 0, 00:15:15.326 "w_mbytes_per_sec": 0 00:15:15.326 }, 00:15:15.326 "claimed": false, 00:15:15.326 "zoned": false, 00:15:15.326 "supported_io_types": { 00:15:15.326 "read": true, 00:15:15.326 "write": true, 00:15:15.326 "unmap": true, 00:15:15.326 "flush": true, 00:15:15.326 "reset": true, 00:15:15.326 "nvme_admin": false, 00:15:15.326 "nvme_io": false, 00:15:15.326 "nvme_io_md": false, 00:15:15.326 "write_zeroes": true, 00:15:15.326 "zcopy": false, 00:15:15.326 "get_zone_info": false, 00:15:15.326 "zone_management": false, 00:15:15.326 "zone_append": false, 00:15:15.326 "compare": false, 00:15:15.326 "compare_and_write": false, 00:15:15.326 "abort": false, 00:15:15.326 "seek_hole": false, 00:15:15.326 "seek_data": false, 00:15:15.326 "copy": false, 00:15:15.326 "nvme_iov_md": false 00:15:15.326 }, 00:15:15.326 "memory_domains": [ 00:15:15.326 { 00:15:15.326 "dma_device_id": "system", 00:15:15.326 "dma_device_type": 1 00:15:15.326 }, 
00:15:15.326 { 00:15:15.326 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:15.326 "dma_device_type": 2 00:15:15.326 }, 00:15:15.326 { 00:15:15.326 "dma_device_id": "system", 00:15:15.326 "dma_device_type": 1 00:15:15.326 }, 00:15:15.326 { 00:15:15.326 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:15.326 "dma_device_type": 2 00:15:15.326 }, 00:15:15.326 { 00:15:15.326 "dma_device_id": "system", 00:15:15.326 "dma_device_type": 1 00:15:15.326 }, 00:15:15.326 { 00:15:15.326 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:15.326 "dma_device_type": 2 00:15:15.326 } 00:15:15.326 ], 00:15:15.326 "driver_specific": { 00:15:15.326 "raid": { 00:15:15.326 "uuid": "a4cee085-0b7c-4fab-84c3-e6720f23ec8f", 00:15:15.326 "strip_size_kb": 64, 00:15:15.326 "state": "online", 00:15:15.326 "raid_level": "concat", 00:15:15.326 "superblock": false, 00:15:15.326 "num_base_bdevs": 3, 00:15:15.326 "num_base_bdevs_discovered": 3, 00:15:15.326 "num_base_bdevs_operational": 3, 00:15:15.326 "base_bdevs_list": [ 00:15:15.326 { 00:15:15.326 "name": "NewBaseBdev", 00:15:15.326 "uuid": "a2d4430d-e035-4309-ab47-968bf97f74a9", 00:15:15.326 "is_configured": true, 00:15:15.326 "data_offset": 0, 00:15:15.326 "data_size": 65536 00:15:15.326 }, 00:15:15.326 { 00:15:15.326 "name": "BaseBdev2", 00:15:15.326 "uuid": "de165996-b9e9-4cbc-85fc-67983bf76c14", 00:15:15.326 "is_configured": true, 00:15:15.326 "data_offset": 0, 00:15:15.326 "data_size": 65536 00:15:15.326 }, 00:15:15.326 { 00:15:15.326 "name": "BaseBdev3", 00:15:15.326 "uuid": "ff659a99-4a83-4d45-af44-c292466dcbae", 00:15:15.326 "is_configured": true, 00:15:15.326 "data_offset": 0, 00:15:15.327 "data_size": 65536 00:15:15.327 } 00:15:15.327 ] 00:15:15.327 } 00:15:15.327 } 00:15:15.327 }' 00:15:15.327 08:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:15.585 08:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='NewBaseBdev 00:15:15.585 BaseBdev2 00:15:15.585 BaseBdev3' 00:15:15.585 08:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:15.585 08:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:15.585 08:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:15.585 08:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:15.585 "name": "NewBaseBdev", 00:15:15.585 "aliases": [ 00:15:15.585 "a2d4430d-e035-4309-ab47-968bf97f74a9" 00:15:15.585 ], 00:15:15.585 "product_name": "Malloc disk", 00:15:15.585 "block_size": 512, 00:15:15.585 "num_blocks": 65536, 00:15:15.585 "uuid": "a2d4430d-e035-4309-ab47-968bf97f74a9", 00:15:15.585 "assigned_rate_limits": { 00:15:15.585 "rw_ios_per_sec": 0, 00:15:15.585 "rw_mbytes_per_sec": 0, 00:15:15.585 "r_mbytes_per_sec": 0, 00:15:15.585 "w_mbytes_per_sec": 0 00:15:15.585 }, 00:15:15.585 "claimed": true, 00:15:15.585 "claim_type": "exclusive_write", 00:15:15.585 "zoned": false, 00:15:15.585 "supported_io_types": { 00:15:15.585 "read": true, 00:15:15.585 "write": true, 00:15:15.585 "unmap": true, 00:15:15.585 "flush": true, 00:15:15.585 "reset": true, 00:15:15.585 "nvme_admin": false, 00:15:15.585 "nvme_io": false, 00:15:15.585 "nvme_io_md": false, 00:15:15.585 "write_zeroes": true, 00:15:15.585 "zcopy": true, 00:15:15.585 "get_zone_info": false, 00:15:15.585 "zone_management": false, 00:15:15.585 "zone_append": false, 00:15:15.585 "compare": false, 00:15:15.585 "compare_and_write": false, 00:15:15.585 "abort": true, 00:15:15.585 "seek_hole": false, 00:15:15.585 "seek_data": false, 00:15:15.585 "copy": true, 00:15:15.585 "nvme_iov_md": false 00:15:15.585 }, 00:15:15.585 "memory_domains": [ 00:15:15.585 { 00:15:15.585 "dma_device_id": "system", 00:15:15.585 
"dma_device_type": 1 00:15:15.585 }, 00:15:15.585 { 00:15:15.585 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:15.585 "dma_device_type": 2 00:15:15.585 } 00:15:15.585 ], 00:15:15.585 "driver_specific": {} 00:15:15.585 }' 00:15:15.585 08:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:15.585 08:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:15.844 08:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:15.844 08:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:15.844 08:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:15.844 08:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:15.844 08:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:15.844 08:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:15.844 08:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:15.844 08:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:15.844 08:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:15.844 08:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:16.133 08:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:16.133 08:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:16.133 08:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:16.133 08:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:16.133 "name": 
"BaseBdev2", 00:15:16.133 "aliases": [ 00:15:16.133 "de165996-b9e9-4cbc-85fc-67983bf76c14" 00:15:16.133 ], 00:15:16.133 "product_name": "Malloc disk", 00:15:16.133 "block_size": 512, 00:15:16.133 "num_blocks": 65536, 00:15:16.133 "uuid": "de165996-b9e9-4cbc-85fc-67983bf76c14", 00:15:16.133 "assigned_rate_limits": { 00:15:16.133 "rw_ios_per_sec": 0, 00:15:16.133 "rw_mbytes_per_sec": 0, 00:15:16.133 "r_mbytes_per_sec": 0, 00:15:16.133 "w_mbytes_per_sec": 0 00:15:16.133 }, 00:15:16.133 "claimed": true, 00:15:16.133 "claim_type": "exclusive_write", 00:15:16.133 "zoned": false, 00:15:16.133 "supported_io_types": { 00:15:16.133 "read": true, 00:15:16.133 "write": true, 00:15:16.133 "unmap": true, 00:15:16.133 "flush": true, 00:15:16.133 "reset": true, 00:15:16.133 "nvme_admin": false, 00:15:16.133 "nvme_io": false, 00:15:16.133 "nvme_io_md": false, 00:15:16.133 "write_zeroes": true, 00:15:16.133 "zcopy": true, 00:15:16.133 "get_zone_info": false, 00:15:16.133 "zone_management": false, 00:15:16.133 "zone_append": false, 00:15:16.133 "compare": false, 00:15:16.133 "compare_and_write": false, 00:15:16.133 "abort": true, 00:15:16.133 "seek_hole": false, 00:15:16.133 "seek_data": false, 00:15:16.133 "copy": true, 00:15:16.133 "nvme_iov_md": false 00:15:16.133 }, 00:15:16.133 "memory_domains": [ 00:15:16.133 { 00:15:16.133 "dma_device_id": "system", 00:15:16.133 "dma_device_type": 1 00:15:16.133 }, 00:15:16.133 { 00:15:16.133 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:16.134 "dma_device_type": 2 00:15:16.134 } 00:15:16.134 ], 00:15:16.134 "driver_specific": {} 00:15:16.134 }' 00:15:16.134 08:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:16.134 08:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:16.134 08:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:16.134 08:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:15:16.392 08:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:16.392 08:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:16.392 08:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:16.392 08:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:16.392 08:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:16.392 08:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:16.392 08:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:16.393 08:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:16.393 08:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:16.393 08:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:16.393 08:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:16.652 08:28:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:16.652 "name": "BaseBdev3", 00:15:16.652 "aliases": [ 00:15:16.652 "ff659a99-4a83-4d45-af44-c292466dcbae" 00:15:16.652 ], 00:15:16.652 "product_name": "Malloc disk", 00:15:16.652 "block_size": 512, 00:15:16.652 "num_blocks": 65536, 00:15:16.652 "uuid": "ff659a99-4a83-4d45-af44-c292466dcbae", 00:15:16.652 "assigned_rate_limits": { 00:15:16.652 "rw_ios_per_sec": 0, 00:15:16.652 "rw_mbytes_per_sec": 0, 00:15:16.652 "r_mbytes_per_sec": 0, 00:15:16.652 "w_mbytes_per_sec": 0 00:15:16.652 }, 00:15:16.652 "claimed": true, 00:15:16.652 "claim_type": "exclusive_write", 00:15:16.652 "zoned": false, 00:15:16.652 "supported_io_types": { 
00:15:16.652 "read": true, 00:15:16.652 "write": true, 00:15:16.652 "unmap": true, 00:15:16.652 "flush": true, 00:15:16.652 "reset": true, 00:15:16.652 "nvme_admin": false, 00:15:16.652 "nvme_io": false, 00:15:16.652 "nvme_io_md": false, 00:15:16.652 "write_zeroes": true, 00:15:16.652 "zcopy": true, 00:15:16.652 "get_zone_info": false, 00:15:16.652 "zone_management": false, 00:15:16.652 "zone_append": false, 00:15:16.652 "compare": false, 00:15:16.652 "compare_and_write": false, 00:15:16.652 "abort": true, 00:15:16.652 "seek_hole": false, 00:15:16.652 "seek_data": false, 00:15:16.652 "copy": true, 00:15:16.652 "nvme_iov_md": false 00:15:16.652 }, 00:15:16.652 "memory_domains": [ 00:15:16.652 { 00:15:16.652 "dma_device_id": "system", 00:15:16.652 "dma_device_type": 1 00:15:16.652 }, 00:15:16.652 { 00:15:16.652 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:16.652 "dma_device_type": 2 00:15:16.652 } 00:15:16.652 ], 00:15:16.652 "driver_specific": {} 00:15:16.652 }' 00:15:16.652 08:28:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:16.652 08:28:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:16.652 08:28:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:16.652 08:28:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:16.652 08:28:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:16.911 08:28:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:16.911 08:28:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:16.911 08:28:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:16.911 08:28:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:16.911 08:28:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:15:16.911 08:28:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:16.911 08:28:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:16.911 08:28:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:17.170 [2024-07-23 08:28:29.486037] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:17.170 [2024-07-23 08:28:29.486067] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:17.170 [2024-07-23 08:28:29.486145] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:17.170 [2024-07-23 08:28:29.486200] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:17.170 [2024-07-23 08:28:29.486217] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036980 name Existed_Raid, state offline 00:15:17.170 08:28:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1439059 00:15:17.170 08:28:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1439059 ']' 00:15:17.170 08:28:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1439059 00:15:17.170 08:28:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:15:17.170 08:28:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:17.170 08:28:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1439059 00:15:17.170 08:28:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:17.170 08:28:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:15:17.170 08:28:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1439059' 00:15:17.170 killing process with pid 1439059 00:15:17.170 08:28:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1439059 00:15:17.170 [2024-07-23 08:28:29.545317] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:17.170 08:28:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1439059 00:15:17.429 [2024-07-23 08:28:29.785131] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:18.806 08:28:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:15:18.806 00:15:18.806 real 0m23.097s 00:15:18.806 user 0m41.277s 00:15:18.806 sys 0m3.424s 00:15:18.806 08:28:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:18.806 08:28:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:18.806 ************************************ 00:15:18.806 END TEST raid_state_function_test 00:15:18.806 ************************************ 00:15:18.806 08:28:31 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:18.806 08:28:31 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 3 true 00:15:18.806 08:28:31 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:18.806 08:28:31 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:18.806 08:28:31 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:18.806 ************************************ 00:15:18.806 START TEST raid_state_function_test_sb 00:15:18.806 ************************************ 00:15:18.806 08:28:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 true 00:15:18.806 08:28:31 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:15:18.806 08:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:15:18.806 08:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:15:18.806 08:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:18.806 08:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:18.806 08:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:18.806 08:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:18.806 08:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:18.806 08:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:18.806 08:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:18.806 08:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:18.806 08:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:18.806 08:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:18.806 08:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:18.806 08:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:18.806 08:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:18.806 08:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:18.806 08:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:18.806 08:28:31 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@226 -- # local strip_size 00:15:18.806 08:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:18.806 08:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:18.806 08:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:15:18.806 08:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:15:18.806 08:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:15:18.806 08:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:15:18.806 08:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:15:18.806 08:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1443949 00:15:18.807 08:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1443949' 00:15:18.807 Process raid pid: 1443949 00:15:18.807 08:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:18.807 08:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1443949 /var/tmp/spdk-raid.sock 00:15:18.807 08:28:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1443949 ']' 00:15:18.807 08:28:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:18.807 08:28:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:18.807 08:28:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain 
socket /var/tmp/spdk-raid.sock...' 00:15:18.807 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:18.807 08:28:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:18.807 08:28:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:18.807 [2024-07-23 08:28:31.206488] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:15:18.807 [2024-07-23 08:28:31.206589] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:19.065 [2024-07-23 08:28:31.332357] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:19.065 [2024-07-23 08:28:31.545198] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:19.324 [2024-07-23 08:28:31.795560] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:19.324 [2024-07-23 08:28:31.795601] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:19.583 08:28:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:19.583 08:28:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:15:19.583 08:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:19.843 [2024-07-23 08:28:32.127832] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:19.843 [2024-07-23 08:28:32.127875] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:19.843 [2024-07-23 08:28:32.127885] 
bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:19.843 [2024-07-23 08:28:32.127896] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:19.843 [2024-07-23 08:28:32.127918] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:19.843 [2024-07-23 08:28:32.127927] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:19.843 08:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:19.843 08:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:19.843 08:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:19.843 08:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:19.843 08:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:19.843 08:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:19.843 08:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:19.843 08:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:19.843 08:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:19.843 08:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:19.843 08:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:19.843 08:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:15:19.843 08:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:19.843 "name": "Existed_Raid", 00:15:19.843 "uuid": "3ea6910a-ff0c-4932-adb8-7f2590544aaf", 00:15:19.843 "strip_size_kb": 64, 00:15:19.843 "state": "configuring", 00:15:19.843 "raid_level": "concat", 00:15:19.843 "superblock": true, 00:15:19.843 "num_base_bdevs": 3, 00:15:19.843 "num_base_bdevs_discovered": 0, 00:15:19.843 "num_base_bdevs_operational": 3, 00:15:19.843 "base_bdevs_list": [ 00:15:19.843 { 00:15:19.843 "name": "BaseBdev1", 00:15:19.843 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:19.843 "is_configured": false, 00:15:19.843 "data_offset": 0, 00:15:19.843 "data_size": 0 00:15:19.843 }, 00:15:19.843 { 00:15:19.843 "name": "BaseBdev2", 00:15:19.843 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:19.843 "is_configured": false, 00:15:19.843 "data_offset": 0, 00:15:19.843 "data_size": 0 00:15:19.843 }, 00:15:19.843 { 00:15:19.843 "name": "BaseBdev3", 00:15:19.843 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:19.843 "is_configured": false, 00:15:19.843 "data_offset": 0, 00:15:19.843 "data_size": 0 00:15:19.843 } 00:15:19.843 ] 00:15:19.843 }' 00:15:19.843 08:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:19.843 08:28:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:20.411 08:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:20.670 [2024-07-23 08:28:32.949862] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:20.670 [2024-07-23 08:28:32.949895] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034580 name Existed_Raid, state configuring 00:15:20.670 08:28:32 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:20.670 [2024-07-23 08:28:33.118340] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:20.670 [2024-07-23 08:28:33.118377] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:20.670 [2024-07-23 08:28:33.118387] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:20.670 [2024-07-23 08:28:33.118398] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:20.670 [2024-07-23 08:28:33.118404] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:20.670 [2024-07-23 08:28:33.118416] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:20.670 08:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:20.927 [2024-07-23 08:28:33.315896] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:20.927 BaseBdev1 00:15:20.927 08:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:20.927 08:28:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:20.927 08:28:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:20.927 08:28:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:20.927 08:28:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:20.927 08:28:33 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:20.927 08:28:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:21.185 08:28:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:21.185 [ 00:15:21.185 { 00:15:21.185 "name": "BaseBdev1", 00:15:21.185 "aliases": [ 00:15:21.185 "5a685f81-f9bd-4217-ba5e-b830c03fb581" 00:15:21.185 ], 00:15:21.185 "product_name": "Malloc disk", 00:15:21.185 "block_size": 512, 00:15:21.185 "num_blocks": 65536, 00:15:21.185 "uuid": "5a685f81-f9bd-4217-ba5e-b830c03fb581", 00:15:21.185 "assigned_rate_limits": { 00:15:21.185 "rw_ios_per_sec": 0, 00:15:21.185 "rw_mbytes_per_sec": 0, 00:15:21.185 "r_mbytes_per_sec": 0, 00:15:21.185 "w_mbytes_per_sec": 0 00:15:21.185 }, 00:15:21.185 "claimed": true, 00:15:21.185 "claim_type": "exclusive_write", 00:15:21.185 "zoned": false, 00:15:21.185 "supported_io_types": { 00:15:21.185 "read": true, 00:15:21.185 "write": true, 00:15:21.185 "unmap": true, 00:15:21.185 "flush": true, 00:15:21.185 "reset": true, 00:15:21.185 "nvme_admin": false, 00:15:21.185 "nvme_io": false, 00:15:21.185 "nvme_io_md": false, 00:15:21.185 "write_zeroes": true, 00:15:21.185 "zcopy": true, 00:15:21.185 "get_zone_info": false, 00:15:21.185 "zone_management": false, 00:15:21.185 "zone_append": false, 00:15:21.185 "compare": false, 00:15:21.185 "compare_and_write": false, 00:15:21.185 "abort": true, 00:15:21.185 "seek_hole": false, 00:15:21.185 "seek_data": false, 00:15:21.185 "copy": true, 00:15:21.185 "nvme_iov_md": false 00:15:21.185 }, 00:15:21.185 "memory_domains": [ 00:15:21.186 { 00:15:21.186 "dma_device_id": "system", 00:15:21.186 "dma_device_type": 1 00:15:21.186 }, 00:15:21.186 { 00:15:21.186 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:15:21.186 "dma_device_type": 2 00:15:21.186 } 00:15:21.186 ], 00:15:21.186 "driver_specific": {} 00:15:21.186 } 00:15:21.186 ] 00:15:21.186 08:28:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:21.186 08:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:21.186 08:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:21.186 08:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:21.186 08:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:21.186 08:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:21.186 08:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:21.186 08:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:21.186 08:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:21.186 08:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:21.186 08:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:21.186 08:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:21.186 08:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:21.445 08:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:21.445 "name": "Existed_Raid", 00:15:21.445 "uuid": "eaf396eb-a786-4e80-8986-df4ab3bb8197", 
00:15:21.445 "strip_size_kb": 64, 00:15:21.445 "state": "configuring", 00:15:21.445 "raid_level": "concat", 00:15:21.445 "superblock": true, 00:15:21.445 "num_base_bdevs": 3, 00:15:21.445 "num_base_bdevs_discovered": 1, 00:15:21.445 "num_base_bdevs_operational": 3, 00:15:21.445 "base_bdevs_list": [ 00:15:21.445 { 00:15:21.445 "name": "BaseBdev1", 00:15:21.445 "uuid": "5a685f81-f9bd-4217-ba5e-b830c03fb581", 00:15:21.445 "is_configured": true, 00:15:21.445 "data_offset": 2048, 00:15:21.445 "data_size": 63488 00:15:21.445 }, 00:15:21.445 { 00:15:21.445 "name": "BaseBdev2", 00:15:21.445 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:21.445 "is_configured": false, 00:15:21.445 "data_offset": 0, 00:15:21.445 "data_size": 0 00:15:21.445 }, 00:15:21.445 { 00:15:21.445 "name": "BaseBdev3", 00:15:21.445 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:21.445 "is_configured": false, 00:15:21.445 "data_offset": 0, 00:15:21.445 "data_size": 0 00:15:21.445 } 00:15:21.445 ] 00:15:21.445 }' 00:15:21.445 08:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:21.445 08:28:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:22.012 08:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:22.012 [2024-07-23 08:28:34.462964] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:22.012 [2024-07-23 08:28:34.463012] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034880 name Existed_Raid, state configuring 00:15:22.012 08:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:22.271 [2024-07-23 
08:28:34.619410] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:22.271 [2024-07-23 08:28:34.621017] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:22.271 [2024-07-23 08:28:34.621051] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:22.271 [2024-07-23 08:28:34.621060] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:22.271 [2024-07-23 08:28:34.621069] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:22.271 08:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:22.271 08:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:22.271 08:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:22.271 08:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:22.271 08:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:22.271 08:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:22.271 08:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:22.271 08:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:22.271 08:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:22.271 08:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:22.271 08:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:22.271 08:28:34 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:15:22.271 08:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:22.271 08:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:22.528 08:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:22.528 "name": "Existed_Raid", 00:15:22.528 "uuid": "8f60ad26-3b1e-449d-8e28-873f353ab75f", 00:15:22.528 "strip_size_kb": 64, 00:15:22.528 "state": "configuring", 00:15:22.528 "raid_level": "concat", 00:15:22.528 "superblock": true, 00:15:22.528 "num_base_bdevs": 3, 00:15:22.528 "num_base_bdevs_discovered": 1, 00:15:22.528 "num_base_bdevs_operational": 3, 00:15:22.528 "base_bdevs_list": [ 00:15:22.528 { 00:15:22.528 "name": "BaseBdev1", 00:15:22.528 "uuid": "5a685f81-f9bd-4217-ba5e-b830c03fb581", 00:15:22.528 "is_configured": true, 00:15:22.528 "data_offset": 2048, 00:15:22.528 "data_size": 63488 00:15:22.528 }, 00:15:22.528 { 00:15:22.528 "name": "BaseBdev2", 00:15:22.528 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:22.528 "is_configured": false, 00:15:22.528 "data_offset": 0, 00:15:22.528 "data_size": 0 00:15:22.528 }, 00:15:22.528 { 00:15:22.528 "name": "BaseBdev3", 00:15:22.528 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:22.529 "is_configured": false, 00:15:22.529 "data_offset": 0, 00:15:22.529 "data_size": 0 00:15:22.529 } 00:15:22.529 ] 00:15:22.529 }' 00:15:22.529 08:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:22.529 08:28:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:22.786 08:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 
-b BaseBdev2 00:15:23.044 [2024-07-23 08:28:35.467760] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:23.044 BaseBdev2 00:15:23.044 08:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:23.044 08:28:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:23.044 08:28:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:23.044 08:28:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:23.044 08:28:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:23.044 08:28:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:23.044 08:28:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:23.303 08:28:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:23.303 [ 00:15:23.303 { 00:15:23.303 "name": "BaseBdev2", 00:15:23.303 "aliases": [ 00:15:23.303 "fbd8849f-9090-4bae-9514-42c51b954d4a" 00:15:23.303 ], 00:15:23.303 "product_name": "Malloc disk", 00:15:23.303 "block_size": 512, 00:15:23.303 "num_blocks": 65536, 00:15:23.303 "uuid": "fbd8849f-9090-4bae-9514-42c51b954d4a", 00:15:23.303 "assigned_rate_limits": { 00:15:23.303 "rw_ios_per_sec": 0, 00:15:23.303 "rw_mbytes_per_sec": 0, 00:15:23.303 "r_mbytes_per_sec": 0, 00:15:23.303 "w_mbytes_per_sec": 0 00:15:23.303 }, 00:15:23.303 "claimed": true, 00:15:23.303 "claim_type": "exclusive_write", 00:15:23.303 "zoned": false, 00:15:23.303 "supported_io_types": { 00:15:23.303 "read": true, 00:15:23.303 "write": 
true, 00:15:23.303 "unmap": true, 00:15:23.303 "flush": true, 00:15:23.303 "reset": true, 00:15:23.303 "nvme_admin": false, 00:15:23.303 "nvme_io": false, 00:15:23.303 "nvme_io_md": false, 00:15:23.303 "write_zeroes": true, 00:15:23.303 "zcopy": true, 00:15:23.303 "get_zone_info": false, 00:15:23.303 "zone_management": false, 00:15:23.303 "zone_append": false, 00:15:23.303 "compare": false, 00:15:23.303 "compare_and_write": false, 00:15:23.303 "abort": true, 00:15:23.303 "seek_hole": false, 00:15:23.303 "seek_data": false, 00:15:23.303 "copy": true, 00:15:23.303 "nvme_iov_md": false 00:15:23.303 }, 00:15:23.303 "memory_domains": [ 00:15:23.303 { 00:15:23.303 "dma_device_id": "system", 00:15:23.303 "dma_device_type": 1 00:15:23.303 }, 00:15:23.303 { 00:15:23.303 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:23.303 "dma_device_type": 2 00:15:23.303 } 00:15:23.303 ], 00:15:23.303 "driver_specific": {} 00:15:23.303 } 00:15:23.303 ] 00:15:23.303 08:28:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:23.303 08:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:23.304 08:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:23.304 08:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:23.304 08:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:23.304 08:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:23.304 08:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:23.304 08:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:23.304 08:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:15:23.304 08:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:23.304 08:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:23.304 08:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:23.304 08:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:23.304 08:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:23.304 08:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:23.562 08:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:23.562 "name": "Existed_Raid", 00:15:23.562 "uuid": "8f60ad26-3b1e-449d-8e28-873f353ab75f", 00:15:23.562 "strip_size_kb": 64, 00:15:23.562 "state": "configuring", 00:15:23.562 "raid_level": "concat", 00:15:23.562 "superblock": true, 00:15:23.562 "num_base_bdevs": 3, 00:15:23.562 "num_base_bdevs_discovered": 2, 00:15:23.562 "num_base_bdevs_operational": 3, 00:15:23.562 "base_bdevs_list": [ 00:15:23.562 { 00:15:23.562 "name": "BaseBdev1", 00:15:23.562 "uuid": "5a685f81-f9bd-4217-ba5e-b830c03fb581", 00:15:23.562 "is_configured": true, 00:15:23.562 "data_offset": 2048, 00:15:23.562 "data_size": 63488 00:15:23.562 }, 00:15:23.562 { 00:15:23.562 "name": "BaseBdev2", 00:15:23.562 "uuid": "fbd8849f-9090-4bae-9514-42c51b954d4a", 00:15:23.563 "is_configured": true, 00:15:23.563 "data_offset": 2048, 00:15:23.563 "data_size": 63488 00:15:23.563 }, 00:15:23.563 { 00:15:23.563 "name": "BaseBdev3", 00:15:23.563 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:23.563 "is_configured": false, 00:15:23.563 "data_offset": 0, 00:15:23.563 "data_size": 0 00:15:23.563 } 
00:15:23.563 ] 00:15:23.563 }' 00:15:23.563 08:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:23.563 08:28:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:24.129 08:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:24.129 [2024-07-23 08:28:36.639306] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:24.129 [2024-07-23 08:28:36.639518] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000035180 00:15:24.129 [2024-07-23 08:28:36.639536] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:24.129 [2024-07-23 08:28:36.639791] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bec0 00:15:24.129 [2024-07-23 08:28:36.639982] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000035180 00:15:24.129 [2024-07-23 08:28:36.639993] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000035180 00:15:24.129 [2024-07-23 08:28:36.640155] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:24.129 BaseBdev3 00:15:24.387 08:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:15:24.387 08:28:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:24.387 08:28:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:24.387 08:28:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:24.387 08:28:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:24.387 08:28:36 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:24.387 08:28:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:24.387 08:28:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:24.645 [ 00:15:24.645 { 00:15:24.645 "name": "BaseBdev3", 00:15:24.645 "aliases": [ 00:15:24.645 "9bf3a65f-b0a9-4ce3-bf09-78e6f1d7be07" 00:15:24.645 ], 00:15:24.645 "product_name": "Malloc disk", 00:15:24.645 "block_size": 512, 00:15:24.645 "num_blocks": 65536, 00:15:24.645 "uuid": "9bf3a65f-b0a9-4ce3-bf09-78e6f1d7be07", 00:15:24.645 "assigned_rate_limits": { 00:15:24.645 "rw_ios_per_sec": 0, 00:15:24.645 "rw_mbytes_per_sec": 0, 00:15:24.645 "r_mbytes_per_sec": 0, 00:15:24.645 "w_mbytes_per_sec": 0 00:15:24.645 }, 00:15:24.645 "claimed": true, 00:15:24.645 "claim_type": "exclusive_write", 00:15:24.645 "zoned": false, 00:15:24.645 "supported_io_types": { 00:15:24.645 "read": true, 00:15:24.645 "write": true, 00:15:24.645 "unmap": true, 00:15:24.645 "flush": true, 00:15:24.645 "reset": true, 00:15:24.645 "nvme_admin": false, 00:15:24.645 "nvme_io": false, 00:15:24.645 "nvme_io_md": false, 00:15:24.645 "write_zeroes": true, 00:15:24.645 "zcopy": true, 00:15:24.645 "get_zone_info": false, 00:15:24.645 "zone_management": false, 00:15:24.645 "zone_append": false, 00:15:24.645 "compare": false, 00:15:24.645 "compare_and_write": false, 00:15:24.645 "abort": true, 00:15:24.645 "seek_hole": false, 00:15:24.645 "seek_data": false, 00:15:24.645 "copy": true, 00:15:24.645 "nvme_iov_md": false 00:15:24.645 }, 00:15:24.645 "memory_domains": [ 00:15:24.645 { 00:15:24.645 "dma_device_id": "system", 00:15:24.645 "dma_device_type": 1 00:15:24.645 }, 
00:15:24.645 { 00:15:24.645 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:24.645 "dma_device_type": 2 00:15:24.645 } 00:15:24.645 ], 00:15:24.645 "driver_specific": {} 00:15:24.645 } 00:15:24.645 ] 00:15:24.645 08:28:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:24.645 08:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:24.646 08:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:24.646 08:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:15:24.646 08:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:24.646 08:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:24.646 08:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:24.646 08:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:24.646 08:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:24.646 08:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:24.646 08:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:24.646 08:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:24.646 08:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:24.646 08:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:24.646 08:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq 
-r '.[] | select(.name == "Existed_Raid")' 00:15:24.904 08:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:24.904 "name": "Existed_Raid", 00:15:24.905 "uuid": "8f60ad26-3b1e-449d-8e28-873f353ab75f", 00:15:24.905 "strip_size_kb": 64, 00:15:24.905 "state": "online", 00:15:24.905 "raid_level": "concat", 00:15:24.905 "superblock": true, 00:15:24.905 "num_base_bdevs": 3, 00:15:24.905 "num_base_bdevs_discovered": 3, 00:15:24.905 "num_base_bdevs_operational": 3, 00:15:24.905 "base_bdevs_list": [ 00:15:24.905 { 00:15:24.905 "name": "BaseBdev1", 00:15:24.905 "uuid": "5a685f81-f9bd-4217-ba5e-b830c03fb581", 00:15:24.905 "is_configured": true, 00:15:24.905 "data_offset": 2048, 00:15:24.905 "data_size": 63488 00:15:24.905 }, 00:15:24.905 { 00:15:24.905 "name": "BaseBdev2", 00:15:24.905 "uuid": "fbd8849f-9090-4bae-9514-42c51b954d4a", 00:15:24.905 "is_configured": true, 00:15:24.905 "data_offset": 2048, 00:15:24.905 "data_size": 63488 00:15:24.905 }, 00:15:24.905 { 00:15:24.905 "name": "BaseBdev3", 00:15:24.905 "uuid": "9bf3a65f-b0a9-4ce3-bf09-78e6f1d7be07", 00:15:24.905 "is_configured": true, 00:15:24.905 "data_offset": 2048, 00:15:24.905 "data_size": 63488 00:15:24.905 } 00:15:24.905 ] 00:15:24.905 }' 00:15:24.905 08:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:24.905 08:28:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:25.164 08:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:15:25.164 08:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:25.164 08:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:25.164 08:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:25.164 08:28:37 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:25.164 08:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:25.164 08:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:25.164 08:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:25.423 [2024-07-23 08:28:37.806828] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:25.423 08:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:25.423 "name": "Existed_Raid", 00:15:25.423 "aliases": [ 00:15:25.423 "8f60ad26-3b1e-449d-8e28-873f353ab75f" 00:15:25.423 ], 00:15:25.423 "product_name": "Raid Volume", 00:15:25.423 "block_size": 512, 00:15:25.423 "num_blocks": 190464, 00:15:25.423 "uuid": "8f60ad26-3b1e-449d-8e28-873f353ab75f", 00:15:25.423 "assigned_rate_limits": { 00:15:25.423 "rw_ios_per_sec": 0, 00:15:25.423 "rw_mbytes_per_sec": 0, 00:15:25.423 "r_mbytes_per_sec": 0, 00:15:25.423 "w_mbytes_per_sec": 0 00:15:25.423 }, 00:15:25.423 "claimed": false, 00:15:25.423 "zoned": false, 00:15:25.423 "supported_io_types": { 00:15:25.423 "read": true, 00:15:25.423 "write": true, 00:15:25.423 "unmap": true, 00:15:25.423 "flush": true, 00:15:25.423 "reset": true, 00:15:25.423 "nvme_admin": false, 00:15:25.423 "nvme_io": false, 00:15:25.423 "nvme_io_md": false, 00:15:25.423 "write_zeroes": true, 00:15:25.423 "zcopy": false, 00:15:25.423 "get_zone_info": false, 00:15:25.423 "zone_management": false, 00:15:25.423 "zone_append": false, 00:15:25.423 "compare": false, 00:15:25.423 "compare_and_write": false, 00:15:25.423 "abort": false, 00:15:25.423 "seek_hole": false, 00:15:25.423 "seek_data": false, 00:15:25.423 "copy": false, 00:15:25.423 "nvme_iov_md": false 00:15:25.423 }, 00:15:25.423 
"memory_domains": [ 00:15:25.423 { 00:15:25.423 "dma_device_id": "system", 00:15:25.423 "dma_device_type": 1 00:15:25.423 }, 00:15:25.423 { 00:15:25.423 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:25.423 "dma_device_type": 2 00:15:25.423 }, 00:15:25.423 { 00:15:25.423 "dma_device_id": "system", 00:15:25.423 "dma_device_type": 1 00:15:25.423 }, 00:15:25.423 { 00:15:25.423 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:25.423 "dma_device_type": 2 00:15:25.423 }, 00:15:25.423 { 00:15:25.423 "dma_device_id": "system", 00:15:25.423 "dma_device_type": 1 00:15:25.423 }, 00:15:25.423 { 00:15:25.423 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:25.423 "dma_device_type": 2 00:15:25.423 } 00:15:25.423 ], 00:15:25.423 "driver_specific": { 00:15:25.423 "raid": { 00:15:25.423 "uuid": "8f60ad26-3b1e-449d-8e28-873f353ab75f", 00:15:25.423 "strip_size_kb": 64, 00:15:25.423 "state": "online", 00:15:25.423 "raid_level": "concat", 00:15:25.423 "superblock": true, 00:15:25.423 "num_base_bdevs": 3, 00:15:25.423 "num_base_bdevs_discovered": 3, 00:15:25.423 "num_base_bdevs_operational": 3, 00:15:25.423 "base_bdevs_list": [ 00:15:25.423 { 00:15:25.423 "name": "BaseBdev1", 00:15:25.423 "uuid": "5a685f81-f9bd-4217-ba5e-b830c03fb581", 00:15:25.423 "is_configured": true, 00:15:25.423 "data_offset": 2048, 00:15:25.423 "data_size": 63488 00:15:25.423 }, 00:15:25.423 { 00:15:25.423 "name": "BaseBdev2", 00:15:25.423 "uuid": "fbd8849f-9090-4bae-9514-42c51b954d4a", 00:15:25.423 "is_configured": true, 00:15:25.423 "data_offset": 2048, 00:15:25.423 "data_size": 63488 00:15:25.423 }, 00:15:25.423 { 00:15:25.423 "name": "BaseBdev3", 00:15:25.423 "uuid": "9bf3a65f-b0a9-4ce3-bf09-78e6f1d7be07", 00:15:25.423 "is_configured": true, 00:15:25.423 "data_offset": 2048, 00:15:25.423 "data_size": 63488 00:15:25.423 } 00:15:25.423 ] 00:15:25.423 } 00:15:25.423 } 00:15:25.423 }' 00:15:25.423 08:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:25.423 08:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:25.423 BaseBdev2 00:15:25.423 BaseBdev3' 00:15:25.423 08:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:25.423 08:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:25.423 08:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:25.682 08:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:25.682 "name": "BaseBdev1", 00:15:25.682 "aliases": [ 00:15:25.682 "5a685f81-f9bd-4217-ba5e-b830c03fb581" 00:15:25.682 ], 00:15:25.682 "product_name": "Malloc disk", 00:15:25.682 "block_size": 512, 00:15:25.682 "num_blocks": 65536, 00:15:25.682 "uuid": "5a685f81-f9bd-4217-ba5e-b830c03fb581", 00:15:25.682 "assigned_rate_limits": { 00:15:25.682 "rw_ios_per_sec": 0, 00:15:25.682 "rw_mbytes_per_sec": 0, 00:15:25.682 "r_mbytes_per_sec": 0, 00:15:25.682 "w_mbytes_per_sec": 0 00:15:25.682 }, 00:15:25.682 "claimed": true, 00:15:25.682 "claim_type": "exclusive_write", 00:15:25.682 "zoned": false, 00:15:25.682 "supported_io_types": { 00:15:25.682 "read": true, 00:15:25.682 "write": true, 00:15:25.682 "unmap": true, 00:15:25.682 "flush": true, 00:15:25.682 "reset": true, 00:15:25.682 "nvme_admin": false, 00:15:25.682 "nvme_io": false, 00:15:25.682 "nvme_io_md": false, 00:15:25.682 "write_zeroes": true, 00:15:25.682 "zcopy": true, 00:15:25.682 "get_zone_info": false, 00:15:25.682 "zone_management": false, 00:15:25.682 "zone_append": false, 00:15:25.682 "compare": false, 00:15:25.682 "compare_and_write": false, 00:15:25.682 "abort": true, 00:15:25.682 "seek_hole": false, 00:15:25.682 "seek_data": false, 
00:15:25.682 "copy": true, 00:15:25.682 "nvme_iov_md": false 00:15:25.682 }, 00:15:25.682 "memory_domains": [ 00:15:25.682 { 00:15:25.682 "dma_device_id": "system", 00:15:25.682 "dma_device_type": 1 00:15:25.682 }, 00:15:25.682 { 00:15:25.682 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:25.682 "dma_device_type": 2 00:15:25.682 } 00:15:25.682 ], 00:15:25.682 "driver_specific": {} 00:15:25.682 }' 00:15:25.682 08:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:25.682 08:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:25.682 08:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:25.682 08:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:25.682 08:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:25.682 08:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:25.682 08:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:25.940 08:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:25.940 08:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:25.940 08:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:25.940 08:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:25.940 08:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:25.940 08:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:25.940 08:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 
00:15:25.940 08:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:26.198 08:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:26.198 "name": "BaseBdev2", 00:15:26.198 "aliases": [ 00:15:26.198 "fbd8849f-9090-4bae-9514-42c51b954d4a" 00:15:26.198 ], 00:15:26.198 "product_name": "Malloc disk", 00:15:26.198 "block_size": 512, 00:15:26.198 "num_blocks": 65536, 00:15:26.198 "uuid": "fbd8849f-9090-4bae-9514-42c51b954d4a", 00:15:26.198 "assigned_rate_limits": { 00:15:26.198 "rw_ios_per_sec": 0, 00:15:26.198 "rw_mbytes_per_sec": 0, 00:15:26.198 "r_mbytes_per_sec": 0, 00:15:26.198 "w_mbytes_per_sec": 0 00:15:26.198 }, 00:15:26.198 "claimed": true, 00:15:26.198 "claim_type": "exclusive_write", 00:15:26.198 "zoned": false, 00:15:26.198 "supported_io_types": { 00:15:26.198 "read": true, 00:15:26.198 "write": true, 00:15:26.198 "unmap": true, 00:15:26.198 "flush": true, 00:15:26.198 "reset": true, 00:15:26.198 "nvme_admin": false, 00:15:26.198 "nvme_io": false, 00:15:26.198 "nvme_io_md": false, 00:15:26.198 "write_zeroes": true, 00:15:26.198 "zcopy": true, 00:15:26.198 "get_zone_info": false, 00:15:26.198 "zone_management": false, 00:15:26.198 "zone_append": false, 00:15:26.198 "compare": false, 00:15:26.198 "compare_and_write": false, 00:15:26.198 "abort": true, 00:15:26.198 "seek_hole": false, 00:15:26.198 "seek_data": false, 00:15:26.198 "copy": true, 00:15:26.198 "nvme_iov_md": false 00:15:26.198 }, 00:15:26.198 "memory_domains": [ 00:15:26.198 { 00:15:26.198 "dma_device_id": "system", 00:15:26.198 "dma_device_type": 1 00:15:26.198 }, 00:15:26.198 { 00:15:26.198 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:26.198 "dma_device_type": 2 00:15:26.198 } 00:15:26.198 ], 00:15:26.198 "driver_specific": {} 00:15:26.198 }' 00:15:26.198 08:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:26.198 08:28:38 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:26.198 08:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:26.198 08:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:26.198 08:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:26.198 08:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:26.198 08:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:26.198 08:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:26.456 08:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:26.456 08:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:26.456 08:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:26.456 08:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:26.456 08:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:26.456 08:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:26.456 08:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:26.456 08:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:26.456 "name": "BaseBdev3", 00:15:26.456 "aliases": [ 00:15:26.456 "9bf3a65f-b0a9-4ce3-bf09-78e6f1d7be07" 00:15:26.456 ], 00:15:26.456 "product_name": "Malloc disk", 00:15:26.456 "block_size": 512, 00:15:26.456 "num_blocks": 65536, 00:15:26.456 "uuid": "9bf3a65f-b0a9-4ce3-bf09-78e6f1d7be07", 00:15:26.456 "assigned_rate_limits": { 00:15:26.456 
"rw_ios_per_sec": 0, 00:15:26.456 "rw_mbytes_per_sec": 0, 00:15:26.456 "r_mbytes_per_sec": 0, 00:15:26.456 "w_mbytes_per_sec": 0 00:15:26.456 }, 00:15:26.456 "claimed": true, 00:15:26.456 "claim_type": "exclusive_write", 00:15:26.456 "zoned": false, 00:15:26.456 "supported_io_types": { 00:15:26.456 "read": true, 00:15:26.456 "write": true, 00:15:26.456 "unmap": true, 00:15:26.456 "flush": true, 00:15:26.456 "reset": true, 00:15:26.456 "nvme_admin": false, 00:15:26.456 "nvme_io": false, 00:15:26.456 "nvme_io_md": false, 00:15:26.456 "write_zeroes": true, 00:15:26.456 "zcopy": true, 00:15:26.456 "get_zone_info": false, 00:15:26.457 "zone_management": false, 00:15:26.457 "zone_append": false, 00:15:26.457 "compare": false, 00:15:26.457 "compare_and_write": false, 00:15:26.457 "abort": true, 00:15:26.457 "seek_hole": false, 00:15:26.457 "seek_data": false, 00:15:26.457 "copy": true, 00:15:26.457 "nvme_iov_md": false 00:15:26.457 }, 00:15:26.457 "memory_domains": [ 00:15:26.457 { 00:15:26.457 "dma_device_id": "system", 00:15:26.457 "dma_device_type": 1 00:15:26.457 }, 00:15:26.457 { 00:15:26.457 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:26.457 "dma_device_type": 2 00:15:26.457 } 00:15:26.457 ], 00:15:26.457 "driver_specific": {} 00:15:26.457 }' 00:15:26.457 08:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:26.716 08:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:26.716 08:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:26.716 08:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:26.716 08:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:26.716 08:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:26.716 08:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:15:26.716 08:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:26.716 08:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:26.716 08:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:26.975 08:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:26.975 08:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:26.975 08:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:26.975 [2024-07-23 08:28:39.451016] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:26.975 [2024-07-23 08:28:39.451043] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:26.975 [2024-07-23 08:28:39.451096] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:27.234 08:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:27.234 08:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:15:27.234 08:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:27.234 08:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:15:27.234 08:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:15:27.234 08:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:15:27.234 08:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:27.234 08:28:39 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:15:27.234 08:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:27.234 08:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:27.234 08:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:27.234 08:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:27.234 08:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:27.234 08:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:27.234 08:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:27.234 08:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:27.234 08:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:27.234 08:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:27.234 "name": "Existed_Raid", 00:15:27.234 "uuid": "8f60ad26-3b1e-449d-8e28-873f353ab75f", 00:15:27.234 "strip_size_kb": 64, 00:15:27.234 "state": "offline", 00:15:27.234 "raid_level": "concat", 00:15:27.234 "superblock": true, 00:15:27.234 "num_base_bdevs": 3, 00:15:27.234 "num_base_bdevs_discovered": 2, 00:15:27.234 "num_base_bdevs_operational": 2, 00:15:27.234 "base_bdevs_list": [ 00:15:27.234 { 00:15:27.234 "name": null, 00:15:27.234 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:27.234 "is_configured": false, 00:15:27.234 "data_offset": 2048, 00:15:27.234 "data_size": 63488 00:15:27.234 }, 00:15:27.234 { 00:15:27.234 "name": "BaseBdev2", 00:15:27.234 "uuid": 
"fbd8849f-9090-4bae-9514-42c51b954d4a", 00:15:27.234 "is_configured": true, 00:15:27.234 "data_offset": 2048, 00:15:27.234 "data_size": 63488 00:15:27.234 }, 00:15:27.234 { 00:15:27.234 "name": "BaseBdev3", 00:15:27.234 "uuid": "9bf3a65f-b0a9-4ce3-bf09-78e6f1d7be07", 00:15:27.234 "is_configured": true, 00:15:27.234 "data_offset": 2048, 00:15:27.234 "data_size": 63488 00:15:27.234 } 00:15:27.234 ] 00:15:27.234 }' 00:15:27.234 08:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:27.234 08:28:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:27.801 08:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:27.801 08:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:27.801 08:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:27.801 08:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:27.801 08:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:27.801 08:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:27.801 08:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:28.081 [2024-07-23 08:28:40.479022] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:28.081 08:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:28.081 08:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:28.082 08:28:40 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:28.082 08:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:28.340 08:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:28.340 08:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:28.340 08:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:28.598 [2024-07-23 08:28:40.916521] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:28.598 [2024-07-23 08:28:40.916579] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000035180 name Existed_Raid, state offline 00:15:28.598 08:28:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:28.598 08:28:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:28.598 08:28:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:28.598 08:28:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:28.856 08:28:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:28.856 08:28:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:28.856 08:28:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:15:28.856 08:28:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:15:28.856 08:28:41 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:28.856 08:28:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:29.114 BaseBdev2 00:15:29.114 08:28:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:15:29.114 08:28:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:29.114 08:28:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:29.114 08:28:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:29.114 08:28:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:29.114 08:28:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:29.114 08:28:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:29.114 08:28:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:29.373 [ 00:15:29.373 { 00:15:29.373 "name": "BaseBdev2", 00:15:29.373 "aliases": [ 00:15:29.373 "a7005867-4572-4af6-a832-4549fac99fdc" 00:15:29.373 ], 00:15:29.373 "product_name": "Malloc disk", 00:15:29.373 "block_size": 512, 00:15:29.373 "num_blocks": 65536, 00:15:29.373 "uuid": "a7005867-4572-4af6-a832-4549fac99fdc", 00:15:29.373 "assigned_rate_limits": { 00:15:29.373 "rw_ios_per_sec": 0, 00:15:29.373 "rw_mbytes_per_sec": 0, 00:15:29.373 "r_mbytes_per_sec": 0, 00:15:29.373 "w_mbytes_per_sec": 0 00:15:29.373 }, 00:15:29.373 "claimed": false, 00:15:29.373 "zoned": false, 
00:15:29.373 "supported_io_types": { 00:15:29.373 "read": true, 00:15:29.373 "write": true, 00:15:29.373 "unmap": true, 00:15:29.373 "flush": true, 00:15:29.373 "reset": true, 00:15:29.373 "nvme_admin": false, 00:15:29.373 "nvme_io": false, 00:15:29.373 "nvme_io_md": false, 00:15:29.373 "write_zeroes": true, 00:15:29.373 "zcopy": true, 00:15:29.373 "get_zone_info": false, 00:15:29.373 "zone_management": false, 00:15:29.373 "zone_append": false, 00:15:29.373 "compare": false, 00:15:29.373 "compare_and_write": false, 00:15:29.373 "abort": true, 00:15:29.373 "seek_hole": false, 00:15:29.373 "seek_data": false, 00:15:29.373 "copy": true, 00:15:29.373 "nvme_iov_md": false 00:15:29.373 }, 00:15:29.373 "memory_domains": [ 00:15:29.373 { 00:15:29.373 "dma_device_id": "system", 00:15:29.373 "dma_device_type": 1 00:15:29.373 }, 00:15:29.373 { 00:15:29.373 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:29.373 "dma_device_type": 2 00:15:29.373 } 00:15:29.373 ], 00:15:29.373 "driver_specific": {} 00:15:29.373 } 00:15:29.373 ] 00:15:29.373 08:28:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:29.373 08:28:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:29.373 08:28:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:29.373 08:28:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:29.632 BaseBdev3 00:15:29.632 08:28:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:15:29.632 08:28:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:29.632 08:28:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:29.632 08:28:41 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:29.632 08:28:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:29.632 08:28:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:29.632 08:28:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:29.632 08:28:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:29.891 [ 00:15:29.891 { 00:15:29.891 "name": "BaseBdev3", 00:15:29.891 "aliases": [ 00:15:29.891 "2aa24ecd-70c4-49c6-b26a-35db42765356" 00:15:29.891 ], 00:15:29.891 "product_name": "Malloc disk", 00:15:29.891 "block_size": 512, 00:15:29.891 "num_blocks": 65536, 00:15:29.891 "uuid": "2aa24ecd-70c4-49c6-b26a-35db42765356", 00:15:29.891 "assigned_rate_limits": { 00:15:29.891 "rw_ios_per_sec": 0, 00:15:29.891 "rw_mbytes_per_sec": 0, 00:15:29.891 "r_mbytes_per_sec": 0, 00:15:29.891 "w_mbytes_per_sec": 0 00:15:29.891 }, 00:15:29.891 "claimed": false, 00:15:29.891 "zoned": false, 00:15:29.891 "supported_io_types": { 00:15:29.891 "read": true, 00:15:29.891 "write": true, 00:15:29.891 "unmap": true, 00:15:29.891 "flush": true, 00:15:29.891 "reset": true, 00:15:29.891 "nvme_admin": false, 00:15:29.891 "nvme_io": false, 00:15:29.891 "nvme_io_md": false, 00:15:29.891 "write_zeroes": true, 00:15:29.891 "zcopy": true, 00:15:29.891 "get_zone_info": false, 00:15:29.891 "zone_management": false, 00:15:29.891 "zone_append": false, 00:15:29.891 "compare": false, 00:15:29.891 "compare_and_write": false, 00:15:29.891 "abort": true, 00:15:29.891 "seek_hole": false, 00:15:29.891 "seek_data": false, 00:15:29.891 "copy": true, 00:15:29.891 "nvme_iov_md": 
false 00:15:29.891 }, 00:15:29.891 "memory_domains": [ 00:15:29.891 { 00:15:29.891 "dma_device_id": "system", 00:15:29.891 "dma_device_type": 1 00:15:29.891 }, 00:15:29.891 { 00:15:29.891 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:29.891 "dma_device_type": 2 00:15:29.891 } 00:15:29.891 ], 00:15:29.891 "driver_specific": {} 00:15:29.891 } 00:15:29.891 ] 00:15:29.891 08:28:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:29.891 08:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:29.891 08:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:29.891 08:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:30.153 [2024-07-23 08:28:42.454118] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:30.153 [2024-07-23 08:28:42.454157] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:30.153 [2024-07-23 08:28:42.454181] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:30.153 [2024-07-23 08:28:42.455754] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:30.153 08:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:30.153 08:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:30.153 08:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:30.153 08:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:30.153 08:28:42 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:30.153 08:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:30.153 08:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:30.153 08:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:30.153 08:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:30.153 08:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:30.153 08:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:30.153 08:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:30.153 08:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:30.153 "name": "Existed_Raid", 00:15:30.153 "uuid": "3eabde83-2e12-4de4-91b1-66a688398dc3", 00:15:30.153 "strip_size_kb": 64, 00:15:30.153 "state": "configuring", 00:15:30.153 "raid_level": "concat", 00:15:30.153 "superblock": true, 00:15:30.153 "num_base_bdevs": 3, 00:15:30.153 "num_base_bdevs_discovered": 2, 00:15:30.153 "num_base_bdevs_operational": 3, 00:15:30.153 "base_bdevs_list": [ 00:15:30.153 { 00:15:30.153 "name": "BaseBdev1", 00:15:30.153 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:30.153 "is_configured": false, 00:15:30.153 "data_offset": 0, 00:15:30.153 "data_size": 0 00:15:30.153 }, 00:15:30.153 { 00:15:30.153 "name": "BaseBdev2", 00:15:30.153 "uuid": "a7005867-4572-4af6-a832-4549fac99fdc", 00:15:30.153 "is_configured": true, 00:15:30.153 "data_offset": 2048, 00:15:30.153 "data_size": 63488 00:15:30.153 }, 00:15:30.153 { 00:15:30.153 "name": 
"BaseBdev3", 00:15:30.153 "uuid": "2aa24ecd-70c4-49c6-b26a-35db42765356", 00:15:30.153 "is_configured": true, 00:15:30.153 "data_offset": 2048, 00:15:30.153 "data_size": 63488 00:15:30.153 } 00:15:30.153 ] 00:15:30.153 }' 00:15:30.153 08:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:30.153 08:28:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:30.790 08:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:30.790 [2024-07-23 08:28:43.272256] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:30.790 08:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:30.790 08:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:30.790 08:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:30.790 08:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:30.790 08:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:30.790 08:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:30.790 08:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:30.790 08:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:30.790 08:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:30.790 08:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:30.790 08:28:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:30.790 08:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:31.048 08:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:31.048 "name": "Existed_Raid", 00:15:31.048 "uuid": "3eabde83-2e12-4de4-91b1-66a688398dc3", 00:15:31.048 "strip_size_kb": 64, 00:15:31.048 "state": "configuring", 00:15:31.048 "raid_level": "concat", 00:15:31.048 "superblock": true, 00:15:31.048 "num_base_bdevs": 3, 00:15:31.048 "num_base_bdevs_discovered": 1, 00:15:31.048 "num_base_bdevs_operational": 3, 00:15:31.048 "base_bdevs_list": [ 00:15:31.048 { 00:15:31.048 "name": "BaseBdev1", 00:15:31.048 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:31.048 "is_configured": false, 00:15:31.048 "data_offset": 0, 00:15:31.048 "data_size": 0 00:15:31.048 }, 00:15:31.048 { 00:15:31.048 "name": null, 00:15:31.048 "uuid": "a7005867-4572-4af6-a832-4549fac99fdc", 00:15:31.048 "is_configured": false, 00:15:31.048 "data_offset": 2048, 00:15:31.048 "data_size": 63488 00:15:31.048 }, 00:15:31.048 { 00:15:31.048 "name": "BaseBdev3", 00:15:31.048 "uuid": "2aa24ecd-70c4-49c6-b26a-35db42765356", 00:15:31.048 "is_configured": true, 00:15:31.048 "data_offset": 2048, 00:15:31.048 "data_size": 63488 00:15:31.048 } 00:15:31.048 ] 00:15:31.048 }' 00:15:31.048 08:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:31.048 08:28:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:31.615 08:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:31.615 08:28:43 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:31.873 08:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:15:31.873 08:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:31.873 [2024-07-23 08:28:44.312027] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:31.873 BaseBdev1 00:15:31.873 08:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:15:31.873 08:28:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:31.873 08:28:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:31.873 08:28:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:31.873 08:28:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:31.873 08:28:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:31.873 08:28:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:32.133 08:28:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:32.133 [ 00:15:32.133 { 00:15:32.133 "name": "BaseBdev1", 00:15:32.133 "aliases": [ 00:15:32.133 "d228dd13-a8cc-4732-a86d-f58fc2d51f0a" 00:15:32.133 ], 00:15:32.133 "product_name": "Malloc disk", 00:15:32.133 "block_size": 512, 00:15:32.133 "num_blocks": 65536, 00:15:32.133 "uuid": 
"d228dd13-a8cc-4732-a86d-f58fc2d51f0a", 00:15:32.133 "assigned_rate_limits": { 00:15:32.133 "rw_ios_per_sec": 0, 00:15:32.133 "rw_mbytes_per_sec": 0, 00:15:32.133 "r_mbytes_per_sec": 0, 00:15:32.133 "w_mbytes_per_sec": 0 00:15:32.133 }, 00:15:32.133 "claimed": true, 00:15:32.133 "claim_type": "exclusive_write", 00:15:32.133 "zoned": false, 00:15:32.133 "supported_io_types": { 00:15:32.133 "read": true, 00:15:32.133 "write": true, 00:15:32.133 "unmap": true, 00:15:32.133 "flush": true, 00:15:32.133 "reset": true, 00:15:32.133 "nvme_admin": false, 00:15:32.133 "nvme_io": false, 00:15:32.133 "nvme_io_md": false, 00:15:32.133 "write_zeroes": true, 00:15:32.133 "zcopy": true, 00:15:32.133 "get_zone_info": false, 00:15:32.133 "zone_management": false, 00:15:32.133 "zone_append": false, 00:15:32.133 "compare": false, 00:15:32.133 "compare_and_write": false, 00:15:32.133 "abort": true, 00:15:32.133 "seek_hole": false, 00:15:32.133 "seek_data": false, 00:15:32.133 "copy": true, 00:15:32.133 "nvme_iov_md": false 00:15:32.133 }, 00:15:32.133 "memory_domains": [ 00:15:32.133 { 00:15:32.133 "dma_device_id": "system", 00:15:32.133 "dma_device_type": 1 00:15:32.133 }, 00:15:32.133 { 00:15:32.133 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:32.133 "dma_device_type": 2 00:15:32.133 } 00:15:32.133 ], 00:15:32.133 "driver_specific": {} 00:15:32.133 } 00:15:32.133 ] 00:15:32.394 08:28:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:32.394 08:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:32.394 08:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:32.394 08:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:32.394 08:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 
00:15:32.394 08:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:32.394 08:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:32.394 08:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:32.394 08:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:32.394 08:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:32.394 08:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:32.394 08:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:32.394 08:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:32.394 08:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:32.394 "name": "Existed_Raid", 00:15:32.394 "uuid": "3eabde83-2e12-4de4-91b1-66a688398dc3", 00:15:32.394 "strip_size_kb": 64, 00:15:32.394 "state": "configuring", 00:15:32.394 "raid_level": "concat", 00:15:32.394 "superblock": true, 00:15:32.394 "num_base_bdevs": 3, 00:15:32.394 "num_base_bdevs_discovered": 2, 00:15:32.394 "num_base_bdevs_operational": 3, 00:15:32.394 "base_bdevs_list": [ 00:15:32.394 { 00:15:32.394 "name": "BaseBdev1", 00:15:32.394 "uuid": "d228dd13-a8cc-4732-a86d-f58fc2d51f0a", 00:15:32.394 "is_configured": true, 00:15:32.394 "data_offset": 2048, 00:15:32.394 "data_size": 63488 00:15:32.394 }, 00:15:32.394 { 00:15:32.394 "name": null, 00:15:32.394 "uuid": "a7005867-4572-4af6-a832-4549fac99fdc", 00:15:32.394 "is_configured": false, 00:15:32.394 "data_offset": 2048, 00:15:32.394 "data_size": 63488 00:15:32.394 }, 00:15:32.394 { 
00:15:32.394 "name": "BaseBdev3", 00:15:32.394 "uuid": "2aa24ecd-70c4-49c6-b26a-35db42765356", 00:15:32.394 "is_configured": true, 00:15:32.394 "data_offset": 2048, 00:15:32.394 "data_size": 63488 00:15:32.394 } 00:15:32.394 ] 00:15:32.394 }' 00:15:32.394 08:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:32.394 08:28:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:32.962 08:28:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:32.962 08:28:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:33.220 08:28:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:15:33.220 08:28:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:15:33.220 [2024-07-23 08:28:45.635588] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:33.220 08:28:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:33.220 08:28:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:33.220 08:28:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:33.220 08:28:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:33.220 08:28:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:33.220 08:28:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:33.220 08:28:45 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:33.220 08:28:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:33.220 08:28:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:33.220 08:28:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:33.220 08:28:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:33.220 08:28:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:33.479 08:28:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:33.479 "name": "Existed_Raid", 00:15:33.479 "uuid": "3eabde83-2e12-4de4-91b1-66a688398dc3", 00:15:33.479 "strip_size_kb": 64, 00:15:33.479 "state": "configuring", 00:15:33.479 "raid_level": "concat", 00:15:33.479 "superblock": true, 00:15:33.479 "num_base_bdevs": 3, 00:15:33.479 "num_base_bdevs_discovered": 1, 00:15:33.479 "num_base_bdevs_operational": 3, 00:15:33.479 "base_bdevs_list": [ 00:15:33.479 { 00:15:33.479 "name": "BaseBdev1", 00:15:33.479 "uuid": "d228dd13-a8cc-4732-a86d-f58fc2d51f0a", 00:15:33.479 "is_configured": true, 00:15:33.479 "data_offset": 2048, 00:15:33.479 "data_size": 63488 00:15:33.479 }, 00:15:33.479 { 00:15:33.479 "name": null, 00:15:33.479 "uuid": "a7005867-4572-4af6-a832-4549fac99fdc", 00:15:33.479 "is_configured": false, 00:15:33.479 "data_offset": 2048, 00:15:33.479 "data_size": 63488 00:15:33.479 }, 00:15:33.479 { 00:15:33.479 "name": null, 00:15:33.479 "uuid": "2aa24ecd-70c4-49c6-b26a-35db42765356", 00:15:33.479 "is_configured": false, 00:15:33.479 "data_offset": 2048, 00:15:33.479 "data_size": 63488 00:15:33.479 } 00:15:33.479 ] 00:15:33.479 }' 00:15:33.479 08:28:45 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:33.479 08:28:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:34.045 08:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:34.045 08:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:34.045 08:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:15:34.045 08:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:15:34.303 [2024-07-23 08:28:46.650263] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:34.303 08:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:34.303 08:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:34.303 08:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:34.303 08:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:34.303 08:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:34.303 08:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:34.303 08:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:34.303 08:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:34.303 08:28:46 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:34.303 08:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:34.303 08:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:34.303 08:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:34.562 08:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:34.562 "name": "Existed_Raid", 00:15:34.562 "uuid": "3eabde83-2e12-4de4-91b1-66a688398dc3", 00:15:34.562 "strip_size_kb": 64, 00:15:34.562 "state": "configuring", 00:15:34.562 "raid_level": "concat", 00:15:34.562 "superblock": true, 00:15:34.562 "num_base_bdevs": 3, 00:15:34.562 "num_base_bdevs_discovered": 2, 00:15:34.562 "num_base_bdevs_operational": 3, 00:15:34.562 "base_bdevs_list": [ 00:15:34.562 { 00:15:34.562 "name": "BaseBdev1", 00:15:34.562 "uuid": "d228dd13-a8cc-4732-a86d-f58fc2d51f0a", 00:15:34.562 "is_configured": true, 00:15:34.562 "data_offset": 2048, 00:15:34.562 "data_size": 63488 00:15:34.562 }, 00:15:34.562 { 00:15:34.562 "name": null, 00:15:34.562 "uuid": "a7005867-4572-4af6-a832-4549fac99fdc", 00:15:34.562 "is_configured": false, 00:15:34.562 "data_offset": 2048, 00:15:34.562 "data_size": 63488 00:15:34.562 }, 00:15:34.562 { 00:15:34.562 "name": "BaseBdev3", 00:15:34.562 "uuid": "2aa24ecd-70c4-49c6-b26a-35db42765356", 00:15:34.562 "is_configured": true, 00:15:34.562 "data_offset": 2048, 00:15:34.562 "data_size": 63488 00:15:34.562 } 00:15:34.562 ] 00:15:34.562 }' 00:15:34.562 08:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:34.562 08:28:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:35.129 08:28:47 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:35.129 08:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:35.129 08:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:15:35.129 08:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:35.387 [2024-07-23 08:28:47.669003] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:35.387 08:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:35.387 08:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:35.387 08:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:35.387 08:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:35.387 08:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:35.387 08:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:35.387 08:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:35.387 08:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:35.387 08:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:35.387 08:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:35.387 08:28:47 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:35.387 08:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:35.646 08:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:35.646 "name": "Existed_Raid", 00:15:35.646 "uuid": "3eabde83-2e12-4de4-91b1-66a688398dc3", 00:15:35.646 "strip_size_kb": 64, 00:15:35.646 "state": "configuring", 00:15:35.646 "raid_level": "concat", 00:15:35.646 "superblock": true, 00:15:35.646 "num_base_bdevs": 3, 00:15:35.646 "num_base_bdevs_discovered": 1, 00:15:35.646 "num_base_bdevs_operational": 3, 00:15:35.646 "base_bdevs_list": [ 00:15:35.646 { 00:15:35.646 "name": null, 00:15:35.646 "uuid": "d228dd13-a8cc-4732-a86d-f58fc2d51f0a", 00:15:35.646 "is_configured": false, 00:15:35.646 "data_offset": 2048, 00:15:35.646 "data_size": 63488 00:15:35.646 }, 00:15:35.646 { 00:15:35.646 "name": null, 00:15:35.646 "uuid": "a7005867-4572-4af6-a832-4549fac99fdc", 00:15:35.646 "is_configured": false, 00:15:35.646 "data_offset": 2048, 00:15:35.646 "data_size": 63488 00:15:35.646 }, 00:15:35.646 { 00:15:35.646 "name": "BaseBdev3", 00:15:35.646 "uuid": "2aa24ecd-70c4-49c6-b26a-35db42765356", 00:15:35.646 "is_configured": true, 00:15:35.646 "data_offset": 2048, 00:15:35.646 "data_size": 63488 00:15:35.646 } 00:15:35.646 ] 00:15:35.646 }' 00:15:35.646 08:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:35.646 08:28:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:36.212 08:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:36.212 08:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq 
'.[0].base_bdevs_list[0].is_configured' 00:15:36.212 08:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:15:36.212 08:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:15:36.471 [2024-07-23 08:28:48.780940] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:36.471 08:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:36.471 08:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:36.471 08:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:36.471 08:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:36.471 08:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:36.471 08:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:36.471 08:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:36.471 08:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:36.471 08:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:36.471 08:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:36.471 08:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:36.471 08:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r 
'.[] | select(.name == "Existed_Raid")' 00:15:36.471 08:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:36.471 "name": "Existed_Raid", 00:15:36.471 "uuid": "3eabde83-2e12-4de4-91b1-66a688398dc3", 00:15:36.471 "strip_size_kb": 64, 00:15:36.471 "state": "configuring", 00:15:36.471 "raid_level": "concat", 00:15:36.471 "superblock": true, 00:15:36.471 "num_base_bdevs": 3, 00:15:36.471 "num_base_bdevs_discovered": 2, 00:15:36.471 "num_base_bdevs_operational": 3, 00:15:36.471 "base_bdevs_list": [ 00:15:36.471 { 00:15:36.471 "name": null, 00:15:36.471 "uuid": "d228dd13-a8cc-4732-a86d-f58fc2d51f0a", 00:15:36.471 "is_configured": false, 00:15:36.471 "data_offset": 2048, 00:15:36.471 "data_size": 63488 00:15:36.471 }, 00:15:36.471 { 00:15:36.471 "name": "BaseBdev2", 00:15:36.471 "uuid": "a7005867-4572-4af6-a832-4549fac99fdc", 00:15:36.471 "is_configured": true, 00:15:36.471 "data_offset": 2048, 00:15:36.471 "data_size": 63488 00:15:36.471 }, 00:15:36.471 { 00:15:36.471 "name": "BaseBdev3", 00:15:36.471 "uuid": "2aa24ecd-70c4-49c6-b26a-35db42765356", 00:15:36.471 "is_configured": true, 00:15:36.471 "data_offset": 2048, 00:15:36.471 "data_size": 63488 00:15:36.471 } 00:15:36.471 ] 00:15:36.471 }' 00:15:36.471 08:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:36.471 08:28:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:37.037 08:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:37.037 08:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:37.295 08:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:15:37.295 08:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:37.295 08:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:37.295 08:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u d228dd13-a8cc-4732-a86d-f58fc2d51f0a 00:15:37.554 [2024-07-23 08:28:49.974702] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:37.554 [2024-07-23 08:28:49.974904] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000036980 00:15:37.554 [2024-07-23 08:28:49.974921] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:37.554 [2024-07-23 08:28:49.975150] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c200 00:15:37.554 [2024-07-23 08:28:49.975327] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000036980 00:15:37.554 [2024-07-23 08:28:49.975336] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000036980 00:15:37.554 [2024-07-23 08:28:49.975485] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:37.554 NewBaseBdev 00:15:37.554 08:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:15:37.554 08:28:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:15:37.554 08:28:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:37.554 08:28:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:37.554 08:28:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
[[ -z '' ]] 00:15:37.554 08:28:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:37.554 08:28:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:37.812 08:28:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:37.812 [ 00:15:37.812 { 00:15:37.812 "name": "NewBaseBdev", 00:15:37.812 "aliases": [ 00:15:37.812 "d228dd13-a8cc-4732-a86d-f58fc2d51f0a" 00:15:37.812 ], 00:15:37.812 "product_name": "Malloc disk", 00:15:37.812 "block_size": 512, 00:15:37.812 "num_blocks": 65536, 00:15:37.812 "uuid": "d228dd13-a8cc-4732-a86d-f58fc2d51f0a", 00:15:37.812 "assigned_rate_limits": { 00:15:37.812 "rw_ios_per_sec": 0, 00:15:37.812 "rw_mbytes_per_sec": 0, 00:15:37.812 "r_mbytes_per_sec": 0, 00:15:37.812 "w_mbytes_per_sec": 0 00:15:37.812 }, 00:15:37.812 "claimed": true, 00:15:37.812 "claim_type": "exclusive_write", 00:15:37.812 "zoned": false, 00:15:37.812 "supported_io_types": { 00:15:37.812 "read": true, 00:15:37.812 "write": true, 00:15:37.812 "unmap": true, 00:15:37.812 "flush": true, 00:15:37.813 "reset": true, 00:15:37.813 "nvme_admin": false, 00:15:37.813 "nvme_io": false, 00:15:37.813 "nvme_io_md": false, 00:15:37.813 "write_zeroes": true, 00:15:37.813 "zcopy": true, 00:15:37.813 "get_zone_info": false, 00:15:37.813 "zone_management": false, 00:15:37.813 "zone_append": false, 00:15:37.813 "compare": false, 00:15:37.813 "compare_and_write": false, 00:15:37.813 "abort": true, 00:15:37.813 "seek_hole": false, 00:15:37.813 "seek_data": false, 00:15:37.813 "copy": true, 00:15:37.813 "nvme_iov_md": false 00:15:37.813 }, 00:15:37.813 "memory_domains": [ 00:15:37.813 { 00:15:37.813 "dma_device_id": "system", 00:15:37.813 
"dma_device_type": 1 00:15:37.813 }, 00:15:37.813 { 00:15:37.813 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:37.813 "dma_device_type": 2 00:15:37.813 } 00:15:37.813 ], 00:15:37.813 "driver_specific": {} 00:15:37.813 } 00:15:37.813 ] 00:15:38.071 08:28:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:38.071 08:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:15:38.071 08:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:38.071 08:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:38.071 08:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:38.071 08:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:38.071 08:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:38.071 08:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:38.071 08:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:38.071 08:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:38.071 08:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:38.071 08:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:38.071 08:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:38.071 08:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:38.071 "name": 
"Existed_Raid", 00:15:38.071 "uuid": "3eabde83-2e12-4de4-91b1-66a688398dc3", 00:15:38.071 "strip_size_kb": 64, 00:15:38.071 "state": "online", 00:15:38.071 "raid_level": "concat", 00:15:38.071 "superblock": true, 00:15:38.071 "num_base_bdevs": 3, 00:15:38.071 "num_base_bdevs_discovered": 3, 00:15:38.071 "num_base_bdevs_operational": 3, 00:15:38.071 "base_bdevs_list": [ 00:15:38.071 { 00:15:38.071 "name": "NewBaseBdev", 00:15:38.071 "uuid": "d228dd13-a8cc-4732-a86d-f58fc2d51f0a", 00:15:38.071 "is_configured": true, 00:15:38.071 "data_offset": 2048, 00:15:38.071 "data_size": 63488 00:15:38.071 }, 00:15:38.071 { 00:15:38.071 "name": "BaseBdev2", 00:15:38.071 "uuid": "a7005867-4572-4af6-a832-4549fac99fdc", 00:15:38.071 "is_configured": true, 00:15:38.071 "data_offset": 2048, 00:15:38.071 "data_size": 63488 00:15:38.071 }, 00:15:38.071 { 00:15:38.071 "name": "BaseBdev3", 00:15:38.071 "uuid": "2aa24ecd-70c4-49c6-b26a-35db42765356", 00:15:38.071 "is_configured": true, 00:15:38.071 "data_offset": 2048, 00:15:38.071 "data_size": 63488 00:15:38.071 } 00:15:38.071 ] 00:15:38.071 }' 00:15:38.071 08:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:38.071 08:28:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:38.638 08:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:15:38.638 08:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:38.638 08:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:38.638 08:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:38.638 08:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:38.638 08:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:38.638 
08:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:38.638 08:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:38.896 [2024-07-23 08:28:51.162134] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:38.896 08:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:38.896 "name": "Existed_Raid", 00:15:38.896 "aliases": [ 00:15:38.896 "3eabde83-2e12-4de4-91b1-66a688398dc3" 00:15:38.896 ], 00:15:38.896 "product_name": "Raid Volume", 00:15:38.896 "block_size": 512, 00:15:38.896 "num_blocks": 190464, 00:15:38.896 "uuid": "3eabde83-2e12-4de4-91b1-66a688398dc3", 00:15:38.896 "assigned_rate_limits": { 00:15:38.896 "rw_ios_per_sec": 0, 00:15:38.896 "rw_mbytes_per_sec": 0, 00:15:38.896 "r_mbytes_per_sec": 0, 00:15:38.896 "w_mbytes_per_sec": 0 00:15:38.896 }, 00:15:38.896 "claimed": false, 00:15:38.896 "zoned": false, 00:15:38.896 "supported_io_types": { 00:15:38.896 "read": true, 00:15:38.896 "write": true, 00:15:38.896 "unmap": true, 00:15:38.896 "flush": true, 00:15:38.896 "reset": true, 00:15:38.896 "nvme_admin": false, 00:15:38.896 "nvme_io": false, 00:15:38.896 "nvme_io_md": false, 00:15:38.896 "write_zeroes": true, 00:15:38.896 "zcopy": false, 00:15:38.896 "get_zone_info": false, 00:15:38.896 "zone_management": false, 00:15:38.896 "zone_append": false, 00:15:38.896 "compare": false, 00:15:38.896 "compare_and_write": false, 00:15:38.896 "abort": false, 00:15:38.896 "seek_hole": false, 00:15:38.896 "seek_data": false, 00:15:38.896 "copy": false, 00:15:38.896 "nvme_iov_md": false 00:15:38.896 }, 00:15:38.896 "memory_domains": [ 00:15:38.896 { 00:15:38.896 "dma_device_id": "system", 00:15:38.896 "dma_device_type": 1 00:15:38.896 }, 00:15:38.896 { 00:15:38.896 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:15:38.896 "dma_device_type": 2 00:15:38.896 }, 00:15:38.896 { 00:15:38.896 "dma_device_id": "system", 00:15:38.896 "dma_device_type": 1 00:15:38.896 }, 00:15:38.896 { 00:15:38.896 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.896 "dma_device_type": 2 00:15:38.896 }, 00:15:38.896 { 00:15:38.896 "dma_device_id": "system", 00:15:38.896 "dma_device_type": 1 00:15:38.896 }, 00:15:38.896 { 00:15:38.896 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.896 "dma_device_type": 2 00:15:38.896 } 00:15:38.896 ], 00:15:38.896 "driver_specific": { 00:15:38.896 "raid": { 00:15:38.896 "uuid": "3eabde83-2e12-4de4-91b1-66a688398dc3", 00:15:38.896 "strip_size_kb": 64, 00:15:38.896 "state": "online", 00:15:38.896 "raid_level": "concat", 00:15:38.896 "superblock": true, 00:15:38.896 "num_base_bdevs": 3, 00:15:38.896 "num_base_bdevs_discovered": 3, 00:15:38.897 "num_base_bdevs_operational": 3, 00:15:38.897 "base_bdevs_list": [ 00:15:38.897 { 00:15:38.897 "name": "NewBaseBdev", 00:15:38.897 "uuid": "d228dd13-a8cc-4732-a86d-f58fc2d51f0a", 00:15:38.897 "is_configured": true, 00:15:38.897 "data_offset": 2048, 00:15:38.897 "data_size": 63488 00:15:38.897 }, 00:15:38.897 { 00:15:38.897 "name": "BaseBdev2", 00:15:38.897 "uuid": "a7005867-4572-4af6-a832-4549fac99fdc", 00:15:38.897 "is_configured": true, 00:15:38.897 "data_offset": 2048, 00:15:38.897 "data_size": 63488 00:15:38.897 }, 00:15:38.897 { 00:15:38.897 "name": "BaseBdev3", 00:15:38.897 "uuid": "2aa24ecd-70c4-49c6-b26a-35db42765356", 00:15:38.897 "is_configured": true, 00:15:38.897 "data_offset": 2048, 00:15:38.897 "data_size": 63488 00:15:38.897 } 00:15:38.897 ] 00:15:38.897 } 00:15:38.897 } 00:15:38.897 }' 00:15:38.897 08:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:38.897 08:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:15:38.897 BaseBdev2 
00:15:38.897 BaseBdev3' 00:15:38.897 08:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:38.897 08:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:38.897 08:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:38.897 08:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:38.897 "name": "NewBaseBdev", 00:15:38.897 "aliases": [ 00:15:38.897 "d228dd13-a8cc-4732-a86d-f58fc2d51f0a" 00:15:38.897 ], 00:15:38.897 "product_name": "Malloc disk", 00:15:38.897 "block_size": 512, 00:15:38.897 "num_blocks": 65536, 00:15:38.897 "uuid": "d228dd13-a8cc-4732-a86d-f58fc2d51f0a", 00:15:38.897 "assigned_rate_limits": { 00:15:38.897 "rw_ios_per_sec": 0, 00:15:38.897 "rw_mbytes_per_sec": 0, 00:15:38.897 "r_mbytes_per_sec": 0, 00:15:38.897 "w_mbytes_per_sec": 0 00:15:38.897 }, 00:15:38.897 "claimed": true, 00:15:38.897 "claim_type": "exclusive_write", 00:15:38.897 "zoned": false, 00:15:38.897 "supported_io_types": { 00:15:38.897 "read": true, 00:15:38.897 "write": true, 00:15:38.897 "unmap": true, 00:15:38.897 "flush": true, 00:15:38.897 "reset": true, 00:15:38.897 "nvme_admin": false, 00:15:38.897 "nvme_io": false, 00:15:38.897 "nvme_io_md": false, 00:15:38.897 "write_zeroes": true, 00:15:38.897 "zcopy": true, 00:15:38.897 "get_zone_info": false, 00:15:38.897 "zone_management": false, 00:15:38.897 "zone_append": false, 00:15:38.897 "compare": false, 00:15:38.897 "compare_and_write": false, 00:15:38.897 "abort": true, 00:15:38.897 "seek_hole": false, 00:15:38.897 "seek_data": false, 00:15:38.897 "copy": true, 00:15:38.897 "nvme_iov_md": false 00:15:38.897 }, 00:15:38.897 "memory_domains": [ 00:15:38.897 { 00:15:38.897 "dma_device_id": "system", 00:15:38.897 "dma_device_type": 1 00:15:38.897 }, 
00:15:38.897 { 00:15:38.897 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.897 "dma_device_type": 2 00:15:38.897 } 00:15:38.897 ], 00:15:38.897 "driver_specific": {} 00:15:38.897 }' 00:15:38.897 08:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:39.155 08:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:39.155 08:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:39.155 08:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:39.155 08:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:39.155 08:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:39.155 08:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.155 08:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.155 08:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:39.155 08:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.413 08:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.413 08:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:39.413 08:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:39.413 08:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:39.413 08:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:39.413 08:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 
00:15:39.413 "name": "BaseBdev2", 00:15:39.413 "aliases": [ 00:15:39.413 "a7005867-4572-4af6-a832-4549fac99fdc" 00:15:39.413 ], 00:15:39.413 "product_name": "Malloc disk", 00:15:39.413 "block_size": 512, 00:15:39.413 "num_blocks": 65536, 00:15:39.413 "uuid": "a7005867-4572-4af6-a832-4549fac99fdc", 00:15:39.413 "assigned_rate_limits": { 00:15:39.413 "rw_ios_per_sec": 0, 00:15:39.413 "rw_mbytes_per_sec": 0, 00:15:39.413 "r_mbytes_per_sec": 0, 00:15:39.413 "w_mbytes_per_sec": 0 00:15:39.413 }, 00:15:39.413 "claimed": true, 00:15:39.413 "claim_type": "exclusive_write", 00:15:39.413 "zoned": false, 00:15:39.413 "supported_io_types": { 00:15:39.413 "read": true, 00:15:39.413 "write": true, 00:15:39.413 "unmap": true, 00:15:39.413 "flush": true, 00:15:39.413 "reset": true, 00:15:39.413 "nvme_admin": false, 00:15:39.413 "nvme_io": false, 00:15:39.413 "nvme_io_md": false, 00:15:39.413 "write_zeroes": true, 00:15:39.413 "zcopy": true, 00:15:39.413 "get_zone_info": false, 00:15:39.413 "zone_management": false, 00:15:39.413 "zone_append": false, 00:15:39.413 "compare": false, 00:15:39.413 "compare_and_write": false, 00:15:39.413 "abort": true, 00:15:39.413 "seek_hole": false, 00:15:39.414 "seek_data": false, 00:15:39.414 "copy": true, 00:15:39.414 "nvme_iov_md": false 00:15:39.414 }, 00:15:39.414 "memory_domains": [ 00:15:39.414 { 00:15:39.414 "dma_device_id": "system", 00:15:39.414 "dma_device_type": 1 00:15:39.414 }, 00:15:39.414 { 00:15:39.414 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:39.414 "dma_device_type": 2 00:15:39.414 } 00:15:39.414 ], 00:15:39.414 "driver_specific": {} 00:15:39.414 }' 00:15:39.414 08:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:39.672 08:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:39.672 08:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:39.672 08:28:51 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:39.672 08:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:39.672 08:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:39.672 08:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.672 08:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.672 08:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:39.672 08:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.672 08:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.930 08:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:39.930 08:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:39.930 08:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:39.930 08:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:39.930 08:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:39.930 "name": "BaseBdev3", 00:15:39.930 "aliases": [ 00:15:39.930 "2aa24ecd-70c4-49c6-b26a-35db42765356" 00:15:39.930 ], 00:15:39.930 "product_name": "Malloc disk", 00:15:39.930 "block_size": 512, 00:15:39.930 "num_blocks": 65536, 00:15:39.930 "uuid": "2aa24ecd-70c4-49c6-b26a-35db42765356", 00:15:39.930 "assigned_rate_limits": { 00:15:39.930 "rw_ios_per_sec": 0, 00:15:39.930 "rw_mbytes_per_sec": 0, 00:15:39.930 "r_mbytes_per_sec": 0, 00:15:39.930 "w_mbytes_per_sec": 0 00:15:39.930 }, 00:15:39.930 "claimed": true, 00:15:39.930 "claim_type": "exclusive_write", 
00:15:39.930 "zoned": false, 00:15:39.930 "supported_io_types": { 00:15:39.930 "read": true, 00:15:39.930 "write": true, 00:15:39.930 "unmap": true, 00:15:39.930 "flush": true, 00:15:39.930 "reset": true, 00:15:39.930 "nvme_admin": false, 00:15:39.930 "nvme_io": false, 00:15:39.930 "nvme_io_md": false, 00:15:39.930 "write_zeroes": true, 00:15:39.930 "zcopy": true, 00:15:39.930 "get_zone_info": false, 00:15:39.930 "zone_management": false, 00:15:39.930 "zone_append": false, 00:15:39.930 "compare": false, 00:15:39.930 "compare_and_write": false, 00:15:39.930 "abort": true, 00:15:39.930 "seek_hole": false, 00:15:39.930 "seek_data": false, 00:15:39.930 "copy": true, 00:15:39.930 "nvme_iov_md": false 00:15:39.930 }, 00:15:39.930 "memory_domains": [ 00:15:39.930 { 00:15:39.930 "dma_device_id": "system", 00:15:39.930 "dma_device_type": 1 00:15:39.930 }, 00:15:39.930 { 00:15:39.930 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:39.930 "dma_device_type": 2 00:15:39.930 } 00:15:39.930 ], 00:15:39.930 "driver_specific": {} 00:15:39.930 }' 00:15:39.930 08:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:39.930 08:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:40.189 08:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:40.189 08:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:40.189 08:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:40.189 08:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:40.189 08:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:40.189 08:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:40.189 08:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 
00:15:40.189 08:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:40.189 08:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:40.447 08:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:40.447 08:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:40.447 [2024-07-23 08:28:52.870322] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:40.447 [2024-07-23 08:28:52.870349] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:40.448 [2024-07-23 08:28:52.870428] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:40.448 [2024-07-23 08:28:52.870481] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:40.448 [2024-07-23 08:28:52.870499] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036980 name Existed_Raid, state offline 00:15:40.448 08:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1443949 00:15:40.448 08:28:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1443949 ']' 00:15:40.448 08:28:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1443949 00:15:40.448 08:28:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:15:40.448 08:28:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:40.448 08:28:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1443949 00:15:40.448 08:28:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:15:40.448 08:28:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:40.448 08:28:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1443949' 00:15:40.448 killing process with pid 1443949 00:15:40.448 08:28:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1443949 00:15:40.448 [2024-07-23 08:28:52.930607] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:40.448 08:28:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1443949 00:15:40.706 [2024-07-23 08:28:53.170771] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:42.084 08:28:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:15:42.084 00:15:42.084 real 0m23.336s 00:15:42.084 user 0m41.777s 00:15:42.084 sys 0m3.396s 00:15:42.084 08:28:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:42.084 08:28:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:42.084 ************************************ 00:15:42.084 END TEST raid_state_function_test_sb 00:15:42.084 ************************************ 00:15:42.084 08:28:54 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:42.084 08:28:54 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 3 00:15:42.084 08:28:54 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:15:42.084 08:28:54 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:42.084 08:28:54 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:42.084 ************************************ 00:15:42.084 START TEST raid_superblock_test 00:15:42.084 ************************************ 00:15:42.084 08:28:54 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@1123 -- # raid_superblock_test concat 3 00:15:42.084 08:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:15:42.084 08:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:15:42.084 08:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:15:42.084 08:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:15:42.084 08:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:15:42.084 08:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:15:42.084 08:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:15:42.084 08:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:15:42.084 08:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:15:42.084 08:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:15:42.084 08:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:15:42.084 08:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:15:42.084 08:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:15:42.084 08:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:15:42.084 08:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:15:42.084 08:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:15:42.084 08:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1448838 00:15:42.084 08:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1448838 /var/tmp/spdk-raid.sock 00:15:42.084 
08:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:15:42.084 08:28:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1448838 ']' 00:15:42.084 08:28:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:42.084 08:28:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:42.084 08:28:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:42.084 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:42.084 08:28:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:42.084 08:28:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:42.343 [2024-07-23 08:28:54.609631] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:15:42.343 [2024-07-23 08:28:54.609737] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1448838 ] 00:15:42.343 [2024-07-23 08:28:54.734953] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:42.602 [2024-07-23 08:28:54.948152] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:42.860 [2024-07-23 08:28:55.218191] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:42.860 [2024-07-23 08:28:55.218216] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:42.860 08:28:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:42.860 08:28:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:15:42.860 08:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:15:42.860 08:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:42.860 08:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:15:42.860 08:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:15:42.860 08:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:15:42.860 08:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:42.860 08:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:42.860 08:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:42.860 08:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:15:43.119 malloc1 00:15:43.119 08:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:43.378 [2024-07-23 08:28:55.741313] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:43.378 [2024-07-23 08:28:55.741365] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:43.378 [2024-07-23 08:28:55.741388] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034880 00:15:43.378 [2024-07-23 08:28:55.741400] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:43.378 [2024-07-23 08:28:55.743280] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:43.378 [2024-07-23 08:28:55.743310] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:43.378 pt1 00:15:43.378 08:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:43.378 08:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:43.378 08:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:15:43.378 08:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:15:43.378 08:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:15:43.378 08:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:43.378 08:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:43.378 08:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:43.378 08:28:55 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:15:43.639 malloc2 00:15:43.639 08:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:43.639 [2024-07-23 08:28:56.129293] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:43.639 [2024-07-23 08:28:56.129342] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:43.639 [2024-07-23 08:28:56.129362] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035480 00:15:43.639 [2024-07-23 08:28:56.129371] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:43.639 [2024-07-23 08:28:56.131278] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:43.639 [2024-07-23 08:28:56.131307] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:43.639 pt2 00:15:43.639 08:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:43.639 08:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:43.639 08:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:15:43.639 08:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:15:43.639 08:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:15:43.639 08:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:43.639 08:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:43.639 
08:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:43.639 08:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:15:43.898 malloc3 00:15:43.898 08:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:44.158 [2024-07-23 08:28:56.506530] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:44.158 [2024-07-23 08:28:56.506584] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:44.158 [2024-07-23 08:28:56.506607] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036080 00:15:44.158 [2024-07-23 08:28:56.506624] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:44.158 [2024-07-23 08:28:56.508665] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:44.158 [2024-07-23 08:28:56.508693] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:44.158 pt3 00:15:44.158 08:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:44.158 08:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:44.158 08:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:15:44.158 [2024-07-23 08:28:56.666997] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:44.158 [2024-07-23 08:28:56.668576] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 
is claimed 00:15:44.158 [2024-07-23 08:28:56.668645] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:44.158 [2024-07-23 08:28:56.668824] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000036680 00:15:44.158 [2024-07-23 08:28:56.668839] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:44.158 [2024-07-23 08:28:56.669076] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bec0 00:15:44.158 [2024-07-23 08:28:56.669269] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000036680 00:15:44.158 [2024-07-23 08:28:56.669279] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000036680 00:15:44.158 [2024-07-23 08:28:56.669437] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:44.418 08:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:44.418 08:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:44.418 08:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:44.418 08:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:44.418 08:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:44.418 08:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:44.418 08:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:44.418 08:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:44.418 08:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:44.418 08:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 
-- # local tmp 00:15:44.418 08:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:44.418 08:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:44.418 08:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:44.418 "name": "raid_bdev1", 00:15:44.418 "uuid": "00a01af7-f1ba-4caf-8c4e-12b46c9366ca", 00:15:44.418 "strip_size_kb": 64, 00:15:44.418 "state": "online", 00:15:44.418 "raid_level": "concat", 00:15:44.418 "superblock": true, 00:15:44.418 "num_base_bdevs": 3, 00:15:44.418 "num_base_bdevs_discovered": 3, 00:15:44.418 "num_base_bdevs_operational": 3, 00:15:44.418 "base_bdevs_list": [ 00:15:44.418 { 00:15:44.418 "name": "pt1", 00:15:44.418 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:44.418 "is_configured": true, 00:15:44.418 "data_offset": 2048, 00:15:44.418 "data_size": 63488 00:15:44.418 }, 00:15:44.418 { 00:15:44.418 "name": "pt2", 00:15:44.418 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:44.418 "is_configured": true, 00:15:44.418 "data_offset": 2048, 00:15:44.418 "data_size": 63488 00:15:44.418 }, 00:15:44.418 { 00:15:44.418 "name": "pt3", 00:15:44.418 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:44.418 "is_configured": true, 00:15:44.418 "data_offset": 2048, 00:15:44.418 "data_size": 63488 00:15:44.418 } 00:15:44.418 ] 00:15:44.418 }' 00:15:44.418 08:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:44.418 08:28:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:45.006 08:28:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:15:45.006 08:28:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:45.006 08:28:57 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:45.006 08:28:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:45.006 08:28:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:45.006 08:28:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:45.006 08:28:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:45.006 08:28:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:45.006 [2024-07-23 08:28:57.489351] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:45.006 08:28:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:45.006 "name": "raid_bdev1", 00:15:45.006 "aliases": [ 00:15:45.006 "00a01af7-f1ba-4caf-8c4e-12b46c9366ca" 00:15:45.006 ], 00:15:45.006 "product_name": "Raid Volume", 00:15:45.006 "block_size": 512, 00:15:45.006 "num_blocks": 190464, 00:15:45.006 "uuid": "00a01af7-f1ba-4caf-8c4e-12b46c9366ca", 00:15:45.006 "assigned_rate_limits": { 00:15:45.006 "rw_ios_per_sec": 0, 00:15:45.006 "rw_mbytes_per_sec": 0, 00:15:45.006 "r_mbytes_per_sec": 0, 00:15:45.006 "w_mbytes_per_sec": 0 00:15:45.006 }, 00:15:45.006 "claimed": false, 00:15:45.007 "zoned": false, 00:15:45.007 "supported_io_types": { 00:15:45.007 "read": true, 00:15:45.007 "write": true, 00:15:45.007 "unmap": true, 00:15:45.007 "flush": true, 00:15:45.007 "reset": true, 00:15:45.007 "nvme_admin": false, 00:15:45.007 "nvme_io": false, 00:15:45.007 "nvme_io_md": false, 00:15:45.007 "write_zeroes": true, 00:15:45.007 "zcopy": false, 00:15:45.007 "get_zone_info": false, 00:15:45.007 "zone_management": false, 00:15:45.007 "zone_append": false, 00:15:45.007 "compare": false, 00:15:45.007 "compare_and_write": false, 00:15:45.007 
"abort": false, 00:15:45.007 "seek_hole": false, 00:15:45.007 "seek_data": false, 00:15:45.007 "copy": false, 00:15:45.007 "nvme_iov_md": false 00:15:45.007 }, 00:15:45.007 "memory_domains": [ 00:15:45.007 { 00:15:45.007 "dma_device_id": "system", 00:15:45.007 "dma_device_type": 1 00:15:45.007 }, 00:15:45.007 { 00:15:45.007 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:45.007 "dma_device_type": 2 00:15:45.007 }, 00:15:45.007 { 00:15:45.007 "dma_device_id": "system", 00:15:45.007 "dma_device_type": 1 00:15:45.007 }, 00:15:45.007 { 00:15:45.007 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:45.007 "dma_device_type": 2 00:15:45.007 }, 00:15:45.007 { 00:15:45.007 "dma_device_id": "system", 00:15:45.007 "dma_device_type": 1 00:15:45.007 }, 00:15:45.007 { 00:15:45.007 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:45.007 "dma_device_type": 2 00:15:45.007 } 00:15:45.007 ], 00:15:45.007 "driver_specific": { 00:15:45.007 "raid": { 00:15:45.007 "uuid": "00a01af7-f1ba-4caf-8c4e-12b46c9366ca", 00:15:45.007 "strip_size_kb": 64, 00:15:45.007 "state": "online", 00:15:45.007 "raid_level": "concat", 00:15:45.007 "superblock": true, 00:15:45.007 "num_base_bdevs": 3, 00:15:45.007 "num_base_bdevs_discovered": 3, 00:15:45.007 "num_base_bdevs_operational": 3, 00:15:45.007 "base_bdevs_list": [ 00:15:45.007 { 00:15:45.007 "name": "pt1", 00:15:45.007 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:45.007 "is_configured": true, 00:15:45.007 "data_offset": 2048, 00:15:45.007 "data_size": 63488 00:15:45.007 }, 00:15:45.007 { 00:15:45.007 "name": "pt2", 00:15:45.007 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:45.007 "is_configured": true, 00:15:45.007 "data_offset": 2048, 00:15:45.007 "data_size": 63488 00:15:45.007 }, 00:15:45.007 { 00:15:45.007 "name": "pt3", 00:15:45.007 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:45.007 "is_configured": true, 00:15:45.007 "data_offset": 2048, 00:15:45.007 "data_size": 63488 00:15:45.007 } 00:15:45.007 ] 00:15:45.007 } 
00:15:45.007 } 00:15:45.007 }' 00:15:45.007 08:28:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:45.266 08:28:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:45.266 pt2 00:15:45.266 pt3' 00:15:45.266 08:28:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:45.266 08:28:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:45.266 08:28:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:45.266 08:28:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:45.266 "name": "pt1", 00:15:45.266 "aliases": [ 00:15:45.266 "00000000-0000-0000-0000-000000000001" 00:15:45.266 ], 00:15:45.266 "product_name": "passthru", 00:15:45.266 "block_size": 512, 00:15:45.266 "num_blocks": 65536, 00:15:45.266 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:45.266 "assigned_rate_limits": { 00:15:45.266 "rw_ios_per_sec": 0, 00:15:45.266 "rw_mbytes_per_sec": 0, 00:15:45.266 "r_mbytes_per_sec": 0, 00:15:45.266 "w_mbytes_per_sec": 0 00:15:45.266 }, 00:15:45.266 "claimed": true, 00:15:45.266 "claim_type": "exclusive_write", 00:15:45.266 "zoned": false, 00:15:45.266 "supported_io_types": { 00:15:45.266 "read": true, 00:15:45.266 "write": true, 00:15:45.266 "unmap": true, 00:15:45.266 "flush": true, 00:15:45.266 "reset": true, 00:15:45.266 "nvme_admin": false, 00:15:45.266 "nvme_io": false, 00:15:45.266 "nvme_io_md": false, 00:15:45.266 "write_zeroes": true, 00:15:45.266 "zcopy": true, 00:15:45.266 "get_zone_info": false, 00:15:45.266 "zone_management": false, 00:15:45.266 "zone_append": false, 00:15:45.266 "compare": false, 00:15:45.266 "compare_and_write": false, 00:15:45.266 "abort": true, 00:15:45.266 
"seek_hole": false, 00:15:45.266 "seek_data": false, 00:15:45.266 "copy": true, 00:15:45.266 "nvme_iov_md": false 00:15:45.266 }, 00:15:45.266 "memory_domains": [ 00:15:45.266 { 00:15:45.266 "dma_device_id": "system", 00:15:45.266 "dma_device_type": 1 00:15:45.266 }, 00:15:45.266 { 00:15:45.266 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:45.266 "dma_device_type": 2 00:15:45.266 } 00:15:45.266 ], 00:15:45.266 "driver_specific": { 00:15:45.266 "passthru": { 00:15:45.266 "name": "pt1", 00:15:45.266 "base_bdev_name": "malloc1" 00:15:45.267 } 00:15:45.267 } 00:15:45.267 }' 00:15:45.267 08:28:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:45.267 08:28:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:45.525 08:28:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:45.525 08:28:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:45.525 08:28:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:45.525 08:28:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:45.525 08:28:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:45.525 08:28:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:45.525 08:28:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:45.525 08:28:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:45.525 08:28:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:45.783 08:28:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:45.783 08:28:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:45.783 08:28:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:45.783 08:28:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:45.783 08:28:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:45.783 "name": "pt2", 00:15:45.783 "aliases": [ 00:15:45.783 "00000000-0000-0000-0000-000000000002" 00:15:45.783 ], 00:15:45.783 "product_name": "passthru", 00:15:45.783 "block_size": 512, 00:15:45.783 "num_blocks": 65536, 00:15:45.783 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:45.783 "assigned_rate_limits": { 00:15:45.783 "rw_ios_per_sec": 0, 00:15:45.783 "rw_mbytes_per_sec": 0, 00:15:45.783 "r_mbytes_per_sec": 0, 00:15:45.783 "w_mbytes_per_sec": 0 00:15:45.783 }, 00:15:45.784 "claimed": true, 00:15:45.784 "claim_type": "exclusive_write", 00:15:45.784 "zoned": false, 00:15:45.784 "supported_io_types": { 00:15:45.784 "read": true, 00:15:45.784 "write": true, 00:15:45.784 "unmap": true, 00:15:45.784 "flush": true, 00:15:45.784 "reset": true, 00:15:45.784 "nvme_admin": false, 00:15:45.784 "nvme_io": false, 00:15:45.784 "nvme_io_md": false, 00:15:45.784 "write_zeroes": true, 00:15:45.784 "zcopy": true, 00:15:45.784 "get_zone_info": false, 00:15:45.784 "zone_management": false, 00:15:45.784 "zone_append": false, 00:15:45.784 "compare": false, 00:15:45.784 "compare_and_write": false, 00:15:45.784 "abort": true, 00:15:45.784 "seek_hole": false, 00:15:45.784 "seek_data": false, 00:15:45.784 "copy": true, 00:15:45.784 "nvme_iov_md": false 00:15:45.784 }, 00:15:45.784 "memory_domains": [ 00:15:45.784 { 00:15:45.784 "dma_device_id": "system", 00:15:45.784 "dma_device_type": 1 00:15:45.784 }, 00:15:45.784 { 00:15:45.784 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:45.784 "dma_device_type": 2 00:15:45.784 } 00:15:45.784 ], 00:15:45.784 "driver_specific": { 00:15:45.784 "passthru": { 00:15:45.784 "name": "pt2", 00:15:45.784 "base_bdev_name": "malloc2" 
00:15:45.784 } 00:15:45.784 } 00:15:45.784 }' 00:15:45.784 08:28:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:45.784 08:28:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:45.784 08:28:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:45.784 08:28:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:46.042 08:28:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:46.042 08:28:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:46.042 08:28:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:46.042 08:28:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:46.042 08:28:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:46.042 08:28:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:46.042 08:28:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:46.042 08:28:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:46.042 08:28:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:46.042 08:28:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:46.042 08:28:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:46.300 08:28:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:46.300 "name": "pt3", 00:15:46.300 "aliases": [ 00:15:46.300 "00000000-0000-0000-0000-000000000003" 00:15:46.300 ], 00:15:46.300 "product_name": "passthru", 00:15:46.300 "block_size": 512, 00:15:46.300 "num_blocks": 65536, 00:15:46.300 "uuid": 
"00000000-0000-0000-0000-000000000003", 00:15:46.300 "assigned_rate_limits": { 00:15:46.300 "rw_ios_per_sec": 0, 00:15:46.301 "rw_mbytes_per_sec": 0, 00:15:46.301 "r_mbytes_per_sec": 0, 00:15:46.301 "w_mbytes_per_sec": 0 00:15:46.301 }, 00:15:46.301 "claimed": true, 00:15:46.301 "claim_type": "exclusive_write", 00:15:46.301 "zoned": false, 00:15:46.301 "supported_io_types": { 00:15:46.301 "read": true, 00:15:46.301 "write": true, 00:15:46.301 "unmap": true, 00:15:46.301 "flush": true, 00:15:46.301 "reset": true, 00:15:46.301 "nvme_admin": false, 00:15:46.301 "nvme_io": false, 00:15:46.301 "nvme_io_md": false, 00:15:46.301 "write_zeroes": true, 00:15:46.301 "zcopy": true, 00:15:46.301 "get_zone_info": false, 00:15:46.301 "zone_management": false, 00:15:46.301 "zone_append": false, 00:15:46.301 "compare": false, 00:15:46.301 "compare_and_write": false, 00:15:46.301 "abort": true, 00:15:46.301 "seek_hole": false, 00:15:46.301 "seek_data": false, 00:15:46.301 "copy": true, 00:15:46.301 "nvme_iov_md": false 00:15:46.301 }, 00:15:46.301 "memory_domains": [ 00:15:46.301 { 00:15:46.301 "dma_device_id": "system", 00:15:46.301 "dma_device_type": 1 00:15:46.301 }, 00:15:46.301 { 00:15:46.301 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:46.301 "dma_device_type": 2 00:15:46.301 } 00:15:46.301 ], 00:15:46.301 "driver_specific": { 00:15:46.301 "passthru": { 00:15:46.301 "name": "pt3", 00:15:46.301 "base_bdev_name": "malloc3" 00:15:46.301 } 00:15:46.301 } 00:15:46.301 }' 00:15:46.301 08:28:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:46.301 08:28:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:46.301 08:28:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:46.301 08:28:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:46.559 08:28:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:46.559 08:28:58 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:46.559 08:28:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:46.559 08:28:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:46.559 08:28:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:46.559 08:28:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:46.559 08:28:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:46.559 08:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:46.559 08:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:46.559 08:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:15:46.818 [2024-07-23 08:28:59.165875] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:46.818 08:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=00a01af7-f1ba-4caf-8c4e-12b46c9366ca 00:15:46.818 08:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 00a01af7-f1ba-4caf-8c4e-12b46c9366ca ']' 00:15:46.818 08:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:47.077 [2024-07-23 08:28:59.338036] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:47.077 [2024-07-23 08:28:59.338063] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:47.077 [2024-07-23 08:28:59.338132] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:47.077 [2024-07-23 08:28:59.338194] bdev_raid.c: 
463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:47.077 [2024-07-23 08:28:59.338204] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036680 name raid_bdev1, state offline 00:15:47.077 08:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:47.077 08:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:15:47.077 08:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:15:47.077 08:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:15:47.077 08:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:47.077 08:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:15:47.335 08:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:47.335 08:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:47.594 08:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:47.594 08:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:15:47.594 08:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:15:47.594 08:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | 
any' 00:15:47.853 08:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:15:47.853 08:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:47.853 08:29:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:15:47.853 08:29:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:47.853 08:29:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:47.853 08:29:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:47.853 08:29:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:47.853 08:29:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:47.853 08:29:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:47.853 08:29:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:47.853 08:29:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:47.853 08:29:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:15:47.853 08:29:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:48.112 [2024-07-23 08:29:00.372776] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:15:48.112 [2024-07-23 08:29:00.374437] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:15:48.112 [2024-07-23 08:29:00.374488] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:15:48.112 [2024-07-23 08:29:00.374535] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:15:48.112 [2024-07-23 08:29:00.374576] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:15:48.112 [2024-07-23 08:29:00.374595] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:15:48.112 [2024-07-23 08:29:00.374618] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:48.112 [2024-07-23 08:29:00.374631] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036c80 name raid_bdev1, state configuring 00:15:48.112 request: 00:15:48.112 { 00:15:48.112 "name": "raid_bdev1", 00:15:48.112 "raid_level": "concat", 00:15:48.112 "base_bdevs": [ 00:15:48.112 "malloc1", 00:15:48.112 "malloc2", 00:15:48.112 "malloc3" 00:15:48.112 ], 00:15:48.112 "strip_size_kb": 64, 00:15:48.112 "superblock": false, 00:15:48.112 "method": "bdev_raid_create", 00:15:48.112 "req_id": 1 00:15:48.112 } 00:15:48.112 Got JSON-RPC error response 00:15:48.112 response: 00:15:48.112 { 00:15:48.112 "code": -17, 00:15:48.112 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:15:48.112 } 00:15:48.112 08:29:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:15:48.112 
08:29:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:15:48.112 08:29:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:15:48.112 08:29:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:15:48.112 08:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:48.112 08:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:15:48.112 08:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:15:48.113 08:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:15:48.113 08:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:48.372 [2024-07-23 08:29:00.717639] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:48.372 [2024-07-23 08:29:00.717690] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:48.372 [2024-07-23 08:29:00.717708] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000037280 00:15:48.372 [2024-07-23 08:29:00.717717] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:48.372 [2024-07-23 08:29:00.719732] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:48.372 [2024-07-23 08:29:00.719762] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:48.372 [2024-07-23 08:29:00.719838] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:15:48.372 [2024-07-23 08:29:00.719899] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is 
claimed 00:15:48.372 pt1 00:15:48.372 08:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:15:48.372 08:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:48.372 08:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:48.372 08:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:48.372 08:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:48.372 08:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:48.372 08:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:48.372 08:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:48.372 08:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:48.372 08:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:48.372 08:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:48.372 08:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:48.631 08:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:48.631 "name": "raid_bdev1", 00:15:48.631 "uuid": "00a01af7-f1ba-4caf-8c4e-12b46c9366ca", 00:15:48.631 "strip_size_kb": 64, 00:15:48.631 "state": "configuring", 00:15:48.631 "raid_level": "concat", 00:15:48.631 "superblock": true, 00:15:48.631 "num_base_bdevs": 3, 00:15:48.631 "num_base_bdevs_discovered": 1, 00:15:48.631 "num_base_bdevs_operational": 3, 00:15:48.631 "base_bdevs_list": [ 00:15:48.631 { 00:15:48.631 "name": 
"pt1", 00:15:48.631 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:48.631 "is_configured": true, 00:15:48.631 "data_offset": 2048, 00:15:48.631 "data_size": 63488 00:15:48.631 }, 00:15:48.631 { 00:15:48.631 "name": null, 00:15:48.631 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:48.631 "is_configured": false, 00:15:48.631 "data_offset": 2048, 00:15:48.631 "data_size": 63488 00:15:48.631 }, 00:15:48.631 { 00:15:48.631 "name": null, 00:15:48.631 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:48.631 "is_configured": false, 00:15:48.631 "data_offset": 2048, 00:15:48.631 "data_size": 63488 00:15:48.631 } 00:15:48.631 ] 00:15:48.631 }' 00:15:48.631 08:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:48.631 08:29:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:48.889 08:29:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:15:48.889 08:29:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:49.147 [2024-07-23 08:29:01.547821] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:49.147 [2024-07-23 08:29:01.547881] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:49.147 [2024-07-23 08:29:01.547903] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000037b80 00:15:49.147 [2024-07-23 08:29:01.547912] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:49.147 [2024-07-23 08:29:01.548371] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:49.147 [2024-07-23 08:29:01.548390] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:49.147 [2024-07-23 08:29:01.548469] 
bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:49.147 [2024-07-23 08:29:01.548491] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:49.147 pt2 00:15:49.147 08:29:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:49.406 [2024-07-23 08:29:01.720326] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:15:49.406 08:29:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:15:49.406 08:29:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:49.406 08:29:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:49.406 08:29:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:49.406 08:29:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:49.406 08:29:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:49.406 08:29:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:49.406 08:29:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:49.406 08:29:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:49.406 08:29:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:49.406 08:29:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:49.406 08:29:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:49.406 08:29:01 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:49.406 "name": "raid_bdev1", 00:15:49.406 "uuid": "00a01af7-f1ba-4caf-8c4e-12b46c9366ca", 00:15:49.406 "strip_size_kb": 64, 00:15:49.406 "state": "configuring", 00:15:49.406 "raid_level": "concat", 00:15:49.406 "superblock": true, 00:15:49.406 "num_base_bdevs": 3, 00:15:49.406 "num_base_bdevs_discovered": 1, 00:15:49.406 "num_base_bdevs_operational": 3, 00:15:49.406 "base_bdevs_list": [ 00:15:49.406 { 00:15:49.406 "name": "pt1", 00:15:49.406 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:49.406 "is_configured": true, 00:15:49.406 "data_offset": 2048, 00:15:49.406 "data_size": 63488 00:15:49.406 }, 00:15:49.406 { 00:15:49.406 "name": null, 00:15:49.406 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:49.406 "is_configured": false, 00:15:49.406 "data_offset": 2048, 00:15:49.406 "data_size": 63488 00:15:49.406 }, 00:15:49.406 { 00:15:49.406 "name": null, 00:15:49.406 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:49.406 "is_configured": false, 00:15:49.406 "data_offset": 2048, 00:15:49.406 "data_size": 63488 00:15:49.406 } 00:15:49.406 ] 00:15:49.406 }' 00:15:49.406 08:29:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:49.406 08:29:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:49.973 08:29:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:15:49.973 08:29:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:49.973 08:29:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:50.232 [2024-07-23 08:29:02.530417] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:50.232 [2024-07-23 08:29:02.530473] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:50.232 [2024-07-23 08:29:02.530489] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000037e80 00:15:50.232 [2024-07-23 08:29:02.530500] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:50.232 [2024-07-23 08:29:02.530935] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:50.232 [2024-07-23 08:29:02.530956] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:50.232 [2024-07-23 08:29:02.531025] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:50.232 [2024-07-23 08:29:02.531046] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:50.232 pt2 00:15:50.232 08:29:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:50.232 08:29:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:50.232 08:29:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:50.232 [2024-07-23 08:29:02.698858] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:50.232 [2024-07-23 08:29:02.698907] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:50.232 [2024-07-23 08:29:02.698923] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000038180 00:15:50.232 [2024-07-23 08:29:02.698933] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:50.232 [2024-07-23 08:29:02.699370] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:50.232 [2024-07-23 08:29:02.699390] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:50.232 
[2024-07-23 08:29:02.699463] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:15:50.232 [2024-07-23 08:29:02.699488] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:50.232 [2024-07-23 08:29:02.699639] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000037880 00:15:50.232 [2024-07-23 08:29:02.699653] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:50.232 [2024-07-23 08:29:02.699861] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bf90 00:15:50.232 [2024-07-23 08:29:02.700027] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000037880 00:15:50.232 [2024-07-23 08:29:02.700036] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000037880 00:15:50.232 [2024-07-23 08:29:02.700168] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:50.232 pt3 00:15:50.232 08:29:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:50.232 08:29:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:50.232 08:29:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:50.232 08:29:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:50.232 08:29:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:50.232 08:29:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:50.232 08:29:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:50.232 08:29:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:50.232 08:29:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # 
local raid_bdev_info 00:15:50.232 08:29:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:50.232 08:29:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:50.232 08:29:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:50.232 08:29:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:50.232 08:29:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:50.491 08:29:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:50.491 "name": "raid_bdev1", 00:15:50.491 "uuid": "00a01af7-f1ba-4caf-8c4e-12b46c9366ca", 00:15:50.491 "strip_size_kb": 64, 00:15:50.491 "state": "online", 00:15:50.491 "raid_level": "concat", 00:15:50.491 "superblock": true, 00:15:50.491 "num_base_bdevs": 3, 00:15:50.491 "num_base_bdevs_discovered": 3, 00:15:50.491 "num_base_bdevs_operational": 3, 00:15:50.491 "base_bdevs_list": [ 00:15:50.491 { 00:15:50.491 "name": "pt1", 00:15:50.491 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:50.491 "is_configured": true, 00:15:50.491 "data_offset": 2048, 00:15:50.491 "data_size": 63488 00:15:50.491 }, 00:15:50.491 { 00:15:50.491 "name": "pt2", 00:15:50.491 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:50.491 "is_configured": true, 00:15:50.491 "data_offset": 2048, 00:15:50.491 "data_size": 63488 00:15:50.491 }, 00:15:50.491 { 00:15:50.491 "name": "pt3", 00:15:50.491 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:50.491 "is_configured": true, 00:15:50.491 "data_offset": 2048, 00:15:50.491 "data_size": 63488 00:15:50.491 } 00:15:50.491 ] 00:15:50.491 }' 00:15:50.491 08:29:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:50.491 08:29:02 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@10 -- # set +x 00:15:51.057 08:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:15:51.057 08:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:51.057 08:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:51.057 08:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:51.057 08:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:51.057 08:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:51.057 08:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:51.057 08:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:51.057 [2024-07-23 08:29:03.557489] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:51.057 08:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:51.057 "name": "raid_bdev1", 00:15:51.057 "aliases": [ 00:15:51.057 "00a01af7-f1ba-4caf-8c4e-12b46c9366ca" 00:15:51.057 ], 00:15:51.057 "product_name": "Raid Volume", 00:15:51.058 "block_size": 512, 00:15:51.058 "num_blocks": 190464, 00:15:51.058 "uuid": "00a01af7-f1ba-4caf-8c4e-12b46c9366ca", 00:15:51.058 "assigned_rate_limits": { 00:15:51.058 "rw_ios_per_sec": 0, 00:15:51.058 "rw_mbytes_per_sec": 0, 00:15:51.058 "r_mbytes_per_sec": 0, 00:15:51.058 "w_mbytes_per_sec": 0 00:15:51.058 }, 00:15:51.058 "claimed": false, 00:15:51.058 "zoned": false, 00:15:51.058 "supported_io_types": { 00:15:51.058 "read": true, 00:15:51.058 "write": true, 00:15:51.058 "unmap": true, 00:15:51.058 "flush": true, 00:15:51.058 "reset": true, 00:15:51.058 "nvme_admin": false, 00:15:51.058 "nvme_io": false, 
00:15:51.058 "nvme_io_md": false, 00:15:51.058 "write_zeroes": true, 00:15:51.058 "zcopy": false, 00:15:51.058 "get_zone_info": false, 00:15:51.058 "zone_management": false, 00:15:51.058 "zone_append": false, 00:15:51.058 "compare": false, 00:15:51.058 "compare_and_write": false, 00:15:51.058 "abort": false, 00:15:51.058 "seek_hole": false, 00:15:51.058 "seek_data": false, 00:15:51.058 "copy": false, 00:15:51.058 "nvme_iov_md": false 00:15:51.058 }, 00:15:51.058 "memory_domains": [ 00:15:51.058 { 00:15:51.058 "dma_device_id": "system", 00:15:51.058 "dma_device_type": 1 00:15:51.058 }, 00:15:51.058 { 00:15:51.058 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:51.058 "dma_device_type": 2 00:15:51.058 }, 00:15:51.058 { 00:15:51.058 "dma_device_id": "system", 00:15:51.058 "dma_device_type": 1 00:15:51.058 }, 00:15:51.058 { 00:15:51.058 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:51.058 "dma_device_type": 2 00:15:51.058 }, 00:15:51.058 { 00:15:51.058 "dma_device_id": "system", 00:15:51.058 "dma_device_type": 1 00:15:51.058 }, 00:15:51.058 { 00:15:51.058 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:51.058 "dma_device_type": 2 00:15:51.058 } 00:15:51.058 ], 00:15:51.058 "driver_specific": { 00:15:51.058 "raid": { 00:15:51.058 "uuid": "00a01af7-f1ba-4caf-8c4e-12b46c9366ca", 00:15:51.058 "strip_size_kb": 64, 00:15:51.058 "state": "online", 00:15:51.058 "raid_level": "concat", 00:15:51.058 "superblock": true, 00:15:51.058 "num_base_bdevs": 3, 00:15:51.058 "num_base_bdevs_discovered": 3, 00:15:51.058 "num_base_bdevs_operational": 3, 00:15:51.058 "base_bdevs_list": [ 00:15:51.058 { 00:15:51.058 "name": "pt1", 00:15:51.058 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:51.058 "is_configured": true, 00:15:51.058 "data_offset": 2048, 00:15:51.058 "data_size": 63488 00:15:51.058 }, 00:15:51.058 { 00:15:51.058 "name": "pt2", 00:15:51.058 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:51.058 "is_configured": true, 00:15:51.058 "data_offset": 2048, 00:15:51.058 
"data_size": 63488 00:15:51.058 }, 00:15:51.058 { 00:15:51.058 "name": "pt3", 00:15:51.058 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:51.058 "is_configured": true, 00:15:51.058 "data_offset": 2048, 00:15:51.058 "data_size": 63488 00:15:51.058 } 00:15:51.058 ] 00:15:51.058 } 00:15:51.058 } 00:15:51.058 }' 00:15:51.058 08:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:51.316 08:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:51.316 pt2 00:15:51.316 pt3' 00:15:51.316 08:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:51.316 08:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:51.316 08:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:51.316 08:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:51.316 "name": "pt1", 00:15:51.316 "aliases": [ 00:15:51.316 "00000000-0000-0000-0000-000000000001" 00:15:51.316 ], 00:15:51.316 "product_name": "passthru", 00:15:51.316 "block_size": 512, 00:15:51.316 "num_blocks": 65536, 00:15:51.316 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:51.316 "assigned_rate_limits": { 00:15:51.316 "rw_ios_per_sec": 0, 00:15:51.316 "rw_mbytes_per_sec": 0, 00:15:51.317 "r_mbytes_per_sec": 0, 00:15:51.317 "w_mbytes_per_sec": 0 00:15:51.317 }, 00:15:51.317 "claimed": true, 00:15:51.317 "claim_type": "exclusive_write", 00:15:51.317 "zoned": false, 00:15:51.317 "supported_io_types": { 00:15:51.317 "read": true, 00:15:51.317 "write": true, 00:15:51.317 "unmap": true, 00:15:51.317 "flush": true, 00:15:51.317 "reset": true, 00:15:51.317 "nvme_admin": false, 00:15:51.317 "nvme_io": false, 00:15:51.317 "nvme_io_md": false, 
00:15:51.317 "write_zeroes": true, 00:15:51.317 "zcopy": true, 00:15:51.317 "get_zone_info": false, 00:15:51.317 "zone_management": false, 00:15:51.317 "zone_append": false, 00:15:51.317 "compare": false, 00:15:51.317 "compare_and_write": false, 00:15:51.317 "abort": true, 00:15:51.317 "seek_hole": false, 00:15:51.317 "seek_data": false, 00:15:51.317 "copy": true, 00:15:51.317 "nvme_iov_md": false 00:15:51.317 }, 00:15:51.317 "memory_domains": [ 00:15:51.317 { 00:15:51.317 "dma_device_id": "system", 00:15:51.317 "dma_device_type": 1 00:15:51.317 }, 00:15:51.317 { 00:15:51.317 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:51.317 "dma_device_type": 2 00:15:51.317 } 00:15:51.317 ], 00:15:51.317 "driver_specific": { 00:15:51.317 "passthru": { 00:15:51.317 "name": "pt1", 00:15:51.317 "base_bdev_name": "malloc1" 00:15:51.317 } 00:15:51.317 } 00:15:51.317 }' 00:15:51.317 08:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:51.317 08:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:51.575 08:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:51.575 08:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:51.575 08:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:51.575 08:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:51.575 08:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:51.575 08:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:51.575 08:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:51.575 08:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:51.575 08:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:51.575 08:29:04 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:51.834 08:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:51.834 08:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:51.834 08:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:51.834 08:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:51.834 "name": "pt2", 00:15:51.834 "aliases": [ 00:15:51.834 "00000000-0000-0000-0000-000000000002" 00:15:51.834 ], 00:15:51.834 "product_name": "passthru", 00:15:51.834 "block_size": 512, 00:15:51.834 "num_blocks": 65536, 00:15:51.834 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:51.834 "assigned_rate_limits": { 00:15:51.834 "rw_ios_per_sec": 0, 00:15:51.834 "rw_mbytes_per_sec": 0, 00:15:51.834 "r_mbytes_per_sec": 0, 00:15:51.834 "w_mbytes_per_sec": 0 00:15:51.834 }, 00:15:51.834 "claimed": true, 00:15:51.834 "claim_type": "exclusive_write", 00:15:51.834 "zoned": false, 00:15:51.834 "supported_io_types": { 00:15:51.834 "read": true, 00:15:51.834 "write": true, 00:15:51.834 "unmap": true, 00:15:51.834 "flush": true, 00:15:51.834 "reset": true, 00:15:51.834 "nvme_admin": false, 00:15:51.834 "nvme_io": false, 00:15:51.834 "nvme_io_md": false, 00:15:51.834 "write_zeroes": true, 00:15:51.834 "zcopy": true, 00:15:51.834 "get_zone_info": false, 00:15:51.834 "zone_management": false, 00:15:51.834 "zone_append": false, 00:15:51.834 "compare": false, 00:15:51.834 "compare_and_write": false, 00:15:51.834 "abort": true, 00:15:51.834 "seek_hole": false, 00:15:51.834 "seek_data": false, 00:15:51.834 "copy": true, 00:15:51.834 "nvme_iov_md": false 00:15:51.834 }, 00:15:51.834 "memory_domains": [ 00:15:51.834 { 00:15:51.834 "dma_device_id": "system", 00:15:51.834 "dma_device_type": 1 00:15:51.834 
}, 00:15:51.834 { 00:15:51.834 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:51.834 "dma_device_type": 2 00:15:51.834 } 00:15:51.834 ], 00:15:51.834 "driver_specific": { 00:15:51.834 "passthru": { 00:15:51.834 "name": "pt2", 00:15:51.834 "base_bdev_name": "malloc2" 00:15:51.834 } 00:15:51.834 } 00:15:51.834 }' 00:15:51.834 08:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:51.834 08:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:51.834 08:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:51.834 08:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:52.093 08:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:52.093 08:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:52.093 08:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:52.093 08:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:52.093 08:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:52.093 08:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:52.093 08:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:52.093 08:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:52.093 08:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:52.093 08:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:52.093 08:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:52.352 08:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 
00:15:52.352 "name": "pt3", 00:15:52.352 "aliases": [ 00:15:52.352 "00000000-0000-0000-0000-000000000003" 00:15:52.352 ], 00:15:52.352 "product_name": "passthru", 00:15:52.352 "block_size": 512, 00:15:52.352 "num_blocks": 65536, 00:15:52.352 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:52.352 "assigned_rate_limits": { 00:15:52.352 "rw_ios_per_sec": 0, 00:15:52.352 "rw_mbytes_per_sec": 0, 00:15:52.352 "r_mbytes_per_sec": 0, 00:15:52.352 "w_mbytes_per_sec": 0 00:15:52.352 }, 00:15:52.352 "claimed": true, 00:15:52.352 "claim_type": "exclusive_write", 00:15:52.352 "zoned": false, 00:15:52.352 "supported_io_types": { 00:15:52.352 "read": true, 00:15:52.352 "write": true, 00:15:52.352 "unmap": true, 00:15:52.352 "flush": true, 00:15:52.352 "reset": true, 00:15:52.352 "nvme_admin": false, 00:15:52.352 "nvme_io": false, 00:15:52.352 "nvme_io_md": false, 00:15:52.352 "write_zeroes": true, 00:15:52.352 "zcopy": true, 00:15:52.352 "get_zone_info": false, 00:15:52.352 "zone_management": false, 00:15:52.352 "zone_append": false, 00:15:52.352 "compare": false, 00:15:52.352 "compare_and_write": false, 00:15:52.352 "abort": true, 00:15:52.352 "seek_hole": false, 00:15:52.352 "seek_data": false, 00:15:52.352 "copy": true, 00:15:52.352 "nvme_iov_md": false 00:15:52.352 }, 00:15:52.352 "memory_domains": [ 00:15:52.352 { 00:15:52.352 "dma_device_id": "system", 00:15:52.352 "dma_device_type": 1 00:15:52.352 }, 00:15:52.352 { 00:15:52.352 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:52.352 "dma_device_type": 2 00:15:52.352 } 00:15:52.352 ], 00:15:52.352 "driver_specific": { 00:15:52.352 "passthru": { 00:15:52.352 "name": "pt3", 00:15:52.352 "base_bdev_name": "malloc3" 00:15:52.352 } 00:15:52.352 } 00:15:52.352 }' 00:15:52.352 08:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:52.352 08:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:52.352 08:29:04 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:52.352 08:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:52.352 08:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:52.611 08:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:52.611 08:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:52.611 08:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:52.611 08:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:52.611 08:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:52.611 08:29:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:52.611 08:29:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:52.611 08:29:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:52.611 08:29:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:15:52.870 [2024-07-23 08:29:05.221870] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:52.870 08:29:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 00a01af7-f1ba-4caf-8c4e-12b46c9366ca '!=' 00a01af7-f1ba-4caf-8c4e-12b46c9366ca ']' 00:15:52.870 08:29:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:15:52.870 08:29:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:52.870 08:29:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:52.870 08:29:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1448838 00:15:52.870 08:29:05 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@948 -- # '[' -z 1448838 ']' 00:15:52.870 08:29:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1448838 00:15:52.870 08:29:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:15:52.870 08:29:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:52.870 08:29:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1448838 00:15:52.870 08:29:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:52.870 08:29:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:52.870 08:29:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1448838' 00:15:52.870 killing process with pid 1448838 00:15:52.870 08:29:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1448838 00:15:52.870 [2024-07-23 08:29:05.282367] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:52.870 [2024-07-23 08:29:05.282459] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:52.870 [2024-07-23 08:29:05.282518] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:52.870 08:29:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1448838 00:15:52.870 [2024-07-23 08:29:05.282531] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000037880 name raid_bdev1, state offline 00:15:53.129 [2024-07-23 08:29:05.521496] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:54.506 08:29:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:15:54.506 00:15:54.506 real 0m12.258s 00:15:54.506 user 0m20.988s 00:15:54.506 sys 0m1.837s 00:15:54.506 08:29:06 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:15:54.506 08:29:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:54.506 ************************************ 00:15:54.506 END TEST raid_superblock_test 00:15:54.506 ************************************ 00:15:54.506 08:29:06 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:54.506 08:29:06 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 3 read 00:15:54.506 08:29:06 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:54.506 08:29:06 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:54.506 08:29:06 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:54.506 ************************************ 00:15:54.506 START TEST raid_read_error_test 00:15:54.506 ************************************ 00:15:54.506 08:29:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 read 00:15:54.507 08:29:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:15:54.507 08:29:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:15:54.507 08:29:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:15:54.507 08:29:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:54.507 08:29:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:54.507 08:29:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:54.507 08:29:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:54.507 08:29:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:54.507 08:29:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:54.507 08:29:06 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:54.507 08:29:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:54.507 08:29:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:15:54.507 08:29:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:54.507 08:29:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:54.507 08:29:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:54.507 08:29:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:54.507 08:29:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:15:54.507 08:29:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:15:54.507 08:29:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:54.507 08:29:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:54.507 08:29:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:15:54.507 08:29:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:15:54.507 08:29:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:15:54.507 08:29:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:15:54.507 08:29:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:54.507 08:29:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.8bQa9WoMji 00:15:54.507 08:29:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1451398 00:15:54.507 08:29:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1451398 /var/tmp/spdk-raid.sock 00:15:54.507 08:29:06 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:54.507 08:29:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1451398 ']' 00:15:54.507 08:29:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:54.507 08:29:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:54.507 08:29:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:54.507 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:54.507 08:29:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:54.507 08:29:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:54.507 [2024-07-23 08:29:06.935744] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:15:54.507 [2024-07-23 08:29:06.935839] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1451398 ] 00:15:54.766 [2024-07-23 08:29:07.062148] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:54.766 [2024-07-23 08:29:07.268969] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:55.046 [2024-07-23 08:29:07.529150] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:55.046 [2024-07-23 08:29:07.529181] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:55.303 08:29:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:55.303 08:29:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:15:55.303 08:29:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:55.303 08:29:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:55.561 BaseBdev1_malloc 00:15:55.561 08:29:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:55.561 true 00:15:55.820 08:29:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:55.820 [2024-07-23 08:29:08.231310] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:55.820 [2024-07-23 08:29:08.231363] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:15:55.820 [2024-07-23 08:29:08.231380] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034b80 00:15:55.820 [2024-07-23 08:29:08.231390] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:55.820 [2024-07-23 08:29:08.233278] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:55.820 [2024-07-23 08:29:08.233309] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:55.820 BaseBdev1 00:15:55.820 08:29:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:55.820 08:29:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:56.079 BaseBdev2_malloc 00:15:56.079 08:29:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:56.337 true 00:15:56.337 08:29:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:56.337 [2024-07-23 08:29:08.777123] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:56.337 [2024-07-23 08:29:08.777173] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:56.338 [2024-07-23 08:29:08.777190] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035a80 00:15:56.338 [2024-07-23 08:29:08.777203] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:56.338 [2024-07-23 08:29:08.779172] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:56.338 [2024-07-23 
08:29:08.779202] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:56.338 BaseBdev2 00:15:56.338 08:29:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:56.338 08:29:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:56.596 BaseBdev3_malloc 00:15:56.596 08:29:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:56.854 true 00:15:56.854 08:29:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:56.854 [2024-07-23 08:29:09.332589] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:56.854 [2024-07-23 08:29:09.332647] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:56.854 [2024-07-23 08:29:09.332667] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036980 00:15:56.854 [2024-07-23 08:29:09.332678] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:56.854 [2024-07-23 08:29:09.334570] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:56.854 [2024-07-23 08:29:09.334600] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:56.854 BaseBdev3 00:15:56.854 08:29:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:15:57.114 [2024-07-23 
08:29:09.501093] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:57.114 [2024-07-23 08:29:09.502705] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:57.114 [2024-07-23 08:29:09.502775] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:57.114 [2024-07-23 08:29:09.502990] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000036f80 00:15:57.114 [2024-07-23 08:29:09.503002] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:57.114 [2024-07-23 08:29:09.503256] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bec0 00:15:57.114 [2024-07-23 08:29:09.503457] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000036f80 00:15:57.114 [2024-07-23 08:29:09.503471] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000036f80 00:15:57.114 [2024-07-23 08:29:09.503641] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:57.114 08:29:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:57.114 08:29:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:57.114 08:29:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:57.114 08:29:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:57.114 08:29:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:57.114 08:29:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:57.114 08:29:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:57.114 08:29:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 
-- # local num_base_bdevs 00:15:57.114 08:29:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:57.114 08:29:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:57.114 08:29:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:57.114 08:29:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:57.373 08:29:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:57.373 "name": "raid_bdev1", 00:15:57.373 "uuid": "e0ad2509-bfbb-429d-8432-069753bf6e0b", 00:15:57.373 "strip_size_kb": 64, 00:15:57.373 "state": "online", 00:15:57.373 "raid_level": "concat", 00:15:57.373 "superblock": true, 00:15:57.373 "num_base_bdevs": 3, 00:15:57.373 "num_base_bdevs_discovered": 3, 00:15:57.373 "num_base_bdevs_operational": 3, 00:15:57.373 "base_bdevs_list": [ 00:15:57.373 { 00:15:57.373 "name": "BaseBdev1", 00:15:57.373 "uuid": "e890c2c3-e08b-5351-8053-9874c37d658d", 00:15:57.373 "is_configured": true, 00:15:57.373 "data_offset": 2048, 00:15:57.373 "data_size": 63488 00:15:57.373 }, 00:15:57.373 { 00:15:57.373 "name": "BaseBdev2", 00:15:57.373 "uuid": "8c28cbde-aae5-55fe-b574-ea15c91d8296", 00:15:57.373 "is_configured": true, 00:15:57.373 "data_offset": 2048, 00:15:57.373 "data_size": 63488 00:15:57.373 }, 00:15:57.373 { 00:15:57.373 "name": "BaseBdev3", 00:15:57.373 "uuid": "b7c30ff6-ee18-5f65-a1f7-31ea08c9dec1", 00:15:57.373 "is_configured": true, 00:15:57.373 "data_offset": 2048, 00:15:57.373 "data_size": 63488 00:15:57.373 } 00:15:57.373 ] 00:15:57.373 }' 00:15:57.373 08:29:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:57.373 08:29:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:57.940 08:29:10 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:15:57.940 08:29:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:57.940 [2024-07-23 08:29:10.260858] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c060 00:15:58.877 08:29:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:15:58.877 08:29:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:15:58.877 08:29:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:15:58.877 08:29:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:15:58.877 08:29:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:58.877 08:29:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:58.877 08:29:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:58.877 08:29:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:58.877 08:29:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:58.877 08:29:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:58.877 08:29:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:58.877 08:29:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:58.877 08:29:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:58.877 08:29:11 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:58.877 08:29:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:58.877 08:29:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:59.144 08:29:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:59.145 "name": "raid_bdev1", 00:15:59.145 "uuid": "e0ad2509-bfbb-429d-8432-069753bf6e0b", 00:15:59.145 "strip_size_kb": 64, 00:15:59.145 "state": "online", 00:15:59.145 "raid_level": "concat", 00:15:59.145 "superblock": true, 00:15:59.145 "num_base_bdevs": 3, 00:15:59.145 "num_base_bdevs_discovered": 3, 00:15:59.145 "num_base_bdevs_operational": 3, 00:15:59.145 "base_bdevs_list": [ 00:15:59.145 { 00:15:59.145 "name": "BaseBdev1", 00:15:59.145 "uuid": "e890c2c3-e08b-5351-8053-9874c37d658d", 00:15:59.145 "is_configured": true, 00:15:59.145 "data_offset": 2048, 00:15:59.145 "data_size": 63488 00:15:59.145 }, 00:15:59.145 { 00:15:59.145 "name": "BaseBdev2", 00:15:59.145 "uuid": "8c28cbde-aae5-55fe-b574-ea15c91d8296", 00:15:59.145 "is_configured": true, 00:15:59.145 "data_offset": 2048, 00:15:59.145 "data_size": 63488 00:15:59.145 }, 00:15:59.145 { 00:15:59.145 "name": "BaseBdev3", 00:15:59.145 "uuid": "b7c30ff6-ee18-5f65-a1f7-31ea08c9dec1", 00:15:59.145 "is_configured": true, 00:15:59.145 "data_offset": 2048, 00:15:59.145 "data_size": 63488 00:15:59.145 } 00:15:59.145 ] 00:15:59.145 }' 00:15:59.145 08:29:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:59.145 08:29:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:59.761 08:29:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 
00:15:59.761 [2024-07-23 08:29:12.186226] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:59.761 [2024-07-23 08:29:12.186267] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:59.761 [2024-07-23 08:29:12.188571] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:59.761 [2024-07-23 08:29:12.188607] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:59.761 [2024-07-23 08:29:12.188665] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:59.761 [2024-07-23 08:29:12.188675] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036f80 name raid_bdev1, state offline 00:15:59.761 0 00:15:59.761 08:29:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1451398 00:15:59.761 08:29:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1451398 ']' 00:15:59.761 08:29:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1451398 00:15:59.761 08:29:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:15:59.761 08:29:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:59.761 08:29:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1451398 00:15:59.761 08:29:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:59.761 08:29:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:59.761 08:29:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1451398' 00:15:59.761 killing process with pid 1451398 00:15:59.761 08:29:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1451398 00:15:59.761 [2024-07-23 
08:29:12.247685] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:59.761 08:29:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1451398 00:16:00.020 [2024-07-23 08:29:12.413435] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:01.406 08:29:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:16:01.406 08:29:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.8bQa9WoMji 00:16:01.406 08:29:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:16:01.406 08:29:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:16:01.406 08:29:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:16:01.406 08:29:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:01.406 08:29:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:01.406 08:29:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:16:01.406 00:16:01.406 real 0m6.885s 00:16:01.406 user 0m9.805s 00:16:01.406 sys 0m0.932s 00:16:01.406 08:29:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:01.406 08:29:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:01.406 ************************************ 00:16:01.406 END TEST raid_read_error_test 00:16:01.406 ************************************ 00:16:01.406 08:29:13 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:01.406 08:29:13 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 3 write 00:16:01.406 08:29:13 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:01.406 08:29:13 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:01.406 08:29:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 
00:16:01.406 ************************************ 00:16:01.406 START TEST raid_write_error_test 00:16:01.406 ************************************ 00:16:01.406 08:29:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 write 00:16:01.406 08:29:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:16:01.406 08:29:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:16:01.406 08:29:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:16:01.406 08:29:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:16:01.406 08:29:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:01.406 08:29:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:16:01.406 08:29:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:01.406 08:29:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:01.406 08:29:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:16:01.406 08:29:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:01.406 08:29:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:01.406 08:29:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:16:01.406 08:29:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:01.406 08:29:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:01.406 08:29:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:01.406 08:29:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:16:01.406 08:29:13 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:16:01.406 08:29:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:16:01.406 08:29:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:16:01.406 08:29:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:16:01.406 08:29:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:16:01.406 08:29:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:16:01.406 08:29:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:16:01.406 08:29:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:16:01.406 08:29:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:16:01.406 08:29:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.XYv0oMtjW9 00:16:01.406 08:29:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1452763 00:16:01.406 08:29:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1452763 /var/tmp/spdk-raid.sock 00:16:01.406 08:29:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1452763 ']' 00:16:01.406 08:29:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:01.406 08:29:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:01.407 08:29:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:01.407 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:16:01.407 08:29:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:16:01.407 08:29:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:01.407 08:29:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:01.407 [2024-07-23 08:29:13.875421] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:16:01.407 [2024-07-23 08:29:13.875510] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1452763 ] 00:16:01.667 [2024-07-23 08:29:13.998235] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:01.925 [2024-07-23 08:29:14.217536] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:02.184 [2024-07-23 08:29:14.537392] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:02.184 [2024-07-23 08:29:14.537426] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:02.184 08:29:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:02.184 08:29:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:16:02.184 08:29:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:02.184 08:29:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:16:02.442 BaseBdev1_malloc 00:16:02.442 08:29:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:16:02.701 true 00:16:02.701 08:29:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:16:02.701 [2024-07-23 08:29:15.196068] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:16:02.701 [2024-07-23 08:29:15.196128] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:02.701 [2024-07-23 08:29:15.196149] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034b80 00:16:02.701 [2024-07-23 08:29:15.196160] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:02.701 [2024-07-23 08:29:15.198177] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:02.701 [2024-07-23 08:29:15.198209] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:16:02.701 BaseBdev1 00:16:02.701 08:29:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:02.701 08:29:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:16:02.960 BaseBdev2_malloc 00:16:02.960 08:29:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:16:03.219 true 00:16:03.219 08:29:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:16:03.478 [2024-07-23 
08:29:15.741318] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:16:03.478 [2024-07-23 08:29:15.741373] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:03.478 [2024-07-23 08:29:15.741393] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035a80 00:16:03.478 [2024-07-23 08:29:15.741407] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:03.478 [2024-07-23 08:29:15.743452] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:03.478 [2024-07-23 08:29:15.743486] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:16:03.478 BaseBdev2 00:16:03.478 08:29:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:03.478 08:29:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:16:03.478 BaseBdev3_malloc 00:16:03.478 08:29:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:16:03.737 true 00:16:03.737 08:29:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:16:03.996 [2024-07-23 08:29:16.285896] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:16:03.996 [2024-07-23 08:29:16.285949] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:03.996 [2024-07-23 08:29:16.285969] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036980 00:16:03.996 [2024-07-23 08:29:16.285979] 
vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:03.996 [2024-07-23 08:29:16.287987] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:03.996 [2024-07-23 08:29:16.288018] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:16:03.996 BaseBdev3 00:16:03.996 08:29:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:16:03.996 [2024-07-23 08:29:16.442336] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:03.996 [2024-07-23 08:29:16.443887] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:03.996 [2024-07-23 08:29:16.443959] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:03.996 [2024-07-23 08:29:16.444167] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000036f80 00:16:03.996 [2024-07-23 08:29:16.444179] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:03.996 [2024-07-23 08:29:16.444419] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bec0 00:16:03.996 [2024-07-23 08:29:16.444620] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000036f80 00:16:03.996 [2024-07-23 08:29:16.444651] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000036f80 00:16:03.996 [2024-07-23 08:29:16.444817] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:03.996 08:29:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:03.996 08:29:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=raid_bdev1 00:16:03.996 08:29:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:03.996 08:29:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:03.996 08:29:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:03.996 08:29:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:03.996 08:29:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:03.996 08:29:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:03.996 08:29:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:03.996 08:29:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:03.996 08:29:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:03.996 08:29:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:04.255 08:29:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:04.255 "name": "raid_bdev1", 00:16:04.255 "uuid": "a7f2edab-f690-4c32-adfb-3902a873098e", 00:16:04.255 "strip_size_kb": 64, 00:16:04.255 "state": "online", 00:16:04.255 "raid_level": "concat", 00:16:04.255 "superblock": true, 00:16:04.255 "num_base_bdevs": 3, 00:16:04.255 "num_base_bdevs_discovered": 3, 00:16:04.255 "num_base_bdevs_operational": 3, 00:16:04.255 "base_bdevs_list": [ 00:16:04.255 { 00:16:04.255 "name": "BaseBdev1", 00:16:04.255 "uuid": "1a0d82dc-c56f-5b37-8b7b-cfb94bd31c2f", 00:16:04.255 "is_configured": true, 00:16:04.255 "data_offset": 2048, 00:16:04.255 "data_size": 63488 00:16:04.255 }, 00:16:04.255 { 00:16:04.255 "name": "BaseBdev2", 00:16:04.255 
"uuid": "41d3a92f-e9c6-5a22-badf-76c941838e29", 00:16:04.255 "is_configured": true, 00:16:04.255 "data_offset": 2048, 00:16:04.255 "data_size": 63488 00:16:04.255 }, 00:16:04.255 { 00:16:04.255 "name": "BaseBdev3", 00:16:04.255 "uuid": "6022f3ac-a5d1-53fb-9647-69dcc7c098ca", 00:16:04.255 "is_configured": true, 00:16:04.255 "data_offset": 2048, 00:16:04.255 "data_size": 63488 00:16:04.255 } 00:16:04.255 ] 00:16:04.255 }' 00:16:04.255 08:29:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:04.255 08:29:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:04.822 08:29:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:16:04.822 08:29:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:16:04.822 [2024-07-23 08:29:17.194119] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c060 00:16:05.758 08:29:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:16:06.017 08:29:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:16:06.017 08:29:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:16:06.017 08:29:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:16:06.017 08:29:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:06.017 08:29:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:06.017 08:29:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:06.017 08:29:18 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:06.017 08:29:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:06.017 08:29:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:06.017 08:29:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:06.017 08:29:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:06.017 08:29:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:06.017 08:29:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:06.017 08:29:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:06.017 08:29:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:06.017 08:29:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:06.017 "name": "raid_bdev1", 00:16:06.017 "uuid": "a7f2edab-f690-4c32-adfb-3902a873098e", 00:16:06.017 "strip_size_kb": 64, 00:16:06.017 "state": "online", 00:16:06.017 "raid_level": "concat", 00:16:06.017 "superblock": true, 00:16:06.017 "num_base_bdevs": 3, 00:16:06.017 "num_base_bdevs_discovered": 3, 00:16:06.017 "num_base_bdevs_operational": 3, 00:16:06.017 "base_bdevs_list": [ 00:16:06.017 { 00:16:06.017 "name": "BaseBdev1", 00:16:06.017 "uuid": "1a0d82dc-c56f-5b37-8b7b-cfb94bd31c2f", 00:16:06.017 "is_configured": true, 00:16:06.017 "data_offset": 2048, 00:16:06.017 "data_size": 63488 00:16:06.017 }, 00:16:06.017 { 00:16:06.017 "name": "BaseBdev2", 00:16:06.017 "uuid": "41d3a92f-e9c6-5a22-badf-76c941838e29", 00:16:06.017 "is_configured": true, 00:16:06.017 "data_offset": 2048, 00:16:06.017 "data_size": 63488 
00:16:06.017 }, 00:16:06.017 { 00:16:06.017 "name": "BaseBdev3", 00:16:06.017 "uuid": "6022f3ac-a5d1-53fb-9647-69dcc7c098ca", 00:16:06.017 "is_configured": true, 00:16:06.017 "data_offset": 2048, 00:16:06.017 "data_size": 63488 00:16:06.017 } 00:16:06.017 ] 00:16:06.017 }' 00:16:06.017 08:29:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:06.017 08:29:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:06.585 08:29:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:06.845 [2024-07-23 08:29:19.111215] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:06.845 [2024-07-23 08:29:19.111249] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:06.845 [2024-07-23 08:29:19.113627] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:06.845 [2024-07-23 08:29:19.113677] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:06.845 [2024-07-23 08:29:19.113711] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:06.845 [2024-07-23 08:29:19.113721] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036f80 name raid_bdev1, state offline 00:16:06.845 0 00:16:06.845 08:29:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1452763 00:16:06.845 08:29:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1452763 ']' 00:16:06.845 08:29:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1452763 00:16:06.845 08:29:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:16:06.845 08:29:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = 
Linux ']' 00:16:06.845 08:29:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1452763 00:16:06.845 08:29:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:06.845 08:29:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:06.845 08:29:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1452763' 00:16:06.845 killing process with pid 1452763 00:16:06.845 08:29:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1452763 00:16:06.845 [2024-07-23 08:29:19.174773] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:06.845 08:29:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1452763 00:16:06.845 [2024-07-23 08:29:19.336918] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:08.223 08:29:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.XYv0oMtjW9 00:16:08.223 08:29:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:16:08.223 08:29:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:16:08.223 08:29:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:16:08.223 08:29:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:16:08.223 08:29:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:08.223 08:29:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:08.223 08:29:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:16:08.223 00:16:08.223 real 0m6.884s 00:16:08.223 user 0m9.736s 00:16:08.223 sys 0m0.931s 00:16:08.223 08:29:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 
00:16:08.223 08:29:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:08.223 ************************************ 00:16:08.223 END TEST raid_write_error_test 00:16:08.223 ************************************ 00:16:08.223 08:29:20 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:08.223 08:29:20 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:16:08.223 08:29:20 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 3 false 00:16:08.223 08:29:20 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:08.223 08:29:20 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:08.223 08:29:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:08.223 ************************************ 00:16:08.223 START TEST raid_state_function_test 00:16:08.223 ************************************ 00:16:08.223 08:29:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 3 false 00:16:08.223 08:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:16:08.223 08:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:16:08.482 08:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:16:08.482 08:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:08.482 08:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:08.482 08:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:08.482 08:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:08.482 08:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:08.482 08:29:20 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:08.482 08:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:08.482 08:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:08.482 08:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:08.482 08:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:16:08.482 08:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:08.482 08:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:08.482 08:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:08.482 08:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:08.482 08:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:08.482 08:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:16:08.482 08:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:08.482 08:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:16:08.482 08:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:16:08.482 08:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:16:08.482 08:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:16:08.482 08:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:16:08.482 08:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1454128 00:16:08.482 08:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 
'Process raid pid: 1454128' 00:16:08.482 Process raid pid: 1454128 00:16:08.482 08:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:08.482 08:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1454128 /var/tmp/spdk-raid.sock 00:16:08.482 08:29:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1454128 ']' 00:16:08.482 08:29:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:08.482 08:29:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:08.482 08:29:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:08.482 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:08.482 08:29:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:08.482 08:29:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:08.482 [2024-07-23 08:29:20.828051] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:16:08.482 [2024-07-23 08:29:20.828135] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:08.482 [2024-07-23 08:29:20.954688] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:08.741 [2024-07-23 08:29:21.177356] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:09.000 [2024-07-23 08:29:21.434340] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:09.000 [2024-07-23 08:29:21.434372] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:09.259 08:29:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:09.259 08:29:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:16:09.259 08:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:09.259 [2024-07-23 08:29:21.744724] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:09.259 [2024-07-23 08:29:21.744771] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:09.259 [2024-07-23 08:29:21.744781] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:09.259 [2024-07-23 08:29:21.744795] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:09.259 [2024-07-23 08:29:21.744802] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:09.259 [2024-07-23 08:29:21.744811] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:09.259 08:29:21 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:09.259 08:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:09.259 08:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:09.259 08:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:09.259 08:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:09.259 08:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:09.259 08:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:09.259 08:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:09.259 08:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:09.259 08:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:09.259 08:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:09.259 08:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:09.518 08:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:09.519 "name": "Existed_Raid", 00:16:09.519 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:09.519 "strip_size_kb": 0, 00:16:09.519 "state": "configuring", 00:16:09.519 "raid_level": "raid1", 00:16:09.519 "superblock": false, 00:16:09.519 "num_base_bdevs": 3, 00:16:09.519 "num_base_bdevs_discovered": 0, 00:16:09.519 "num_base_bdevs_operational": 3, 00:16:09.519 "base_bdevs_list": [ 00:16:09.519 { 00:16:09.519 
"name": "BaseBdev1", 00:16:09.519 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:09.519 "is_configured": false, 00:16:09.519 "data_offset": 0, 00:16:09.519 "data_size": 0 00:16:09.519 }, 00:16:09.519 { 00:16:09.519 "name": "BaseBdev2", 00:16:09.519 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:09.519 "is_configured": false, 00:16:09.519 "data_offset": 0, 00:16:09.519 "data_size": 0 00:16:09.519 }, 00:16:09.519 { 00:16:09.519 "name": "BaseBdev3", 00:16:09.519 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:09.519 "is_configured": false, 00:16:09.519 "data_offset": 0, 00:16:09.519 "data_size": 0 00:16:09.519 } 00:16:09.519 ] 00:16:09.519 }' 00:16:09.519 08:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:09.519 08:29:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:10.087 08:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:10.087 [2024-07-23 08:29:22.566753] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:10.087 [2024-07-23 08:29:22.566787] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034580 name Existed_Raid, state configuring 00:16:10.087 08:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:10.345 [2024-07-23 08:29:22.723169] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:10.345 [2024-07-23 08:29:22.723207] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:10.345 [2024-07-23 08:29:22.723216] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 
00:16:10.345 [2024-07-23 08:29:22.723227] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:10.345 [2024-07-23 08:29:22.723234] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:10.345 [2024-07-23 08:29:22.723248] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:10.345 08:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:10.604 [2024-07-23 08:29:22.919555] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:10.604 BaseBdev1 00:16:10.604 08:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:10.604 08:29:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:10.604 08:29:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:10.604 08:29:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:10.604 08:29:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:10.604 08:29:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:10.604 08:29:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:10.604 08:29:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:10.862 [ 00:16:10.862 { 00:16:10.862 "name": "BaseBdev1", 00:16:10.862 "aliases": [ 00:16:10.862 "27553a31-2ce5-4ccb-8d12-aecec9f7a9ca" 
00:16:10.862 ], 00:16:10.862 "product_name": "Malloc disk", 00:16:10.862 "block_size": 512, 00:16:10.862 "num_blocks": 65536, 00:16:10.862 "uuid": "27553a31-2ce5-4ccb-8d12-aecec9f7a9ca", 00:16:10.862 "assigned_rate_limits": { 00:16:10.862 "rw_ios_per_sec": 0, 00:16:10.862 "rw_mbytes_per_sec": 0, 00:16:10.862 "r_mbytes_per_sec": 0, 00:16:10.862 "w_mbytes_per_sec": 0 00:16:10.862 }, 00:16:10.862 "claimed": true, 00:16:10.862 "claim_type": "exclusive_write", 00:16:10.862 "zoned": false, 00:16:10.862 "supported_io_types": { 00:16:10.862 "read": true, 00:16:10.862 "write": true, 00:16:10.862 "unmap": true, 00:16:10.862 "flush": true, 00:16:10.862 "reset": true, 00:16:10.862 "nvme_admin": false, 00:16:10.862 "nvme_io": false, 00:16:10.862 "nvme_io_md": false, 00:16:10.862 "write_zeroes": true, 00:16:10.862 "zcopy": true, 00:16:10.862 "get_zone_info": false, 00:16:10.862 "zone_management": false, 00:16:10.862 "zone_append": false, 00:16:10.862 "compare": false, 00:16:10.862 "compare_and_write": false, 00:16:10.862 "abort": true, 00:16:10.862 "seek_hole": false, 00:16:10.862 "seek_data": false, 00:16:10.862 "copy": true, 00:16:10.862 "nvme_iov_md": false 00:16:10.862 }, 00:16:10.862 "memory_domains": [ 00:16:10.862 { 00:16:10.862 "dma_device_id": "system", 00:16:10.862 "dma_device_type": 1 00:16:10.862 }, 00:16:10.862 { 00:16:10.862 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:10.862 "dma_device_type": 2 00:16:10.862 } 00:16:10.862 ], 00:16:10.862 "driver_specific": {} 00:16:10.862 } 00:16:10.862 ] 00:16:10.862 08:29:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:10.862 08:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:10.862 08:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:10.863 08:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:16:10.863 08:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:10.863 08:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:10.863 08:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:10.863 08:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:10.863 08:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:10.863 08:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:10.863 08:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:10.863 08:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:10.863 08:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:11.122 08:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:11.122 "name": "Existed_Raid", 00:16:11.122 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:11.122 "strip_size_kb": 0, 00:16:11.122 "state": "configuring", 00:16:11.122 "raid_level": "raid1", 00:16:11.122 "superblock": false, 00:16:11.122 "num_base_bdevs": 3, 00:16:11.122 "num_base_bdevs_discovered": 1, 00:16:11.122 "num_base_bdevs_operational": 3, 00:16:11.122 "base_bdevs_list": [ 00:16:11.122 { 00:16:11.122 "name": "BaseBdev1", 00:16:11.122 "uuid": "27553a31-2ce5-4ccb-8d12-aecec9f7a9ca", 00:16:11.122 "is_configured": true, 00:16:11.122 "data_offset": 0, 00:16:11.122 "data_size": 65536 00:16:11.122 }, 00:16:11.122 { 00:16:11.122 "name": "BaseBdev2", 00:16:11.122 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:11.122 "is_configured": 
false, 00:16:11.122 "data_offset": 0, 00:16:11.122 "data_size": 0 00:16:11.122 }, 00:16:11.122 { 00:16:11.122 "name": "BaseBdev3", 00:16:11.122 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:11.122 "is_configured": false, 00:16:11.122 "data_offset": 0, 00:16:11.122 "data_size": 0 00:16:11.122 } 00:16:11.122 ] 00:16:11.122 }' 00:16:11.122 08:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:11.122 08:29:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:11.690 08:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:11.690 [2024-07-23 08:29:24.082675] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:11.690 [2024-07-23 08:29:24.082725] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034880 name Existed_Raid, state configuring 00:16:11.690 08:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:11.949 [2024-07-23 08:29:24.251151] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:11.949 [2024-07-23 08:29:24.252791] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:11.949 [2024-07-23 08:29:24.252838] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:11.949 [2024-07-23 08:29:24.252847] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:11.949 [2024-07-23 08:29:24.252856] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:11.949 08:29:24 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:11.949 08:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:11.949 08:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:11.949 08:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:11.949 08:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:11.949 08:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:11.949 08:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:11.949 08:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:11.949 08:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:11.949 08:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:11.949 08:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:11.949 08:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:11.949 08:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:11.949 08:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:11.949 08:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:11.949 "name": "Existed_Raid", 00:16:11.949 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:11.949 "strip_size_kb": 0, 00:16:11.949 "state": "configuring", 00:16:11.949 "raid_level": "raid1", 00:16:11.949 "superblock": false, 00:16:11.949 
"num_base_bdevs": 3, 00:16:11.949 "num_base_bdevs_discovered": 1, 00:16:11.949 "num_base_bdevs_operational": 3, 00:16:11.949 "base_bdevs_list": [ 00:16:11.949 { 00:16:11.949 "name": "BaseBdev1", 00:16:11.949 "uuid": "27553a31-2ce5-4ccb-8d12-aecec9f7a9ca", 00:16:11.949 "is_configured": true, 00:16:11.949 "data_offset": 0, 00:16:11.949 "data_size": 65536 00:16:11.949 }, 00:16:11.949 { 00:16:11.949 "name": "BaseBdev2", 00:16:11.949 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:11.949 "is_configured": false, 00:16:11.949 "data_offset": 0, 00:16:11.949 "data_size": 0 00:16:11.949 }, 00:16:11.949 { 00:16:11.949 "name": "BaseBdev3", 00:16:11.949 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:11.949 "is_configured": false, 00:16:11.949 "data_offset": 0, 00:16:11.949 "data_size": 0 00:16:11.949 } 00:16:11.949 ] 00:16:11.949 }' 00:16:11.949 08:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:11.949 08:29:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:12.516 08:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:12.775 [2024-07-23 08:29:25.112361] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:12.775 BaseBdev2 00:16:12.775 08:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:12.775 08:29:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:12.775 08:29:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:12.775 08:29:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:12.775 08:29:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:12.775 08:29:25 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:12.775 08:29:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:13.034 08:29:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:13.034 [ 00:16:13.034 { 00:16:13.034 "name": "BaseBdev2", 00:16:13.034 "aliases": [ 00:16:13.034 "d85ee3e3-4785-4ed9-80ef-919ed94df6b5" 00:16:13.034 ], 00:16:13.034 "product_name": "Malloc disk", 00:16:13.034 "block_size": 512, 00:16:13.034 "num_blocks": 65536, 00:16:13.034 "uuid": "d85ee3e3-4785-4ed9-80ef-919ed94df6b5", 00:16:13.034 "assigned_rate_limits": { 00:16:13.034 "rw_ios_per_sec": 0, 00:16:13.034 "rw_mbytes_per_sec": 0, 00:16:13.034 "r_mbytes_per_sec": 0, 00:16:13.034 "w_mbytes_per_sec": 0 00:16:13.034 }, 00:16:13.034 "claimed": true, 00:16:13.034 "claim_type": "exclusive_write", 00:16:13.034 "zoned": false, 00:16:13.034 "supported_io_types": { 00:16:13.034 "read": true, 00:16:13.034 "write": true, 00:16:13.034 "unmap": true, 00:16:13.034 "flush": true, 00:16:13.034 "reset": true, 00:16:13.034 "nvme_admin": false, 00:16:13.034 "nvme_io": false, 00:16:13.034 "nvme_io_md": false, 00:16:13.034 "write_zeroes": true, 00:16:13.034 "zcopy": true, 00:16:13.034 "get_zone_info": false, 00:16:13.034 "zone_management": false, 00:16:13.034 "zone_append": false, 00:16:13.034 "compare": false, 00:16:13.034 "compare_and_write": false, 00:16:13.034 "abort": true, 00:16:13.034 "seek_hole": false, 00:16:13.034 "seek_data": false, 00:16:13.034 "copy": true, 00:16:13.034 "nvme_iov_md": false 00:16:13.034 }, 00:16:13.034 "memory_domains": [ 00:16:13.034 { 00:16:13.034 "dma_device_id": "system", 00:16:13.034 "dma_device_type": 1 00:16:13.034 }, 00:16:13.034 { 
00:16:13.034 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:13.034 "dma_device_type": 2 00:16:13.034 } 00:16:13.034 ], 00:16:13.034 "driver_specific": {} 00:16:13.034 } 00:16:13.034 ] 00:16:13.034 08:29:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:13.034 08:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:13.034 08:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:13.034 08:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:13.034 08:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:13.034 08:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:13.034 08:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:13.034 08:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:13.034 08:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:13.034 08:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:13.034 08:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:13.034 08:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:13.034 08:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:13.034 08:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:13.034 08:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:16:13.293 08:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:13.293 "name": "Existed_Raid", 00:16:13.293 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:13.293 "strip_size_kb": 0, 00:16:13.293 "state": "configuring", 00:16:13.293 "raid_level": "raid1", 00:16:13.293 "superblock": false, 00:16:13.293 "num_base_bdevs": 3, 00:16:13.293 "num_base_bdevs_discovered": 2, 00:16:13.293 "num_base_bdevs_operational": 3, 00:16:13.293 "base_bdevs_list": [ 00:16:13.293 { 00:16:13.293 "name": "BaseBdev1", 00:16:13.293 "uuid": "27553a31-2ce5-4ccb-8d12-aecec9f7a9ca", 00:16:13.293 "is_configured": true, 00:16:13.293 "data_offset": 0, 00:16:13.293 "data_size": 65536 00:16:13.293 }, 00:16:13.293 { 00:16:13.293 "name": "BaseBdev2", 00:16:13.293 "uuid": "d85ee3e3-4785-4ed9-80ef-919ed94df6b5", 00:16:13.293 "is_configured": true, 00:16:13.293 "data_offset": 0, 00:16:13.293 "data_size": 65536 00:16:13.293 }, 00:16:13.293 { 00:16:13.293 "name": "BaseBdev3", 00:16:13.293 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:13.293 "is_configured": false, 00:16:13.293 "data_offset": 0, 00:16:13.293 "data_size": 0 00:16:13.293 } 00:16:13.293 ] 00:16:13.293 }' 00:16:13.293 08:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:13.293 08:29:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:13.859 08:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:13.859 [2024-07-23 08:29:26.305923] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:13.859 [2024-07-23 08:29:26.305970] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000035180 00:16:13.859 [2024-07-23 08:29:26.305980] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 
00:16:13.859 [2024-07-23 08:29:26.306201] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bec0 00:16:13.859 [2024-07-23 08:29:26.306386] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000035180 00:16:13.859 [2024-07-23 08:29:26.306396] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000035180 00:16:13.859 [2024-07-23 08:29:26.306677] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:13.859 BaseBdev3 00:16:13.859 08:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:13.859 08:29:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:13.859 08:29:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:13.859 08:29:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:13.859 08:29:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:13.859 08:29:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:13.859 08:29:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:14.117 08:29:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:14.401 [ 00:16:14.401 { 00:16:14.401 "name": "BaseBdev3", 00:16:14.401 "aliases": [ 00:16:14.401 "b7ba07df-2273-4e49-a5dd-26c800189c65" 00:16:14.401 ], 00:16:14.401 "product_name": "Malloc disk", 00:16:14.401 "block_size": 512, 00:16:14.401 "num_blocks": 65536, 00:16:14.401 "uuid": "b7ba07df-2273-4e49-a5dd-26c800189c65", 00:16:14.401 
"assigned_rate_limits": { 00:16:14.401 "rw_ios_per_sec": 0, 00:16:14.401 "rw_mbytes_per_sec": 0, 00:16:14.401 "r_mbytes_per_sec": 0, 00:16:14.401 "w_mbytes_per_sec": 0 00:16:14.401 }, 00:16:14.401 "claimed": true, 00:16:14.401 "claim_type": "exclusive_write", 00:16:14.401 "zoned": false, 00:16:14.401 "supported_io_types": { 00:16:14.401 "read": true, 00:16:14.401 "write": true, 00:16:14.401 "unmap": true, 00:16:14.401 "flush": true, 00:16:14.401 "reset": true, 00:16:14.401 "nvme_admin": false, 00:16:14.401 "nvme_io": false, 00:16:14.401 "nvme_io_md": false, 00:16:14.401 "write_zeroes": true, 00:16:14.401 "zcopy": true, 00:16:14.401 "get_zone_info": false, 00:16:14.401 "zone_management": false, 00:16:14.401 "zone_append": false, 00:16:14.401 "compare": false, 00:16:14.401 "compare_and_write": false, 00:16:14.401 "abort": true, 00:16:14.401 "seek_hole": false, 00:16:14.401 "seek_data": false, 00:16:14.401 "copy": true, 00:16:14.401 "nvme_iov_md": false 00:16:14.401 }, 00:16:14.401 "memory_domains": [ 00:16:14.401 { 00:16:14.401 "dma_device_id": "system", 00:16:14.401 "dma_device_type": 1 00:16:14.401 }, 00:16:14.401 { 00:16:14.401 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:14.401 "dma_device_type": 2 00:16:14.401 } 00:16:14.401 ], 00:16:14.401 "driver_specific": {} 00:16:14.401 } 00:16:14.401 ] 00:16:14.401 08:29:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:14.401 08:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:14.401 08:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:14.401 08:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:16:14.401 08:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:14.401 08:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:16:14.402 08:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:14.402 08:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:14.402 08:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:14.402 08:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:14.402 08:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:14.402 08:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:14.402 08:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:14.402 08:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:14.402 08:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:14.402 08:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:14.402 "name": "Existed_Raid", 00:16:14.402 "uuid": "f36eee7b-9a24-4012-acbd-2997e8ad7d7d", 00:16:14.402 "strip_size_kb": 0, 00:16:14.402 "state": "online", 00:16:14.402 "raid_level": "raid1", 00:16:14.402 "superblock": false, 00:16:14.402 "num_base_bdevs": 3, 00:16:14.402 "num_base_bdevs_discovered": 3, 00:16:14.402 "num_base_bdevs_operational": 3, 00:16:14.402 "base_bdevs_list": [ 00:16:14.402 { 00:16:14.402 "name": "BaseBdev1", 00:16:14.402 "uuid": "27553a31-2ce5-4ccb-8d12-aecec9f7a9ca", 00:16:14.402 "is_configured": true, 00:16:14.402 "data_offset": 0, 00:16:14.402 "data_size": 65536 00:16:14.402 }, 00:16:14.402 { 00:16:14.402 "name": "BaseBdev2", 00:16:14.402 "uuid": "d85ee3e3-4785-4ed9-80ef-919ed94df6b5", 00:16:14.402 "is_configured": true, 
00:16:14.402 "data_offset": 0, 00:16:14.402 "data_size": 65536 00:16:14.402 }, 00:16:14.402 { 00:16:14.402 "name": "BaseBdev3", 00:16:14.402 "uuid": "b7ba07df-2273-4e49-a5dd-26c800189c65", 00:16:14.402 "is_configured": true, 00:16:14.402 "data_offset": 0, 00:16:14.402 "data_size": 65536 00:16:14.402 } 00:16:14.402 ] 00:16:14.402 }' 00:16:14.402 08:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:14.402 08:29:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:14.981 08:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:14.981 08:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:14.981 08:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:14.981 08:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:14.981 08:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:14.981 08:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:14.981 08:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:14.981 08:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:14.981 [2024-07-23 08:29:27.445250] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:14.981 08:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:14.981 "name": "Existed_Raid", 00:16:14.981 "aliases": [ 00:16:14.981 "f36eee7b-9a24-4012-acbd-2997e8ad7d7d" 00:16:14.981 ], 00:16:14.981 "product_name": "Raid Volume", 00:16:14.981 "block_size": 512, 00:16:14.981 "num_blocks": 65536, 00:16:14.981 
"uuid": "f36eee7b-9a24-4012-acbd-2997e8ad7d7d", 00:16:14.981 "assigned_rate_limits": { 00:16:14.981 "rw_ios_per_sec": 0, 00:16:14.981 "rw_mbytes_per_sec": 0, 00:16:14.981 "r_mbytes_per_sec": 0, 00:16:14.981 "w_mbytes_per_sec": 0 00:16:14.981 }, 00:16:14.981 "claimed": false, 00:16:14.981 "zoned": false, 00:16:14.981 "supported_io_types": { 00:16:14.981 "read": true, 00:16:14.981 "write": true, 00:16:14.981 "unmap": false, 00:16:14.981 "flush": false, 00:16:14.981 "reset": true, 00:16:14.981 "nvme_admin": false, 00:16:14.981 "nvme_io": false, 00:16:14.981 "nvme_io_md": false, 00:16:14.981 "write_zeroes": true, 00:16:14.981 "zcopy": false, 00:16:14.981 "get_zone_info": false, 00:16:14.981 "zone_management": false, 00:16:14.981 "zone_append": false, 00:16:14.981 "compare": false, 00:16:14.981 "compare_and_write": false, 00:16:14.981 "abort": false, 00:16:14.981 "seek_hole": false, 00:16:14.981 "seek_data": false, 00:16:14.981 "copy": false, 00:16:14.981 "nvme_iov_md": false 00:16:14.981 }, 00:16:14.981 "memory_domains": [ 00:16:14.981 { 00:16:14.981 "dma_device_id": "system", 00:16:14.981 "dma_device_type": 1 00:16:14.981 }, 00:16:14.981 { 00:16:14.981 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:14.981 "dma_device_type": 2 00:16:14.981 }, 00:16:14.981 { 00:16:14.981 "dma_device_id": "system", 00:16:14.981 "dma_device_type": 1 00:16:14.981 }, 00:16:14.981 { 00:16:14.981 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:14.981 "dma_device_type": 2 00:16:14.981 }, 00:16:14.981 { 00:16:14.981 "dma_device_id": "system", 00:16:14.981 "dma_device_type": 1 00:16:14.981 }, 00:16:14.981 { 00:16:14.981 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:14.981 "dma_device_type": 2 00:16:14.981 } 00:16:14.981 ], 00:16:14.981 "driver_specific": { 00:16:14.981 "raid": { 00:16:14.981 "uuid": "f36eee7b-9a24-4012-acbd-2997e8ad7d7d", 00:16:14.981 "strip_size_kb": 0, 00:16:14.981 "state": "online", 00:16:14.981 "raid_level": "raid1", 00:16:14.981 "superblock": false, 00:16:14.981 
"num_base_bdevs": 3, 00:16:14.981 "num_base_bdevs_discovered": 3, 00:16:14.981 "num_base_bdevs_operational": 3, 00:16:14.981 "base_bdevs_list": [ 00:16:14.981 { 00:16:14.981 "name": "BaseBdev1", 00:16:14.981 "uuid": "27553a31-2ce5-4ccb-8d12-aecec9f7a9ca", 00:16:14.981 "is_configured": true, 00:16:14.981 "data_offset": 0, 00:16:14.981 "data_size": 65536 00:16:14.981 }, 00:16:14.981 { 00:16:14.981 "name": "BaseBdev2", 00:16:14.981 "uuid": "d85ee3e3-4785-4ed9-80ef-919ed94df6b5", 00:16:14.981 "is_configured": true, 00:16:14.981 "data_offset": 0, 00:16:14.981 "data_size": 65536 00:16:14.981 }, 00:16:14.981 { 00:16:14.981 "name": "BaseBdev3", 00:16:14.981 "uuid": "b7ba07df-2273-4e49-a5dd-26c800189c65", 00:16:14.981 "is_configured": true, 00:16:14.981 "data_offset": 0, 00:16:14.981 "data_size": 65536 00:16:14.981 } 00:16:14.981 ] 00:16:14.981 } 00:16:14.981 } 00:16:14.981 }' 00:16:14.981 08:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:15.240 08:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:15.240 BaseBdev2 00:16:15.240 BaseBdev3' 00:16:15.240 08:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:15.240 08:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:15.240 08:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:15.240 08:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:15.240 "name": "BaseBdev1", 00:16:15.240 "aliases": [ 00:16:15.240 "27553a31-2ce5-4ccb-8d12-aecec9f7a9ca" 00:16:15.240 ], 00:16:15.240 "product_name": "Malloc disk", 00:16:15.240 "block_size": 512, 00:16:15.240 "num_blocks": 65536, 00:16:15.240 "uuid": 
"27553a31-2ce5-4ccb-8d12-aecec9f7a9ca", 00:16:15.240 "assigned_rate_limits": { 00:16:15.240 "rw_ios_per_sec": 0, 00:16:15.240 "rw_mbytes_per_sec": 0, 00:16:15.240 "r_mbytes_per_sec": 0, 00:16:15.240 "w_mbytes_per_sec": 0 00:16:15.240 }, 00:16:15.240 "claimed": true, 00:16:15.240 "claim_type": "exclusive_write", 00:16:15.240 "zoned": false, 00:16:15.241 "supported_io_types": { 00:16:15.241 "read": true, 00:16:15.241 "write": true, 00:16:15.241 "unmap": true, 00:16:15.241 "flush": true, 00:16:15.241 "reset": true, 00:16:15.241 "nvme_admin": false, 00:16:15.241 "nvme_io": false, 00:16:15.241 "nvme_io_md": false, 00:16:15.241 "write_zeroes": true, 00:16:15.241 "zcopy": true, 00:16:15.241 "get_zone_info": false, 00:16:15.241 "zone_management": false, 00:16:15.241 "zone_append": false, 00:16:15.241 "compare": false, 00:16:15.241 "compare_and_write": false, 00:16:15.241 "abort": true, 00:16:15.241 "seek_hole": false, 00:16:15.241 "seek_data": false, 00:16:15.241 "copy": true, 00:16:15.241 "nvme_iov_md": false 00:16:15.241 }, 00:16:15.241 "memory_domains": [ 00:16:15.241 { 00:16:15.241 "dma_device_id": "system", 00:16:15.241 "dma_device_type": 1 00:16:15.241 }, 00:16:15.241 { 00:16:15.241 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:15.241 "dma_device_type": 2 00:16:15.241 } 00:16:15.241 ], 00:16:15.241 "driver_specific": {} 00:16:15.241 }' 00:16:15.241 08:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:15.241 08:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:15.500 08:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:15.500 08:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:15.500 08:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:15.500 08:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:15.500 08:29:27 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:15.500 08:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:15.500 08:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:15.500 08:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:15.500 08:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:15.500 08:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:15.500 08:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:15.500 08:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:15.500 08:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:15.759 08:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:15.759 "name": "BaseBdev2", 00:16:15.759 "aliases": [ 00:16:15.759 "d85ee3e3-4785-4ed9-80ef-919ed94df6b5" 00:16:15.759 ], 00:16:15.759 "product_name": "Malloc disk", 00:16:15.759 "block_size": 512, 00:16:15.759 "num_blocks": 65536, 00:16:15.759 "uuid": "d85ee3e3-4785-4ed9-80ef-919ed94df6b5", 00:16:15.759 "assigned_rate_limits": { 00:16:15.759 "rw_ios_per_sec": 0, 00:16:15.759 "rw_mbytes_per_sec": 0, 00:16:15.759 "r_mbytes_per_sec": 0, 00:16:15.759 "w_mbytes_per_sec": 0 00:16:15.759 }, 00:16:15.759 "claimed": true, 00:16:15.759 "claim_type": "exclusive_write", 00:16:15.759 "zoned": false, 00:16:15.759 "supported_io_types": { 00:16:15.759 "read": true, 00:16:15.759 "write": true, 00:16:15.759 "unmap": true, 00:16:15.759 "flush": true, 00:16:15.759 "reset": true, 00:16:15.759 "nvme_admin": false, 00:16:15.759 "nvme_io": false, 00:16:15.759 "nvme_io_md": false, 
00:16:15.759 "write_zeroes": true, 00:16:15.759 "zcopy": true, 00:16:15.759 "get_zone_info": false, 00:16:15.759 "zone_management": false, 00:16:15.759 "zone_append": false, 00:16:15.759 "compare": false, 00:16:15.759 "compare_and_write": false, 00:16:15.759 "abort": true, 00:16:15.759 "seek_hole": false, 00:16:15.759 "seek_data": false, 00:16:15.759 "copy": true, 00:16:15.759 "nvme_iov_md": false 00:16:15.759 }, 00:16:15.759 "memory_domains": [ 00:16:15.759 { 00:16:15.759 "dma_device_id": "system", 00:16:15.759 "dma_device_type": 1 00:16:15.759 }, 00:16:15.759 { 00:16:15.759 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:15.759 "dma_device_type": 2 00:16:15.759 } 00:16:15.759 ], 00:16:15.759 "driver_specific": {} 00:16:15.759 }' 00:16:15.759 08:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:15.759 08:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:15.759 08:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:15.759 08:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:15.759 08:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:16.018 08:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:16.018 08:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:16.018 08:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:16.018 08:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:16.018 08:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:16.019 08:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:16.019 08:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:16.019 08:29:28 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:16.019 08:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:16.019 08:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:16.277 08:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:16.277 "name": "BaseBdev3", 00:16:16.277 "aliases": [ 00:16:16.277 "b7ba07df-2273-4e49-a5dd-26c800189c65" 00:16:16.277 ], 00:16:16.277 "product_name": "Malloc disk", 00:16:16.277 "block_size": 512, 00:16:16.277 "num_blocks": 65536, 00:16:16.277 "uuid": "b7ba07df-2273-4e49-a5dd-26c800189c65", 00:16:16.277 "assigned_rate_limits": { 00:16:16.277 "rw_ios_per_sec": 0, 00:16:16.277 "rw_mbytes_per_sec": 0, 00:16:16.277 "r_mbytes_per_sec": 0, 00:16:16.277 "w_mbytes_per_sec": 0 00:16:16.277 }, 00:16:16.277 "claimed": true, 00:16:16.277 "claim_type": "exclusive_write", 00:16:16.277 "zoned": false, 00:16:16.277 "supported_io_types": { 00:16:16.277 "read": true, 00:16:16.277 "write": true, 00:16:16.277 "unmap": true, 00:16:16.277 "flush": true, 00:16:16.277 "reset": true, 00:16:16.277 "nvme_admin": false, 00:16:16.277 "nvme_io": false, 00:16:16.277 "nvme_io_md": false, 00:16:16.277 "write_zeroes": true, 00:16:16.277 "zcopy": true, 00:16:16.277 "get_zone_info": false, 00:16:16.277 "zone_management": false, 00:16:16.277 "zone_append": false, 00:16:16.277 "compare": false, 00:16:16.277 "compare_and_write": false, 00:16:16.277 "abort": true, 00:16:16.277 "seek_hole": false, 00:16:16.277 "seek_data": false, 00:16:16.277 "copy": true, 00:16:16.277 "nvme_iov_md": false 00:16:16.277 }, 00:16:16.277 "memory_domains": [ 00:16:16.277 { 00:16:16.277 "dma_device_id": "system", 00:16:16.277 "dma_device_type": 1 00:16:16.277 }, 00:16:16.277 { 00:16:16.277 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:16:16.277 "dma_device_type": 2 00:16:16.277 } 00:16:16.277 ], 00:16:16.277 "driver_specific": {} 00:16:16.277 }' 00:16:16.277 08:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:16.277 08:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:16.277 08:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:16.277 08:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:16.277 08:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:16.277 08:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:16.536 08:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:16.536 08:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:16.536 08:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:16.536 08:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:16.536 08:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:16.536 08:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:16.536 08:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:16.795 [2024-07-23 08:29:29.113416] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:16.795 08:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:16.795 08:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:16:16.795 08:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 
in 00:16:16.795 08:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:16:16.795 08:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:16:16.795 08:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:16:16.795 08:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:16.795 08:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:16.795 08:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:16.795 08:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:16.795 08:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:16.795 08:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:16.795 08:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:16.795 08:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:16.795 08:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:16.795 08:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:16.795 08:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:17.054 08:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:17.054 "name": "Existed_Raid", 00:16:17.054 "uuid": "f36eee7b-9a24-4012-acbd-2997e8ad7d7d", 00:16:17.054 "strip_size_kb": 0, 00:16:17.054 "state": "online", 00:16:17.054 "raid_level": "raid1", 
00:16:17.054 "superblock": false,
00:16:17.054 "num_base_bdevs": 3,
00:16:17.054 "num_base_bdevs_discovered": 2,
00:16:17.054 "num_base_bdevs_operational": 2,
00:16:17.054 "base_bdevs_list": [
00:16:17.054 {
00:16:17.054 "name": null,
00:16:17.054 "uuid": "00000000-0000-0000-0000-000000000000",
00:16:17.054 "is_configured": false,
00:16:17.054 "data_offset": 0,
00:16:17.054 "data_size": 65536
00:16:17.054 },
00:16:17.054 {
00:16:17.054 "name": "BaseBdev2",
00:16:17.054 "uuid": "d85ee3e3-4785-4ed9-80ef-919ed94df6b5",
00:16:17.054 "is_configured": true,
00:16:17.054 "data_offset": 0,
00:16:17.054 "data_size": 65536
00:16:17.054 },
00:16:17.054 {
00:16:17.054 "name": "BaseBdev3",
00:16:17.054 "uuid": "b7ba07df-2273-4e49-a5dd-26c800189c65",
00:16:17.054 "is_configured": true,
00:16:17.054 "data_offset": 0,
00:16:17.054 "data_size": 65536
00:16:17.054 }
00:16:17.054 ]
00:16:17.054 }'
00:16:17.054 08:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:16:17.054 08:29:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:16:17.312 08:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 ))
00:16:17.312 08:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs ))
00:16:17.312 08:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:17.312 08:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]'
00:16:17.573 08:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid
00:16:17.573 08:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']'
00:16:17.573 08:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2
00:16:17.837 [2024-07-23 08:29:30.156603] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2
00:16:17.837 08:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ ))
00:16:17.837 08:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs ))
00:16:17.837 08:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:17.837 08:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]'
00:16:18.096 08:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid
00:16:18.096 08:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']'
00:16:18.096 08:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3
00:16:18.096 [2024-07-23 08:29:30.603577] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3
00:16:18.096 [2024-07-23 08:29:30.603676] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:16:18.355 [2024-07-23 08:29:30.699126] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:16:18.355 [2024-07-23 08:29:30.699173] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:16:18.355 [2024-07-23 08:29:30.699184] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000035180 name Existed_Raid, state offline
00:16:18.355 08:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ ))
00:16:18.355 08:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs ))
00:16:18.355 08:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:18.355 08:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)'
00:16:18.614 08:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev=
00:16:18.614 08:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']'
00:16:18.614 08:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']'
00:16:18.614 08:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 ))
00:16:18.614 08:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs ))
00:16:18.614 08:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2
00:16:18.614 BaseBdev2
00:16:18.614 08:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2
00:16:18.614 08:29:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2
00:16:18.614 08:29:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:16:18.614 08:29:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i
00:16:18.614 08:29:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:16:18.614 08:29:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:16:18.614 08:29:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:16:18.874 08:29:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000
00:16:19.133 [
00:16:19.133 {
00:16:19.133 "name": "BaseBdev2",
00:16:19.133 "aliases": [
00:16:19.133 "061e5bab-3dc6-4ebd-a3f7-d88366e509b1"
00:16:19.133 ],
00:16:19.133 "product_name": "Malloc disk",
00:16:19.133 "block_size": 512,
00:16:19.133 "num_blocks": 65536,
00:16:19.133 "uuid": "061e5bab-3dc6-4ebd-a3f7-d88366e509b1",
00:16:19.133 "assigned_rate_limits": {
00:16:19.133 "rw_ios_per_sec": 0,
00:16:19.133 "rw_mbytes_per_sec": 0,
00:16:19.133 "r_mbytes_per_sec": 0,
00:16:19.133 "w_mbytes_per_sec": 0
00:16:19.133 },
00:16:19.133 "claimed": false,
00:16:19.133 "zoned": false,
00:16:19.133 "supported_io_types": {
00:16:19.133 "read": true,
00:16:19.133 "write": true,
00:16:19.133 "unmap": true,
00:16:19.133 "flush": true,
00:16:19.133 "reset": true,
00:16:19.133 "nvme_admin": false,
00:16:19.133 "nvme_io": false,
00:16:19.133 "nvme_io_md": false,
00:16:19.133 "write_zeroes": true,
00:16:19.133 "zcopy": true,
00:16:19.133 "get_zone_info": false,
00:16:19.133 "zone_management": false,
00:16:19.133 "zone_append": false,
00:16:19.133 "compare": false,
00:16:19.133 "compare_and_write": false,
00:16:19.133 "abort": true,
00:16:19.133 "seek_hole": false,
00:16:19.133 "seek_data": false,
00:16:19.133 "copy": true,
00:16:19.133 "nvme_iov_md": false
00:16:19.133 },
00:16:19.133 "memory_domains": [
00:16:19.133 {
00:16:19.133 "dma_device_id": "system",
00:16:19.133 "dma_device_type": 1
00:16:19.133 },
00:16:19.133 {
00:16:19.133 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:16:19.133 "dma_device_type": 2
00:16:19.133 }
00:16:19.133 ],
00:16:19.133 "driver_specific": {}
00:16:19.133 }
00:16:19.133 ]
00:16:19.133 08:29:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0
00:16:19.133 08:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ ))
00:16:19.133 08:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs ))
00:16:19.133 08:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3
00:16:19.133 BaseBdev3
00:16:19.133 08:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3
00:16:19.133 08:29:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3
00:16:19.133 08:29:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:16:19.133 08:29:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i
00:16:19.133 08:29:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:16:19.133 08:29:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:16:19.392 08:29:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:16:19.392 08:29:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000
00:16:19.650 [
00:16:19.650 {
00:16:19.650 "name": "BaseBdev3",
00:16:19.650 "aliases": [
00:16:19.650 "5819584a-67f9-4575-9f77-d7f7cfadb3c8"
00:16:19.650 ],
00:16:19.650 "product_name": "Malloc disk",
00:16:19.650 "block_size": 512,
00:16:19.650 "num_blocks": 65536,
00:16:19.650 "uuid": "5819584a-67f9-4575-9f77-d7f7cfadb3c8",
00:16:19.650 "assigned_rate_limits": {
00:16:19.650 "rw_ios_per_sec": 0,
00:16:19.650 "rw_mbytes_per_sec": 0,
00:16:19.650 "r_mbytes_per_sec": 0,
00:16:19.650 "w_mbytes_per_sec": 0
00:16:19.650 },
00:16:19.650 "claimed": false,
00:16:19.650 "zoned": false,
00:16:19.650 "supported_io_types": {
00:16:19.650 "read": true,
00:16:19.650 "write": true,
00:16:19.650 "unmap": true,
00:16:19.650 "flush": true,
00:16:19.650 "reset": true,
00:16:19.650 "nvme_admin": false,
00:16:19.650 "nvme_io": false,
00:16:19.650 "nvme_io_md": false,
00:16:19.650 "write_zeroes": true,
00:16:19.650 "zcopy": true,
00:16:19.650 "get_zone_info": false,
00:16:19.650 "zone_management": false,
00:16:19.650 "zone_append": false,
00:16:19.650 "compare": false,
00:16:19.650 "compare_and_write": false,
00:16:19.650 "abort": true,
00:16:19.650 "seek_hole": false,
00:16:19.650 "seek_data": false,
00:16:19.650 "copy": true,
00:16:19.650 "nvme_iov_md": false
00:16:19.650 },
00:16:19.650 "memory_domains": [
00:16:19.650 {
00:16:19.650 "dma_device_id": "system",
00:16:19.650 "dma_device_type": 1
00:16:19.650 },
00:16:19.650 {
00:16:19.650 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:16:19.650 "dma_device_type": 2
00:16:19.650 }
00:16:19.650 ],
00:16:19.650 "driver_specific": {}
00:16:19.650 }
00:16:19.650 ]
00:16:19.651 08:29:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0
00:16:19.651 08:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ ))
00:16:19.651 08:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs ))
00:16:19.651 08:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
00:16:19.651 [2024-07-23 08:29:32.132468] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:16:19.651 [2024-07-23 08:29:32.132511] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:16:19.651 [2024-07-23 08:29:32.132534] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:16:19.651 [2024-07-23 08:29:32.134174] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:16:19.651 08:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3
00:16:19.651 08:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:16:19.651 08:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:16:19.651 08:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:16:19.651 08:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:16:19.651 08:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:16:19.651 08:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:16:19.651 08:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:16:19.651 08:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:16:19.651 08:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:16:19.651 08:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:16:19.651 08:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:19.909 08:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:16:19.909 "name": "Existed_Raid",
00:16:19.909 "uuid": "00000000-0000-0000-0000-000000000000",
00:16:19.909 "strip_size_kb": 0,
00:16:19.909 "state": "configuring",
00:16:19.909 "raid_level": "raid1",
00:16:19.909 "superblock": false,
00:16:19.909 "num_base_bdevs": 3,
00:16:19.909 "num_base_bdevs_discovered": 2,
00:16:19.909 "num_base_bdevs_operational": 3,
00:16:19.909 "base_bdevs_list": [
00:16:19.909 {
00:16:19.909 "name": "BaseBdev1",
00:16:19.909 "uuid": "00000000-0000-0000-0000-000000000000",
00:16:19.909 "is_configured": false,
00:16:19.909 "data_offset": 0,
00:16:19.909 "data_size": 0
00:16:19.909 },
00:16:19.909 {
00:16:19.909 "name": "BaseBdev2",
00:16:19.909 "uuid": "061e5bab-3dc6-4ebd-a3f7-d88366e509b1",
00:16:19.909 "is_configured": true,
00:16:19.909 "data_offset": 0,
00:16:19.909 "data_size": 65536
00:16:19.909 },
00:16:19.909 {
00:16:19.909 "name": "BaseBdev3",
00:16:19.909 "uuid": "5819584a-67f9-4575-9f77-d7f7cfadb3c8",
00:16:19.909 "is_configured": true,
00:16:19.909 "data_offset": 0,
00:16:19.909 "data_size": 65536
00:16:19.909 }
00:16:19.909 ]
00:16:19.909 }'
00:16:19.909 08:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:16:19.909 08:29:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:16:20.477 08:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2
00:16:20.477 [2024-07-23 08:29:32.974754] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2
00:16:20.477 08:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3
00:16:20.477 08:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:16:20.477 08:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:16:20.477 08:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:16:20.477 08:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:16:20.477 08:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:16:20.477 08:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:16:20.477 08:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:16:20.477 08:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:16:20.477 08:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:16:20.477 08:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:20.477 08:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:16:20.735 08:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:16:20.735 "name": "Existed_Raid",
00:16:20.736 "uuid": "00000000-0000-0000-0000-000000000000",
00:16:20.736 "strip_size_kb": 0,
00:16:20.736 "state": "configuring",
00:16:20.736 "raid_level": "raid1",
00:16:20.736 "superblock": false,
00:16:20.736 "num_base_bdevs": 3,
00:16:20.736 "num_base_bdevs_discovered": 1,
00:16:20.736 "num_base_bdevs_operational": 3,
00:16:20.736 "base_bdevs_list": [
00:16:20.736 {
00:16:20.736 "name": "BaseBdev1",
00:16:20.736 "uuid": "00000000-0000-0000-0000-000000000000",
00:16:20.736 "is_configured": false,
00:16:20.736 "data_offset": 0,
00:16:20.736 "data_size": 0
00:16:20.736 },
00:16:20.736 {
00:16:20.736 "name": null,
00:16:20.736 "uuid": "061e5bab-3dc6-4ebd-a3f7-d88366e509b1",
00:16:20.736 "is_configured": false,
00:16:20.736 "data_offset": 0,
00:16:20.736 "data_size": 65536
00:16:20.736 },
00:16:20.736 {
00:16:20.736 "name": "BaseBdev3",
00:16:20.736 "uuid": "5819584a-67f9-4575-9f77-d7f7cfadb3c8",
00:16:20.736 "is_configured": true,
00:16:20.736 "data_offset": 0,
00:16:20.736 "data_size": 65536
00:16:20.736 }
00:16:20.736 ]
00:16:20.736 }'
00:16:20.736 08:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:16:20.736 08:29:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:16:21.311 08:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:21.311 08:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured'
00:16:21.570 08:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]]
00:16:21.570 08:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1
00:16:21.570 [2024-07-23 08:29:34.018200] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:16:21.570 BaseBdev1
00:16:21.570 08:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1
00:16:21.570 08:29:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1
00:16:21.570 08:29:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:16:21.570 08:29:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i
00:16:21.570 08:29:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:16:21.570 08:29:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:16:21.570 08:29:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:16:21.829 08:29:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000
00:16:22.087 [
00:16:22.087 {
00:16:22.087 "name": "BaseBdev1",
00:16:22.087 "aliases": [
00:16:22.087 "d4a07b0b-71e9-4786-8992-af142ea7df67"
00:16:22.087 ],
00:16:22.087 "product_name": "Malloc disk",
00:16:22.087 "block_size": 512,
00:16:22.087 "num_blocks": 65536,
00:16:22.087 "uuid": "d4a07b0b-71e9-4786-8992-af142ea7df67",
00:16:22.087 "assigned_rate_limits": {
00:16:22.087 "rw_ios_per_sec": 0,
00:16:22.087 "rw_mbytes_per_sec": 0,
00:16:22.087 "r_mbytes_per_sec": 0,
00:16:22.087 "w_mbytes_per_sec": 0
00:16:22.087 },
00:16:22.087 "claimed": true,
00:16:22.087 "claim_type": "exclusive_write",
00:16:22.087 "zoned": false,
00:16:22.087 "supported_io_types": {
00:16:22.087 "read": true,
00:16:22.087 "write": true,
00:16:22.087 "unmap": true,
00:16:22.087 "flush": true,
00:16:22.087 "reset": true,
00:16:22.087 "nvme_admin": false,
00:16:22.087 "nvme_io": false,
00:16:22.087 "nvme_io_md": false,
00:16:22.087 "write_zeroes": true,
00:16:22.087 "zcopy": true,
00:16:22.087 "get_zone_info": false,
00:16:22.087 "zone_management": false,
00:16:22.087 "zone_append": false,
00:16:22.087 "compare": false,
00:16:22.087 "compare_and_write": false,
00:16:22.087 "abort": true,
00:16:22.087 "seek_hole": false,
00:16:22.087 "seek_data": false,
00:16:22.088 "copy": true,
00:16:22.088 "nvme_iov_md": false
00:16:22.088 },
00:16:22.088 "memory_domains": [
00:16:22.088 {
00:16:22.088 "dma_device_id": "system",
00:16:22.088 "dma_device_type": 1
00:16:22.088 },
00:16:22.088 {
00:16:22.088 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:16:22.088 "dma_device_type": 2
00:16:22.088 }
00:16:22.088 ],
00:16:22.088 "driver_specific": {}
00:16:22.088 }
00:16:22.088 ]
00:16:22.088 08:29:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0
00:16:22.088 08:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3
00:16:22.088 08:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:16:22.088 08:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:16:22.088 08:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:16:22.088 08:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:16:22.088 08:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:16:22.088 08:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:16:22.088 08:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:16:22.088 08:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:16:22.088 08:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:16:22.088 08:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:22.088 08:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:16:22.088 08:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:16:22.088 "name": "Existed_Raid",
00:16:22.088 "uuid": "00000000-0000-0000-0000-000000000000",
00:16:22.088 "strip_size_kb": 0,
00:16:22.088 "state": "configuring",
00:16:22.088 "raid_level": "raid1",
00:16:22.088 "superblock": false,
00:16:22.088 "num_base_bdevs": 3,
00:16:22.088 "num_base_bdevs_discovered": 2,
00:16:22.088 "num_base_bdevs_operational": 3,
00:16:22.088 "base_bdevs_list": [
00:16:22.088 {
00:16:22.088 "name": "BaseBdev1",
00:16:22.088 "uuid": "d4a07b0b-71e9-4786-8992-af142ea7df67",
00:16:22.088 "is_configured": true,
00:16:22.088 "data_offset": 0,
00:16:22.088 "data_size": 65536
00:16:22.088 },
00:16:22.088 {
00:16:22.088 "name": null,
00:16:22.088 "uuid": "061e5bab-3dc6-4ebd-a3f7-d88366e509b1",
00:16:22.088 "is_configured": false,
00:16:22.088 "data_offset": 0,
00:16:22.088 "data_size": 65536
00:16:22.088 },
00:16:22.088 {
00:16:22.088 "name": "BaseBdev3",
00:16:22.088 "uuid": "5819584a-67f9-4575-9f77-d7f7cfadb3c8",
00:16:22.088 "is_configured": true,
00:16:22.088 "data_offset": 0,
00:16:22.088 "data_size": 65536
00:16:22.088 }
00:16:22.088 ]
00:16:22.088 }'
00:16:22.088 08:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:16:22.088 08:29:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:16:22.660 08:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:22.660 08:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured'
00:16:22.918 08:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]]
00:16:22.918 08:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3
00:16:22.919 [2024-07-23 08:29:35.353796] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3
00:16:22.919 08:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3
00:16:22.919 08:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:16:22.919 08:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:16:22.919 08:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:16:22.919 08:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:16:22.919 08:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:16:22.919 08:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:16:22.919 08:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:16:22.919 08:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:16:22.919 08:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:16:22.919 08:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:22.919 08:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:16:23.177 08:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:16:23.177 "name": "Existed_Raid",
00:16:23.177 "uuid": "00000000-0000-0000-0000-000000000000",
00:16:23.177 "strip_size_kb": 0,
00:16:23.177 "state": "configuring",
00:16:23.178 "raid_level": "raid1",
00:16:23.178 "superblock": false,
00:16:23.178 "num_base_bdevs": 3,
00:16:23.178 "num_base_bdevs_discovered": 1,
00:16:23.178 "num_base_bdevs_operational": 3,
00:16:23.178 "base_bdevs_list": [
00:16:23.178 {
00:16:23.178 "name": "BaseBdev1",
00:16:23.178 "uuid": "d4a07b0b-71e9-4786-8992-af142ea7df67",
00:16:23.178 "is_configured": true,
00:16:23.178 "data_offset": 0,
00:16:23.178 "data_size": 65536
00:16:23.178 },
00:16:23.178 {
00:16:23.178 "name": null,
00:16:23.178 "uuid": "061e5bab-3dc6-4ebd-a3f7-d88366e509b1",
00:16:23.178 "is_configured": false,
00:16:23.178 "data_offset": 0,
00:16:23.178 "data_size": 65536
00:16:23.178 },
00:16:23.178 {
00:16:23.178 "name": null,
00:16:23.178 "uuid": "5819584a-67f9-4575-9f77-d7f7cfadb3c8",
00:16:23.178 "is_configured": false,
00:16:23.178 "data_offset": 0,
00:16:23.178 "data_size": 65536
00:16:23.178 }
00:16:23.178 ]
00:16:23.178 }'
00:16:23.178 08:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:16:23.178 08:29:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:16:23.745 08:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured'
00:16:23.745 08:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:23.746 08:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]]
00:16:23.746 08:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3
00:16:24.005 [2024-07-23 08:29:36.372492] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:16:24.005 08:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3
00:16:24.005 08:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:16:24.005 08:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:16:24.005 08:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:16:24.005 08:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:16:24.005 08:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:16:24.005 08:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:16:24.005 08:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:16:24.005 08:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:16:24.005 08:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:16:24.005 08:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:24.005 08:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:16:24.264 08:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:16:24.264 "name": "Existed_Raid",
00:16:24.264 "uuid": "00000000-0000-0000-0000-000000000000",
00:16:24.264 "strip_size_kb": 0,
00:16:24.264 "state": "configuring",
00:16:24.264 "raid_level": "raid1",
00:16:24.264 "superblock": false,
00:16:24.264 "num_base_bdevs": 3,
00:16:24.264 "num_base_bdevs_discovered": 2,
00:16:24.264 "num_base_bdevs_operational": 3,
00:16:24.264 "base_bdevs_list": [
00:16:24.264 {
00:16:24.264 "name": "BaseBdev1",
00:16:24.264 "uuid": "d4a07b0b-71e9-4786-8992-af142ea7df67",
00:16:24.264 "is_configured": true,
00:16:24.264 "data_offset": 0,
00:16:24.264 "data_size": 65536
00:16:24.264 },
00:16:24.264 {
00:16:24.264 "name": null,
00:16:24.264 "uuid": "061e5bab-3dc6-4ebd-a3f7-d88366e509b1",
00:16:24.264 "is_configured": false,
00:16:24.264 "data_offset": 0,
00:16:24.264 "data_size": 65536
00:16:24.264 },
00:16:24.264 {
00:16:24.264 "name": "BaseBdev3",
00:16:24.264 "uuid": "5819584a-67f9-4575-9f77-d7f7cfadb3c8",
00:16:24.264 "is_configured": true,
00:16:24.264 "data_offset": 0,
00:16:24.264 "data_size": 65536
00:16:24.264 }
00:16:24.264 ]
00:16:24.264 }'
00:16:24.264 08:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:16:24.264 08:29:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:16:24.522 08:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:24.522 08:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured'
00:16:24.782 08:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]]
00:16:24.782 08:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1
00:16:25.042 [2024-07-23 08:29:37.335064] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1
00:16:25.042 08:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3
00:16:25.042 08:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:16:25.042 08:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:16:25.042 08:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:16:25.042 08:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:16:25.042 08:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:16:25.042 08:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:16:25.042 08:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:16:25.042 08:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:16:25.042 08:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:16:25.042 08:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:25.042 08:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:16:25.301 08:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:16:25.301 "name": "Existed_Raid",
00:16:25.301 "uuid": "00000000-0000-0000-0000-000000000000",
00:16:25.301 "strip_size_kb": 0,
00:16:25.301 "state": "configuring",
00:16:25.301 "raid_level": "raid1",
00:16:25.301 "superblock": false,
00:16:25.301 "num_base_bdevs": 3,
00:16:25.301 "num_base_bdevs_discovered": 1,
00:16:25.301 "num_base_bdevs_operational": 3,
00:16:25.301 "base_bdevs_list": [
00:16:25.301 {
00:16:25.301 "name": null,
00:16:25.301 "uuid": "d4a07b0b-71e9-4786-8992-af142ea7df67",
00:16:25.301 "is_configured": false,
00:16:25.301 "data_offset": 0,
00:16:25.301 "data_size": 65536
00:16:25.301 },
00:16:25.301 {
00:16:25.301 "name": null,
00:16:25.301 "uuid": "061e5bab-3dc6-4ebd-a3f7-d88366e509b1",
00:16:25.301 "is_configured": false,
00:16:25.301 "data_offset": 0,
00:16:25.301 "data_size": 65536
00:16:25.301 },
00:16:25.301 {
00:16:25.301 "name": "BaseBdev3",
00:16:25.301 "uuid": "5819584a-67f9-4575-9f77-d7f7cfadb3c8",
00:16:25.301 "is_configured": true,
00:16:25.301 "data_offset": 0,
00:16:25.301 "data_size": 65536
00:16:25.301 }
00:16:25.301 ]
00:16:25.301 }'
00:16:25.301 08:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:16:25.301 08:29:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:16:25.868 08:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:25.868 08:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured'
00:16:25.868 08:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]]
00:16:25.868 08:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2
00:16:26.127 [2024-07-23 08:29:38.416903] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:16:26.127 08:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3
00:16:26.127 08:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:16:26.127 08:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:16:26.127 08:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:16:26.127 08:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:16:26.127 08:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:16:26.127 08:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:16:26.127 08:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:16:26.127 08:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:16:26.127 08:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:16:26.127 08:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:26.127 08:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:26.127 08:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:26.127 "name": "Existed_Raid", 00:16:26.127 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:26.127 "strip_size_kb": 0, 00:16:26.127 "state": "configuring", 00:16:26.127 "raid_level": "raid1", 00:16:26.127 "superblock": false, 00:16:26.127 "num_base_bdevs": 3, 00:16:26.127 "num_base_bdevs_discovered": 2, 00:16:26.127 "num_base_bdevs_operational": 3, 00:16:26.127 "base_bdevs_list": [ 00:16:26.127 { 00:16:26.127 "name": null, 00:16:26.127 "uuid": "d4a07b0b-71e9-4786-8992-af142ea7df67", 00:16:26.127 "is_configured": false, 00:16:26.127 "data_offset": 0, 00:16:26.127 "data_size": 65536 00:16:26.127 }, 00:16:26.127 { 00:16:26.127 "name": "BaseBdev2", 00:16:26.127 "uuid": "061e5bab-3dc6-4ebd-a3f7-d88366e509b1", 00:16:26.127 "is_configured": true, 00:16:26.127 "data_offset": 0, 00:16:26.127 "data_size": 65536 00:16:26.127 }, 00:16:26.127 { 00:16:26.127 "name": "BaseBdev3", 00:16:26.127 "uuid": "5819584a-67f9-4575-9f77-d7f7cfadb3c8", 00:16:26.127 "is_configured": true, 00:16:26.127 "data_offset": 0, 00:16:26.127 "data_size": 65536 00:16:26.127 } 00:16:26.127 ] 00:16:26.127 }' 00:16:26.127 08:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:26.127 08:29:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:26.693 08:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:26.693 08:29:39 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:26.952 08:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:26.952 08:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:26.952 08:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:26.952 08:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u d4a07b0b-71e9-4786-8992-af142ea7df67 00:16:27.210 [2024-07-23 08:29:39.596248] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:27.210 [2024-07-23 08:29:39.596294] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000036980 00:16:27.210 [2024-07-23 08:29:39.596302] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:16:27.210 [2024-07-23 08:29:39.596535] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c200 00:16:27.210 [2024-07-23 08:29:39.596702] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000036980 00:16:27.210 [2024-07-23 08:29:39.596717] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000036980 00:16:27.210 [2024-07-23 08:29:39.596944] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:27.210 NewBaseBdev 00:16:27.210 08:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:27.210 08:29:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:16:27.210 08:29:39 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:27.210 08:29:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:27.210 08:29:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:27.210 08:29:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:27.210 08:29:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:27.469 08:29:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:27.469 [ 00:16:27.469 { 00:16:27.469 "name": "NewBaseBdev", 00:16:27.469 "aliases": [ 00:16:27.469 "d4a07b0b-71e9-4786-8992-af142ea7df67" 00:16:27.469 ], 00:16:27.469 "product_name": "Malloc disk", 00:16:27.469 "block_size": 512, 00:16:27.469 "num_blocks": 65536, 00:16:27.469 "uuid": "d4a07b0b-71e9-4786-8992-af142ea7df67", 00:16:27.469 "assigned_rate_limits": { 00:16:27.469 "rw_ios_per_sec": 0, 00:16:27.469 "rw_mbytes_per_sec": 0, 00:16:27.469 "r_mbytes_per_sec": 0, 00:16:27.469 "w_mbytes_per_sec": 0 00:16:27.469 }, 00:16:27.469 "claimed": true, 00:16:27.469 "claim_type": "exclusive_write", 00:16:27.469 "zoned": false, 00:16:27.469 "supported_io_types": { 00:16:27.469 "read": true, 00:16:27.469 "write": true, 00:16:27.469 "unmap": true, 00:16:27.469 "flush": true, 00:16:27.469 "reset": true, 00:16:27.469 "nvme_admin": false, 00:16:27.469 "nvme_io": false, 00:16:27.469 "nvme_io_md": false, 00:16:27.469 "write_zeroes": true, 00:16:27.469 "zcopy": true, 00:16:27.469 "get_zone_info": false, 00:16:27.469 "zone_management": false, 00:16:27.469 "zone_append": false, 00:16:27.469 "compare": false, 00:16:27.469 "compare_and_write": false, 00:16:27.469 "abort": true, 
00:16:27.469 "seek_hole": false, 00:16:27.469 "seek_data": false, 00:16:27.469 "copy": true, 00:16:27.469 "nvme_iov_md": false 00:16:27.469 }, 00:16:27.469 "memory_domains": [ 00:16:27.469 { 00:16:27.469 "dma_device_id": "system", 00:16:27.469 "dma_device_type": 1 00:16:27.469 }, 00:16:27.469 { 00:16:27.469 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:27.469 "dma_device_type": 2 00:16:27.469 } 00:16:27.469 ], 00:16:27.469 "driver_specific": {} 00:16:27.469 } 00:16:27.469 ] 00:16:27.469 08:29:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:27.469 08:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:16:27.469 08:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:27.469 08:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:27.469 08:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:27.469 08:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:27.469 08:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:27.469 08:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:27.469 08:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:27.470 08:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:27.470 08:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:27.470 08:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:27.470 08:29:39 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:27.728 08:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:27.728 "name": "Existed_Raid", 00:16:27.728 "uuid": "14b2bd96-71b2-48cd-a3e8-634bf54ec548", 00:16:27.728 "strip_size_kb": 0, 00:16:27.728 "state": "online", 00:16:27.728 "raid_level": "raid1", 00:16:27.728 "superblock": false, 00:16:27.728 "num_base_bdevs": 3, 00:16:27.728 "num_base_bdevs_discovered": 3, 00:16:27.728 "num_base_bdevs_operational": 3, 00:16:27.728 "base_bdevs_list": [ 00:16:27.728 { 00:16:27.728 "name": "NewBaseBdev", 00:16:27.728 "uuid": "d4a07b0b-71e9-4786-8992-af142ea7df67", 00:16:27.728 "is_configured": true, 00:16:27.728 "data_offset": 0, 00:16:27.728 "data_size": 65536 00:16:27.728 }, 00:16:27.728 { 00:16:27.728 "name": "BaseBdev2", 00:16:27.728 "uuid": "061e5bab-3dc6-4ebd-a3f7-d88366e509b1", 00:16:27.728 "is_configured": true, 00:16:27.728 "data_offset": 0, 00:16:27.728 "data_size": 65536 00:16:27.728 }, 00:16:27.728 { 00:16:27.728 "name": "BaseBdev3", 00:16:27.728 "uuid": "5819584a-67f9-4575-9f77-d7f7cfadb3c8", 00:16:27.728 "is_configured": true, 00:16:27.728 "data_offset": 0, 00:16:27.728 "data_size": 65536 00:16:27.728 } 00:16:27.728 ] 00:16:27.728 }' 00:16:27.728 08:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:27.728 08:29:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:28.295 08:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:28.295 08:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:28.295 08:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:28.295 08:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:28.295 08:29:40 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:28.295 08:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:28.295 08:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:28.295 08:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:28.295 [2024-07-23 08:29:40.767699] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:28.295 08:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:28.295 "name": "Existed_Raid", 00:16:28.295 "aliases": [ 00:16:28.295 "14b2bd96-71b2-48cd-a3e8-634bf54ec548" 00:16:28.295 ], 00:16:28.295 "product_name": "Raid Volume", 00:16:28.295 "block_size": 512, 00:16:28.295 "num_blocks": 65536, 00:16:28.295 "uuid": "14b2bd96-71b2-48cd-a3e8-634bf54ec548", 00:16:28.295 "assigned_rate_limits": { 00:16:28.295 "rw_ios_per_sec": 0, 00:16:28.295 "rw_mbytes_per_sec": 0, 00:16:28.295 "r_mbytes_per_sec": 0, 00:16:28.295 "w_mbytes_per_sec": 0 00:16:28.295 }, 00:16:28.295 "claimed": false, 00:16:28.295 "zoned": false, 00:16:28.295 "supported_io_types": { 00:16:28.295 "read": true, 00:16:28.295 "write": true, 00:16:28.295 "unmap": false, 00:16:28.295 "flush": false, 00:16:28.295 "reset": true, 00:16:28.295 "nvme_admin": false, 00:16:28.295 "nvme_io": false, 00:16:28.295 "nvme_io_md": false, 00:16:28.295 "write_zeroes": true, 00:16:28.295 "zcopy": false, 00:16:28.295 "get_zone_info": false, 00:16:28.295 "zone_management": false, 00:16:28.295 "zone_append": false, 00:16:28.295 "compare": false, 00:16:28.295 "compare_and_write": false, 00:16:28.295 "abort": false, 00:16:28.295 "seek_hole": false, 00:16:28.295 "seek_data": false, 00:16:28.295 "copy": false, 00:16:28.295 "nvme_iov_md": false 00:16:28.295 }, 00:16:28.295 
"memory_domains": [ 00:16:28.295 { 00:16:28.295 "dma_device_id": "system", 00:16:28.295 "dma_device_type": 1 00:16:28.295 }, 00:16:28.295 { 00:16:28.295 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:28.295 "dma_device_type": 2 00:16:28.295 }, 00:16:28.295 { 00:16:28.295 "dma_device_id": "system", 00:16:28.295 "dma_device_type": 1 00:16:28.295 }, 00:16:28.295 { 00:16:28.295 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:28.295 "dma_device_type": 2 00:16:28.295 }, 00:16:28.295 { 00:16:28.295 "dma_device_id": "system", 00:16:28.295 "dma_device_type": 1 00:16:28.295 }, 00:16:28.295 { 00:16:28.295 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:28.295 "dma_device_type": 2 00:16:28.295 } 00:16:28.295 ], 00:16:28.295 "driver_specific": { 00:16:28.295 "raid": { 00:16:28.295 "uuid": "14b2bd96-71b2-48cd-a3e8-634bf54ec548", 00:16:28.295 "strip_size_kb": 0, 00:16:28.295 "state": "online", 00:16:28.295 "raid_level": "raid1", 00:16:28.295 "superblock": false, 00:16:28.295 "num_base_bdevs": 3, 00:16:28.295 "num_base_bdevs_discovered": 3, 00:16:28.296 "num_base_bdevs_operational": 3, 00:16:28.296 "base_bdevs_list": [ 00:16:28.296 { 00:16:28.296 "name": "NewBaseBdev", 00:16:28.296 "uuid": "d4a07b0b-71e9-4786-8992-af142ea7df67", 00:16:28.296 "is_configured": true, 00:16:28.296 "data_offset": 0, 00:16:28.296 "data_size": 65536 00:16:28.296 }, 00:16:28.296 { 00:16:28.296 "name": "BaseBdev2", 00:16:28.296 "uuid": "061e5bab-3dc6-4ebd-a3f7-d88366e509b1", 00:16:28.296 "is_configured": true, 00:16:28.296 "data_offset": 0, 00:16:28.296 "data_size": 65536 00:16:28.296 }, 00:16:28.296 { 00:16:28.296 "name": "BaseBdev3", 00:16:28.296 "uuid": "5819584a-67f9-4575-9f77-d7f7cfadb3c8", 00:16:28.296 "is_configured": true, 00:16:28.296 "data_offset": 0, 00:16:28.296 "data_size": 65536 00:16:28.296 } 00:16:28.296 ] 00:16:28.296 } 00:16:28.296 } 00:16:28.296 }' 00:16:28.296 08:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | 
select(.is_configured == true).name' 00:16:28.587 08:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:28.587 BaseBdev2 00:16:28.587 BaseBdev3' 00:16:28.587 08:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:28.587 08:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:28.587 08:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:28.587 08:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:28.587 "name": "NewBaseBdev", 00:16:28.587 "aliases": [ 00:16:28.587 "d4a07b0b-71e9-4786-8992-af142ea7df67" 00:16:28.587 ], 00:16:28.587 "product_name": "Malloc disk", 00:16:28.587 "block_size": 512, 00:16:28.587 "num_blocks": 65536, 00:16:28.587 "uuid": "d4a07b0b-71e9-4786-8992-af142ea7df67", 00:16:28.587 "assigned_rate_limits": { 00:16:28.587 "rw_ios_per_sec": 0, 00:16:28.587 "rw_mbytes_per_sec": 0, 00:16:28.587 "r_mbytes_per_sec": 0, 00:16:28.587 "w_mbytes_per_sec": 0 00:16:28.587 }, 00:16:28.587 "claimed": true, 00:16:28.587 "claim_type": "exclusive_write", 00:16:28.587 "zoned": false, 00:16:28.587 "supported_io_types": { 00:16:28.587 "read": true, 00:16:28.587 "write": true, 00:16:28.587 "unmap": true, 00:16:28.587 "flush": true, 00:16:28.587 "reset": true, 00:16:28.587 "nvme_admin": false, 00:16:28.587 "nvme_io": false, 00:16:28.587 "nvme_io_md": false, 00:16:28.587 "write_zeroes": true, 00:16:28.587 "zcopy": true, 00:16:28.587 "get_zone_info": false, 00:16:28.587 "zone_management": false, 00:16:28.587 "zone_append": false, 00:16:28.587 "compare": false, 00:16:28.587 "compare_and_write": false, 00:16:28.587 "abort": true, 00:16:28.587 "seek_hole": false, 00:16:28.587 "seek_data": false, 00:16:28.587 "copy": true, 00:16:28.587 "nvme_iov_md": 
false 00:16:28.587 }, 00:16:28.587 "memory_domains": [ 00:16:28.587 { 00:16:28.587 "dma_device_id": "system", 00:16:28.587 "dma_device_type": 1 00:16:28.587 }, 00:16:28.587 { 00:16:28.587 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:28.587 "dma_device_type": 2 00:16:28.587 } 00:16:28.587 ], 00:16:28.587 "driver_specific": {} 00:16:28.587 }' 00:16:28.587 08:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:28.587 08:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:28.587 08:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:28.587 08:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:28.587 08:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:28.846 08:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:28.846 08:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:28.846 08:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:28.846 08:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:28.846 08:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:28.846 08:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:28.846 08:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:28.846 08:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:28.846 08:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:28.846 08:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 
00:16:29.105 08:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:29.105 "name": "BaseBdev2", 00:16:29.105 "aliases": [ 00:16:29.105 "061e5bab-3dc6-4ebd-a3f7-d88366e509b1" 00:16:29.105 ], 00:16:29.105 "product_name": "Malloc disk", 00:16:29.105 "block_size": 512, 00:16:29.105 "num_blocks": 65536, 00:16:29.105 "uuid": "061e5bab-3dc6-4ebd-a3f7-d88366e509b1", 00:16:29.105 "assigned_rate_limits": { 00:16:29.105 "rw_ios_per_sec": 0, 00:16:29.105 "rw_mbytes_per_sec": 0, 00:16:29.105 "r_mbytes_per_sec": 0, 00:16:29.105 "w_mbytes_per_sec": 0 00:16:29.105 }, 00:16:29.105 "claimed": true, 00:16:29.105 "claim_type": "exclusive_write", 00:16:29.105 "zoned": false, 00:16:29.105 "supported_io_types": { 00:16:29.105 "read": true, 00:16:29.105 "write": true, 00:16:29.105 "unmap": true, 00:16:29.105 "flush": true, 00:16:29.105 "reset": true, 00:16:29.105 "nvme_admin": false, 00:16:29.105 "nvme_io": false, 00:16:29.105 "nvme_io_md": false, 00:16:29.105 "write_zeroes": true, 00:16:29.105 "zcopy": true, 00:16:29.105 "get_zone_info": false, 00:16:29.105 "zone_management": false, 00:16:29.105 "zone_append": false, 00:16:29.105 "compare": false, 00:16:29.105 "compare_and_write": false, 00:16:29.105 "abort": true, 00:16:29.105 "seek_hole": false, 00:16:29.105 "seek_data": false, 00:16:29.105 "copy": true, 00:16:29.105 "nvme_iov_md": false 00:16:29.105 }, 00:16:29.105 "memory_domains": [ 00:16:29.105 { 00:16:29.105 "dma_device_id": "system", 00:16:29.105 "dma_device_type": 1 00:16:29.105 }, 00:16:29.105 { 00:16:29.105 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:29.105 "dma_device_type": 2 00:16:29.105 } 00:16:29.105 ], 00:16:29.105 "driver_specific": {} 00:16:29.105 }' 00:16:29.105 08:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:29.105 08:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:29.105 08:29:41 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:29.105 08:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:29.105 08:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:29.105 08:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:29.105 08:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:29.364 08:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:29.364 08:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:29.364 08:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:29.364 08:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:29.364 08:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:29.364 08:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:29.364 08:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:29.364 08:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:29.622 08:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:29.622 "name": "BaseBdev3", 00:16:29.622 "aliases": [ 00:16:29.622 "5819584a-67f9-4575-9f77-d7f7cfadb3c8" 00:16:29.622 ], 00:16:29.622 "product_name": "Malloc disk", 00:16:29.622 "block_size": 512, 00:16:29.622 "num_blocks": 65536, 00:16:29.622 "uuid": "5819584a-67f9-4575-9f77-d7f7cfadb3c8", 00:16:29.622 "assigned_rate_limits": { 00:16:29.622 "rw_ios_per_sec": 0, 00:16:29.622 "rw_mbytes_per_sec": 0, 00:16:29.622 "r_mbytes_per_sec": 0, 00:16:29.622 "w_mbytes_per_sec": 0 00:16:29.622 }, 
00:16:29.622 "claimed": true, 00:16:29.622 "claim_type": "exclusive_write", 00:16:29.622 "zoned": false, 00:16:29.622 "supported_io_types": { 00:16:29.622 "read": true, 00:16:29.622 "write": true, 00:16:29.622 "unmap": true, 00:16:29.622 "flush": true, 00:16:29.622 "reset": true, 00:16:29.622 "nvme_admin": false, 00:16:29.622 "nvme_io": false, 00:16:29.622 "nvme_io_md": false, 00:16:29.622 "write_zeroes": true, 00:16:29.622 "zcopy": true, 00:16:29.622 "get_zone_info": false, 00:16:29.622 "zone_management": false, 00:16:29.622 "zone_append": false, 00:16:29.622 "compare": false, 00:16:29.622 "compare_and_write": false, 00:16:29.622 "abort": true, 00:16:29.622 "seek_hole": false, 00:16:29.622 "seek_data": false, 00:16:29.622 "copy": true, 00:16:29.622 "nvme_iov_md": false 00:16:29.622 }, 00:16:29.622 "memory_domains": [ 00:16:29.622 { 00:16:29.622 "dma_device_id": "system", 00:16:29.622 "dma_device_type": 1 00:16:29.622 }, 00:16:29.622 { 00:16:29.622 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:29.622 "dma_device_type": 2 00:16:29.622 } 00:16:29.622 ], 00:16:29.622 "driver_specific": {} 00:16:29.622 }' 00:16:29.622 08:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:29.622 08:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:29.622 08:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:29.622 08:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:29.622 08:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:29.622 08:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:29.622 08:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:29.622 08:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:29.881 08:29:42 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:29.881 08:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:29.881 08:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:29.881 08:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:29.881 08:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:30.139 [2024-07-23 08:29:42.403713] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:30.139 [2024-07-23 08:29:42.403742] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:30.139 [2024-07-23 08:29:42.403814] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:30.139 [2024-07-23 08:29:42.404078] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:30.139 [2024-07-23 08:29:42.404090] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036980 name Existed_Raid, state offline 00:16:30.139 08:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1454128 00:16:30.139 08:29:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1454128 ']' 00:16:30.139 08:29:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1454128 00:16:30.139 08:29:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:16:30.139 08:29:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:30.139 08:29:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1454128 00:16:30.139 08:29:42 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:30.139 08:29:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:30.139 08:29:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1454128' 00:16:30.139 killing process with pid 1454128 00:16:30.139 08:29:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1454128 00:16:30.139 [2024-07-23 08:29:42.461536] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:30.139 08:29:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1454128 00:16:30.398 [2024-07-23 08:29:42.693726] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:31.789 08:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:16:31.789 00:16:31.789 real 0m23.183s 00:16:31.789 user 0m41.382s 00:16:31.789 sys 0m3.540s 00:16:31.789 08:29:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:31.789 08:29:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:31.789 ************************************ 00:16:31.789 END TEST raid_state_function_test 00:16:31.789 ************************************ 00:16:31.789 08:29:43 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:31.789 08:29:43 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 3 true 00:16:31.789 08:29:43 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:31.789 08:29:43 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:31.789 08:29:43 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:31.789 ************************************ 00:16:31.789 START TEST raid_state_function_test_sb 00:16:31.789 ************************************ 00:16:31.789 08:29:43 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 3 true 00:16:31.789 08:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:16:31.789 08:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:16:31.789 08:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:16:31.789 08:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:31.789 08:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:31.789 08:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:31.789 08:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:31.789 08:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:31.789 08:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:31.789 08:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:31.789 08:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:31.789 08:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:31.789 08:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:16:31.789 08:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:31.789 08:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:31.789 08:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:31.789 08:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:31.789 
08:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:31.789 08:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:16:31.789 08:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:31.789 08:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:16:31.789 08:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:16:31.789 08:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:16:31.789 08:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:16:31.789 08:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:16:31.789 08:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1459009 00:16:31.789 08:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1459009' 00:16:31.789 Process raid pid: 1459009 00:16:31.789 08:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:31.789 08:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1459009 /var/tmp/spdk-raid.sock 00:16:31.789 08:29:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1459009 ']' 00:16:31.789 08:29:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:31.789 08:29:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:31.789 08:29:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting 
for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:31.789 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:31.789 08:29:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:31.789 08:29:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:31.789 [2024-07-23 08:29:44.080279] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:16:31.789 [2024-07-23 08:29:44.080362] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:31.789 [2024-07-23 08:29:44.205219] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:32.047 [2024-07-23 08:29:44.421606] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:32.306 [2024-07-23 08:29:44.683968] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:32.306 [2024-07-23 08:29:44.683998] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:32.564 08:29:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:32.564 08:29:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:16:32.564 08:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:32.564 [2024-07-23 08:29:44.998156] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:32.564 [2024-07-23 08:29:44.998198] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:32.564 
[2024-07-23 08:29:44.998208] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:32.564 [2024-07-23 08:29:44.998219] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:32.564 [2024-07-23 08:29:44.998226] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:32.564 [2024-07-23 08:29:44.998234] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:32.564 08:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:32.564 08:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:32.564 08:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:32.564 08:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:32.564 08:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:32.564 08:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:32.564 08:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:32.564 08:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:32.564 08:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:32.564 08:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:32.564 08:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:32.564 08:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r 
'.[] | select(.name == "Existed_Raid")' 00:16:32.823 08:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:32.823 "name": "Existed_Raid", 00:16:32.823 "uuid": "abed5502-ac57-426b-a391-572f5b3d48ee", 00:16:32.823 "strip_size_kb": 0, 00:16:32.823 "state": "configuring", 00:16:32.823 "raid_level": "raid1", 00:16:32.823 "superblock": true, 00:16:32.823 "num_base_bdevs": 3, 00:16:32.823 "num_base_bdevs_discovered": 0, 00:16:32.823 "num_base_bdevs_operational": 3, 00:16:32.823 "base_bdevs_list": [ 00:16:32.823 { 00:16:32.823 "name": "BaseBdev1", 00:16:32.823 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:32.823 "is_configured": false, 00:16:32.823 "data_offset": 0, 00:16:32.823 "data_size": 0 00:16:32.823 }, 00:16:32.823 { 00:16:32.823 "name": "BaseBdev2", 00:16:32.823 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:32.823 "is_configured": false, 00:16:32.823 "data_offset": 0, 00:16:32.823 "data_size": 0 00:16:32.823 }, 00:16:32.823 { 00:16:32.823 "name": "BaseBdev3", 00:16:32.823 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:32.823 "is_configured": false, 00:16:32.823 "data_offset": 0, 00:16:32.823 "data_size": 0 00:16:32.823 } 00:16:32.823 ] 00:16:32.823 }' 00:16:32.823 08:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:32.823 08:29:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:33.390 08:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:33.390 [2024-07-23 08:29:45.820208] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:33.390 [2024-07-23 08:29:45.820244] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034580 name Existed_Raid, state configuring 00:16:33.390 08:29:45 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:33.649 [2024-07-23 08:29:45.988679] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:33.649 [2024-07-23 08:29:45.988721] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:33.649 [2024-07-23 08:29:45.988730] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:33.649 [2024-07-23 08:29:45.988742] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:33.649 [2024-07-23 08:29:45.988748] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:33.649 [2024-07-23 08:29:45.988760] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:33.649 08:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:33.907 [2024-07-23 08:29:46.204975] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:33.907 BaseBdev1 00:16:33.907 08:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:33.907 08:29:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:33.907 08:29:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:33.907 08:29:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:33.907 08:29:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:33.907 08:29:46 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:33.907 08:29:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:33.907 08:29:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:34.167 [ 00:16:34.167 { 00:16:34.167 "name": "BaseBdev1", 00:16:34.167 "aliases": [ 00:16:34.167 "99f183e5-db7c-4111-bc8d-61d7a7b98299" 00:16:34.167 ], 00:16:34.167 "product_name": "Malloc disk", 00:16:34.167 "block_size": 512, 00:16:34.167 "num_blocks": 65536, 00:16:34.167 "uuid": "99f183e5-db7c-4111-bc8d-61d7a7b98299", 00:16:34.167 "assigned_rate_limits": { 00:16:34.167 "rw_ios_per_sec": 0, 00:16:34.167 "rw_mbytes_per_sec": 0, 00:16:34.167 "r_mbytes_per_sec": 0, 00:16:34.167 "w_mbytes_per_sec": 0 00:16:34.167 }, 00:16:34.167 "claimed": true, 00:16:34.167 "claim_type": "exclusive_write", 00:16:34.167 "zoned": false, 00:16:34.167 "supported_io_types": { 00:16:34.167 "read": true, 00:16:34.167 "write": true, 00:16:34.167 "unmap": true, 00:16:34.167 "flush": true, 00:16:34.167 "reset": true, 00:16:34.167 "nvme_admin": false, 00:16:34.167 "nvme_io": false, 00:16:34.167 "nvme_io_md": false, 00:16:34.167 "write_zeroes": true, 00:16:34.167 "zcopy": true, 00:16:34.167 "get_zone_info": false, 00:16:34.167 "zone_management": false, 00:16:34.167 "zone_append": false, 00:16:34.167 "compare": false, 00:16:34.167 "compare_and_write": false, 00:16:34.167 "abort": true, 00:16:34.167 "seek_hole": false, 00:16:34.167 "seek_data": false, 00:16:34.167 "copy": true, 00:16:34.167 "nvme_iov_md": false 00:16:34.167 }, 00:16:34.167 "memory_domains": [ 00:16:34.167 { 00:16:34.167 "dma_device_id": "system", 00:16:34.167 "dma_device_type": 1 00:16:34.167 }, 00:16:34.167 { 00:16:34.167 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:16:34.167 "dma_device_type": 2 00:16:34.167 } 00:16:34.167 ], 00:16:34.167 "driver_specific": {} 00:16:34.167 } 00:16:34.167 ] 00:16:34.167 08:29:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:34.167 08:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:34.167 08:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:34.167 08:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:34.167 08:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:34.167 08:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:34.167 08:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:34.167 08:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:34.167 08:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:34.167 08:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:34.167 08:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:34.167 08:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:34.167 08:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:34.425 08:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:34.425 "name": "Existed_Raid", 00:16:34.425 "uuid": "c93460b7-8bcd-4446-a8cc-7b9bfac883f4", 
00:16:34.425 "strip_size_kb": 0, 00:16:34.425 "state": "configuring", 00:16:34.425 "raid_level": "raid1", 00:16:34.425 "superblock": true, 00:16:34.425 "num_base_bdevs": 3, 00:16:34.425 "num_base_bdevs_discovered": 1, 00:16:34.425 "num_base_bdevs_operational": 3, 00:16:34.425 "base_bdevs_list": [ 00:16:34.425 { 00:16:34.425 "name": "BaseBdev1", 00:16:34.425 "uuid": "99f183e5-db7c-4111-bc8d-61d7a7b98299", 00:16:34.425 "is_configured": true, 00:16:34.425 "data_offset": 2048, 00:16:34.425 "data_size": 63488 00:16:34.425 }, 00:16:34.425 { 00:16:34.425 "name": "BaseBdev2", 00:16:34.425 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:34.425 "is_configured": false, 00:16:34.425 "data_offset": 0, 00:16:34.425 "data_size": 0 00:16:34.425 }, 00:16:34.425 { 00:16:34.425 "name": "BaseBdev3", 00:16:34.425 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:34.425 "is_configured": false, 00:16:34.425 "data_offset": 0, 00:16:34.425 "data_size": 0 00:16:34.425 } 00:16:34.425 ] 00:16:34.425 }' 00:16:34.425 08:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:34.425 08:29:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:34.992 08:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:34.992 [2024-07-23 08:29:47.376108] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:34.992 [2024-07-23 08:29:47.376161] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034880 name Existed_Raid, state configuring 00:16:34.992 08:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:35.251 [2024-07-23 08:29:47.548603] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:35.251 [2024-07-23 08:29:47.550238] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:35.251 [2024-07-23 08:29:47.550275] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:35.251 [2024-07-23 08:29:47.550284] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:35.251 [2024-07-23 08:29:47.550294] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:35.251 08:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:35.251 08:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:35.252 08:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:35.252 08:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:35.252 08:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:35.252 08:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:35.252 08:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:35.252 08:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:35.252 08:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:35.252 08:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:35.252 08:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:35.252 08:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # 
local tmp 00:16:35.252 08:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:35.252 08:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:35.252 08:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:35.252 "name": "Existed_Raid", 00:16:35.252 "uuid": "22086779-9c26-42d3-bf62-d9f4355b4761", 00:16:35.252 "strip_size_kb": 0, 00:16:35.252 "state": "configuring", 00:16:35.252 "raid_level": "raid1", 00:16:35.252 "superblock": true, 00:16:35.252 "num_base_bdevs": 3, 00:16:35.252 "num_base_bdevs_discovered": 1, 00:16:35.252 "num_base_bdevs_operational": 3, 00:16:35.252 "base_bdevs_list": [ 00:16:35.252 { 00:16:35.252 "name": "BaseBdev1", 00:16:35.252 "uuid": "99f183e5-db7c-4111-bc8d-61d7a7b98299", 00:16:35.252 "is_configured": true, 00:16:35.252 "data_offset": 2048, 00:16:35.252 "data_size": 63488 00:16:35.252 }, 00:16:35.252 { 00:16:35.252 "name": "BaseBdev2", 00:16:35.252 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:35.252 "is_configured": false, 00:16:35.252 "data_offset": 0, 00:16:35.252 "data_size": 0 00:16:35.252 }, 00:16:35.252 { 00:16:35.252 "name": "BaseBdev3", 00:16:35.252 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:35.252 "is_configured": false, 00:16:35.252 "data_offset": 0, 00:16:35.252 "data_size": 0 00:16:35.252 } 00:16:35.252 ] 00:16:35.252 }' 00:16:35.252 08:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:35.252 08:29:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:35.818 08:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:36.077 
[2024-07-23 08:29:48.421202] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:36.077 BaseBdev2 00:16:36.077 08:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:36.077 08:29:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:36.077 08:29:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:36.077 08:29:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:36.077 08:29:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:36.077 08:29:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:36.077 08:29:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:36.336 08:29:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:36.336 [ 00:16:36.336 { 00:16:36.336 "name": "BaseBdev2", 00:16:36.336 "aliases": [ 00:16:36.336 "c258da28-7cb9-4720-84f6-e20542d3497f" 00:16:36.336 ], 00:16:36.336 "product_name": "Malloc disk", 00:16:36.336 "block_size": 512, 00:16:36.336 "num_blocks": 65536, 00:16:36.336 "uuid": "c258da28-7cb9-4720-84f6-e20542d3497f", 00:16:36.336 "assigned_rate_limits": { 00:16:36.336 "rw_ios_per_sec": 0, 00:16:36.336 "rw_mbytes_per_sec": 0, 00:16:36.336 "r_mbytes_per_sec": 0, 00:16:36.336 "w_mbytes_per_sec": 0 00:16:36.336 }, 00:16:36.336 "claimed": true, 00:16:36.336 "claim_type": "exclusive_write", 00:16:36.336 "zoned": false, 00:16:36.336 "supported_io_types": { 00:16:36.336 "read": true, 00:16:36.336 "write": true, 00:16:36.336 "unmap": 
true, 00:16:36.336 "flush": true, 00:16:36.336 "reset": true, 00:16:36.336 "nvme_admin": false, 00:16:36.336 "nvme_io": false, 00:16:36.336 "nvme_io_md": false, 00:16:36.336 "write_zeroes": true, 00:16:36.336 "zcopy": true, 00:16:36.336 "get_zone_info": false, 00:16:36.336 "zone_management": false, 00:16:36.336 "zone_append": false, 00:16:36.336 "compare": false, 00:16:36.336 "compare_and_write": false, 00:16:36.336 "abort": true, 00:16:36.336 "seek_hole": false, 00:16:36.336 "seek_data": false, 00:16:36.336 "copy": true, 00:16:36.336 "nvme_iov_md": false 00:16:36.336 }, 00:16:36.336 "memory_domains": [ 00:16:36.336 { 00:16:36.336 "dma_device_id": "system", 00:16:36.336 "dma_device_type": 1 00:16:36.336 }, 00:16:36.336 { 00:16:36.336 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:36.336 "dma_device_type": 2 00:16:36.336 } 00:16:36.336 ], 00:16:36.336 "driver_specific": {} 00:16:36.336 } 00:16:36.336 ] 00:16:36.336 08:29:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:36.336 08:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:36.336 08:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:36.336 08:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:36.336 08:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:36.336 08:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:36.336 08:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:36.336 08:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:36.336 08:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:36.336 
08:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:36.336 08:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:36.336 08:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:36.336 08:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:36.336 08:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:36.336 08:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:36.595 08:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:36.595 "name": "Existed_Raid", 00:16:36.595 "uuid": "22086779-9c26-42d3-bf62-d9f4355b4761", 00:16:36.595 "strip_size_kb": 0, 00:16:36.595 "state": "configuring", 00:16:36.595 "raid_level": "raid1", 00:16:36.595 "superblock": true, 00:16:36.595 "num_base_bdevs": 3, 00:16:36.595 "num_base_bdevs_discovered": 2, 00:16:36.595 "num_base_bdevs_operational": 3, 00:16:36.595 "base_bdevs_list": [ 00:16:36.595 { 00:16:36.595 "name": "BaseBdev1", 00:16:36.595 "uuid": "99f183e5-db7c-4111-bc8d-61d7a7b98299", 00:16:36.595 "is_configured": true, 00:16:36.595 "data_offset": 2048, 00:16:36.595 "data_size": 63488 00:16:36.595 }, 00:16:36.595 { 00:16:36.595 "name": "BaseBdev2", 00:16:36.595 "uuid": "c258da28-7cb9-4720-84f6-e20542d3497f", 00:16:36.595 "is_configured": true, 00:16:36.595 "data_offset": 2048, 00:16:36.595 "data_size": 63488 00:16:36.595 }, 00:16:36.595 { 00:16:36.595 "name": "BaseBdev3", 00:16:36.595 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:36.595 "is_configured": false, 00:16:36.596 "data_offset": 0, 00:16:36.596 "data_size": 0 00:16:36.596 } 00:16:36.596 ] 00:16:36.596 }' 00:16:36.596 
08:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:36.596 08:29:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:37.163 08:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:37.163 [2024-07-23 08:29:49.614149] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:37.163 [2024-07-23 08:29:49.614379] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000035180 00:16:37.163 [2024-07-23 08:29:49.614398] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:16:37.163 [2024-07-23 08:29:49.614648] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bec0 00:16:37.163 [2024-07-23 08:29:49.614864] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000035180 00:16:37.163 [2024-07-23 08:29:49.614875] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000035180 00:16:37.163 [2024-07-23 08:29:49.615025] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:37.164 BaseBdev3 00:16:37.164 08:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:37.164 08:29:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:37.164 08:29:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:37.164 08:29:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:37.164 08:29:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:37.164 08:29:49 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:37.164 08:29:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:37.422 08:29:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:37.681 [ 00:16:37.681 { 00:16:37.681 "name": "BaseBdev3", 00:16:37.681 "aliases": [ 00:16:37.681 "8ca05632-24e9-4fe5-9ec9-5a4e3320d9d8" 00:16:37.681 ], 00:16:37.681 "product_name": "Malloc disk", 00:16:37.681 "block_size": 512, 00:16:37.681 "num_blocks": 65536, 00:16:37.681 "uuid": "8ca05632-24e9-4fe5-9ec9-5a4e3320d9d8", 00:16:37.681 "assigned_rate_limits": { 00:16:37.681 "rw_ios_per_sec": 0, 00:16:37.681 "rw_mbytes_per_sec": 0, 00:16:37.681 "r_mbytes_per_sec": 0, 00:16:37.681 "w_mbytes_per_sec": 0 00:16:37.681 }, 00:16:37.681 "claimed": true, 00:16:37.681 "claim_type": "exclusive_write", 00:16:37.681 "zoned": false, 00:16:37.681 "supported_io_types": { 00:16:37.681 "read": true, 00:16:37.681 "write": true, 00:16:37.681 "unmap": true, 00:16:37.681 "flush": true, 00:16:37.681 "reset": true, 00:16:37.681 "nvme_admin": false, 00:16:37.681 "nvme_io": false, 00:16:37.681 "nvme_io_md": false, 00:16:37.681 "write_zeroes": true, 00:16:37.681 "zcopy": true, 00:16:37.681 "get_zone_info": false, 00:16:37.681 "zone_management": false, 00:16:37.681 "zone_append": false, 00:16:37.681 "compare": false, 00:16:37.681 "compare_and_write": false, 00:16:37.681 "abort": true, 00:16:37.681 "seek_hole": false, 00:16:37.681 "seek_data": false, 00:16:37.681 "copy": true, 00:16:37.681 "nvme_iov_md": false 00:16:37.681 }, 00:16:37.681 "memory_domains": [ 00:16:37.681 { 00:16:37.681 "dma_device_id": "system", 00:16:37.681 "dma_device_type": 1 00:16:37.681 }, 00:16:37.681 { 00:16:37.681 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:16:37.681 "dma_device_type": 2 00:16:37.681 } 00:16:37.681 ], 00:16:37.681 "driver_specific": {} 00:16:37.681 } 00:16:37.681 ] 00:16:37.681 08:29:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:37.681 08:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:37.681 08:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:37.681 08:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:16:37.681 08:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:37.681 08:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:37.681 08:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:37.681 08:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:37.681 08:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:37.681 08:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:37.681 08:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:37.682 08:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:37.682 08:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:37.682 08:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:37.682 08:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:16:37.682 08:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:37.682 "name": "Existed_Raid", 00:16:37.682 "uuid": "22086779-9c26-42d3-bf62-d9f4355b4761", 00:16:37.682 "strip_size_kb": 0, 00:16:37.682 "state": "online", 00:16:37.682 "raid_level": "raid1", 00:16:37.682 "superblock": true, 00:16:37.682 "num_base_bdevs": 3, 00:16:37.682 "num_base_bdevs_discovered": 3, 00:16:37.682 "num_base_bdevs_operational": 3, 00:16:37.682 "base_bdevs_list": [ 00:16:37.682 { 00:16:37.682 "name": "BaseBdev1", 00:16:37.682 "uuid": "99f183e5-db7c-4111-bc8d-61d7a7b98299", 00:16:37.682 "is_configured": true, 00:16:37.682 "data_offset": 2048, 00:16:37.682 "data_size": 63488 00:16:37.682 }, 00:16:37.682 { 00:16:37.682 "name": "BaseBdev2", 00:16:37.682 "uuid": "c258da28-7cb9-4720-84f6-e20542d3497f", 00:16:37.682 "is_configured": true, 00:16:37.682 "data_offset": 2048, 00:16:37.682 "data_size": 63488 00:16:37.682 }, 00:16:37.682 { 00:16:37.682 "name": "BaseBdev3", 00:16:37.682 "uuid": "8ca05632-24e9-4fe5-9ec9-5a4e3320d9d8", 00:16:37.682 "is_configured": true, 00:16:37.682 "data_offset": 2048, 00:16:37.682 "data_size": 63488 00:16:37.682 } 00:16:37.682 ] 00:16:37.682 }' 00:16:37.682 08:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:37.682 08:29:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:38.248 08:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:38.248 08:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:38.248 08:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:38.248 08:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:38.248 08:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_names 00:16:38.248 08:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:38.248 08:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:38.248 08:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:38.507 [2024-07-23 08:29:50.789588] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:38.507 08:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:38.507 "name": "Existed_Raid", 00:16:38.507 "aliases": [ 00:16:38.507 "22086779-9c26-42d3-bf62-d9f4355b4761" 00:16:38.507 ], 00:16:38.507 "product_name": "Raid Volume", 00:16:38.507 "block_size": 512, 00:16:38.507 "num_blocks": 63488, 00:16:38.507 "uuid": "22086779-9c26-42d3-bf62-d9f4355b4761", 00:16:38.507 "assigned_rate_limits": { 00:16:38.507 "rw_ios_per_sec": 0, 00:16:38.507 "rw_mbytes_per_sec": 0, 00:16:38.507 "r_mbytes_per_sec": 0, 00:16:38.507 "w_mbytes_per_sec": 0 00:16:38.507 }, 00:16:38.507 "claimed": false, 00:16:38.507 "zoned": false, 00:16:38.507 "supported_io_types": { 00:16:38.507 "read": true, 00:16:38.507 "write": true, 00:16:38.507 "unmap": false, 00:16:38.507 "flush": false, 00:16:38.507 "reset": true, 00:16:38.507 "nvme_admin": false, 00:16:38.507 "nvme_io": false, 00:16:38.507 "nvme_io_md": false, 00:16:38.507 "write_zeroes": true, 00:16:38.507 "zcopy": false, 00:16:38.507 "get_zone_info": false, 00:16:38.507 "zone_management": false, 00:16:38.507 "zone_append": false, 00:16:38.507 "compare": false, 00:16:38.507 "compare_and_write": false, 00:16:38.507 "abort": false, 00:16:38.507 "seek_hole": false, 00:16:38.507 "seek_data": false, 00:16:38.508 "copy": false, 00:16:38.508 "nvme_iov_md": false 00:16:38.508 }, 00:16:38.508 "memory_domains": [ 00:16:38.508 { 00:16:38.508 "dma_device_id": "system", 
00:16:38.508 "dma_device_type": 1 00:16:38.508 }, 00:16:38.508 { 00:16:38.508 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:38.508 "dma_device_type": 2 00:16:38.508 }, 00:16:38.508 { 00:16:38.508 "dma_device_id": "system", 00:16:38.508 "dma_device_type": 1 00:16:38.508 }, 00:16:38.508 { 00:16:38.508 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:38.508 "dma_device_type": 2 00:16:38.508 }, 00:16:38.508 { 00:16:38.508 "dma_device_id": "system", 00:16:38.508 "dma_device_type": 1 00:16:38.508 }, 00:16:38.508 { 00:16:38.508 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:38.508 "dma_device_type": 2 00:16:38.508 } 00:16:38.508 ], 00:16:38.508 "driver_specific": { 00:16:38.508 "raid": { 00:16:38.508 "uuid": "22086779-9c26-42d3-bf62-d9f4355b4761", 00:16:38.508 "strip_size_kb": 0, 00:16:38.508 "state": "online", 00:16:38.508 "raid_level": "raid1", 00:16:38.508 "superblock": true, 00:16:38.508 "num_base_bdevs": 3, 00:16:38.508 "num_base_bdevs_discovered": 3, 00:16:38.508 "num_base_bdevs_operational": 3, 00:16:38.508 "base_bdevs_list": [ 00:16:38.508 { 00:16:38.508 "name": "BaseBdev1", 00:16:38.508 "uuid": "99f183e5-db7c-4111-bc8d-61d7a7b98299", 00:16:38.508 "is_configured": true, 00:16:38.508 "data_offset": 2048, 00:16:38.508 "data_size": 63488 00:16:38.508 }, 00:16:38.508 { 00:16:38.508 "name": "BaseBdev2", 00:16:38.508 "uuid": "c258da28-7cb9-4720-84f6-e20542d3497f", 00:16:38.508 "is_configured": true, 00:16:38.508 "data_offset": 2048, 00:16:38.508 "data_size": 63488 00:16:38.508 }, 00:16:38.508 { 00:16:38.508 "name": "BaseBdev3", 00:16:38.508 "uuid": "8ca05632-24e9-4fe5-9ec9-5a4e3320d9d8", 00:16:38.508 "is_configured": true, 00:16:38.508 "data_offset": 2048, 00:16:38.508 "data_size": 63488 00:16:38.508 } 00:16:38.508 ] 00:16:38.508 } 00:16:38.508 } 00:16:38.508 }' 00:16:38.508 08:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:38.508 08:29:50 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:38.508 BaseBdev2 00:16:38.508 BaseBdev3' 00:16:38.508 08:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:38.508 08:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:38.508 08:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:38.508 08:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:38.508 "name": "BaseBdev1", 00:16:38.508 "aliases": [ 00:16:38.508 "99f183e5-db7c-4111-bc8d-61d7a7b98299" 00:16:38.508 ], 00:16:38.508 "product_name": "Malloc disk", 00:16:38.508 "block_size": 512, 00:16:38.508 "num_blocks": 65536, 00:16:38.508 "uuid": "99f183e5-db7c-4111-bc8d-61d7a7b98299", 00:16:38.508 "assigned_rate_limits": { 00:16:38.508 "rw_ios_per_sec": 0, 00:16:38.508 "rw_mbytes_per_sec": 0, 00:16:38.508 "r_mbytes_per_sec": 0, 00:16:38.508 "w_mbytes_per_sec": 0 00:16:38.508 }, 00:16:38.508 "claimed": true, 00:16:38.508 "claim_type": "exclusive_write", 00:16:38.508 "zoned": false, 00:16:38.508 "supported_io_types": { 00:16:38.508 "read": true, 00:16:38.508 "write": true, 00:16:38.508 "unmap": true, 00:16:38.508 "flush": true, 00:16:38.508 "reset": true, 00:16:38.508 "nvme_admin": false, 00:16:38.508 "nvme_io": false, 00:16:38.508 "nvme_io_md": false, 00:16:38.508 "write_zeroes": true, 00:16:38.508 "zcopy": true, 00:16:38.508 "get_zone_info": false, 00:16:38.508 "zone_management": false, 00:16:38.508 "zone_append": false, 00:16:38.508 "compare": false, 00:16:38.508 "compare_and_write": false, 00:16:38.508 "abort": true, 00:16:38.508 "seek_hole": false, 00:16:38.508 "seek_data": false, 00:16:38.508 "copy": true, 00:16:38.508 "nvme_iov_md": false 00:16:38.508 }, 00:16:38.508 "memory_domains": 
[ 00:16:38.508 { 00:16:38.508 "dma_device_id": "system", 00:16:38.508 "dma_device_type": 1 00:16:38.508 }, 00:16:38.508 { 00:16:38.508 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:38.508 "dma_device_type": 2 00:16:38.508 } 00:16:38.508 ], 00:16:38.508 "driver_specific": {} 00:16:38.508 }' 00:16:38.508 08:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:38.767 08:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:38.767 08:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:38.767 08:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:38.767 08:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:38.767 08:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:38.767 08:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:38.767 08:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:38.767 08:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:38.767 08:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:38.767 08:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:39.027 08:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:39.027 08:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:39.027 08:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:39.027 08:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:16:39.027 08:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:39.027 "name": "BaseBdev2", 00:16:39.027 "aliases": [ 00:16:39.027 "c258da28-7cb9-4720-84f6-e20542d3497f" 00:16:39.027 ], 00:16:39.027 "product_name": "Malloc disk", 00:16:39.027 "block_size": 512, 00:16:39.027 "num_blocks": 65536, 00:16:39.027 "uuid": "c258da28-7cb9-4720-84f6-e20542d3497f", 00:16:39.027 "assigned_rate_limits": { 00:16:39.027 "rw_ios_per_sec": 0, 00:16:39.027 "rw_mbytes_per_sec": 0, 00:16:39.027 "r_mbytes_per_sec": 0, 00:16:39.027 "w_mbytes_per_sec": 0 00:16:39.027 }, 00:16:39.027 "claimed": true, 00:16:39.027 "claim_type": "exclusive_write", 00:16:39.027 "zoned": false, 00:16:39.027 "supported_io_types": { 00:16:39.027 "read": true, 00:16:39.027 "write": true, 00:16:39.027 "unmap": true, 00:16:39.027 "flush": true, 00:16:39.027 "reset": true, 00:16:39.027 "nvme_admin": false, 00:16:39.027 "nvme_io": false, 00:16:39.027 "nvme_io_md": false, 00:16:39.027 "write_zeroes": true, 00:16:39.027 "zcopy": true, 00:16:39.027 "get_zone_info": false, 00:16:39.027 "zone_management": false, 00:16:39.027 "zone_append": false, 00:16:39.027 "compare": false, 00:16:39.027 "compare_and_write": false, 00:16:39.027 "abort": true, 00:16:39.027 "seek_hole": false, 00:16:39.027 "seek_data": false, 00:16:39.027 "copy": true, 00:16:39.027 "nvme_iov_md": false 00:16:39.027 }, 00:16:39.027 "memory_domains": [ 00:16:39.027 { 00:16:39.027 "dma_device_id": "system", 00:16:39.027 "dma_device_type": 1 00:16:39.027 }, 00:16:39.027 { 00:16:39.027 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:39.027 "dma_device_type": 2 00:16:39.027 } 00:16:39.027 ], 00:16:39.027 "driver_specific": {} 00:16:39.027 }' 00:16:39.027 08:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:39.027 08:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:39.286 08:29:51 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:39.286 08:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:39.286 08:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:39.286 08:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:39.286 08:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:39.286 08:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:39.286 08:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:39.286 08:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:39.286 08:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:39.286 08:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:39.287 08:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:39.287 08:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:39.287 08:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:39.545 08:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:39.545 "name": "BaseBdev3", 00:16:39.545 "aliases": [ 00:16:39.545 "8ca05632-24e9-4fe5-9ec9-5a4e3320d9d8" 00:16:39.545 ], 00:16:39.545 "product_name": "Malloc disk", 00:16:39.545 "block_size": 512, 00:16:39.545 "num_blocks": 65536, 00:16:39.545 "uuid": "8ca05632-24e9-4fe5-9ec9-5a4e3320d9d8", 00:16:39.545 "assigned_rate_limits": { 00:16:39.545 "rw_ios_per_sec": 0, 00:16:39.545 "rw_mbytes_per_sec": 0, 00:16:39.545 "r_mbytes_per_sec": 0, 00:16:39.545 
"w_mbytes_per_sec": 0 00:16:39.545 }, 00:16:39.545 "claimed": true, 00:16:39.545 "claim_type": "exclusive_write", 00:16:39.545 "zoned": false, 00:16:39.545 "supported_io_types": { 00:16:39.545 "read": true, 00:16:39.545 "write": true, 00:16:39.545 "unmap": true, 00:16:39.545 "flush": true, 00:16:39.545 "reset": true, 00:16:39.545 "nvme_admin": false, 00:16:39.545 "nvme_io": false, 00:16:39.545 "nvme_io_md": false, 00:16:39.545 "write_zeroes": true, 00:16:39.545 "zcopy": true, 00:16:39.546 "get_zone_info": false, 00:16:39.546 "zone_management": false, 00:16:39.546 "zone_append": false, 00:16:39.546 "compare": false, 00:16:39.546 "compare_and_write": false, 00:16:39.546 "abort": true, 00:16:39.546 "seek_hole": false, 00:16:39.546 "seek_data": false, 00:16:39.546 "copy": true, 00:16:39.546 "nvme_iov_md": false 00:16:39.546 }, 00:16:39.546 "memory_domains": [ 00:16:39.546 { 00:16:39.546 "dma_device_id": "system", 00:16:39.546 "dma_device_type": 1 00:16:39.546 }, 00:16:39.546 { 00:16:39.546 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:39.546 "dma_device_type": 2 00:16:39.546 } 00:16:39.546 ], 00:16:39.546 "driver_specific": {} 00:16:39.546 }' 00:16:39.546 08:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:39.546 08:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:39.546 08:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:39.546 08:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:39.804 08:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:39.804 08:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:39.804 08:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:39.804 08:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:16:39.804 08:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:39.804 08:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:39.804 08:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:39.804 08:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:39.804 08:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:40.062 [2024-07-23 08:29:52.445746] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:40.062 08:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:40.062 08:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:16:40.062 08:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:40.062 08:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:16:40.062 08:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:16:40.062 08:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:16:40.062 08:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:40.062 08:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:40.062 08:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:40.063 08:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:40.063 08:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:16:40.063 08:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:40.063 08:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:40.063 08:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:40.063 08:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:40.063 08:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:40.063 08:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:40.320 08:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:40.320 "name": "Existed_Raid", 00:16:40.320 "uuid": "22086779-9c26-42d3-bf62-d9f4355b4761", 00:16:40.320 "strip_size_kb": 0, 00:16:40.320 "state": "online", 00:16:40.320 "raid_level": "raid1", 00:16:40.320 "superblock": true, 00:16:40.320 "num_base_bdevs": 3, 00:16:40.320 "num_base_bdevs_discovered": 2, 00:16:40.320 "num_base_bdevs_operational": 2, 00:16:40.320 "base_bdevs_list": [ 00:16:40.320 { 00:16:40.320 "name": null, 00:16:40.321 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:40.321 "is_configured": false, 00:16:40.321 "data_offset": 2048, 00:16:40.321 "data_size": 63488 00:16:40.321 }, 00:16:40.321 { 00:16:40.321 "name": "BaseBdev2", 00:16:40.321 "uuid": "c258da28-7cb9-4720-84f6-e20542d3497f", 00:16:40.321 "is_configured": true, 00:16:40.321 "data_offset": 2048, 00:16:40.321 "data_size": 63488 00:16:40.321 }, 00:16:40.321 { 00:16:40.321 "name": "BaseBdev3", 00:16:40.321 "uuid": "8ca05632-24e9-4fe5-9ec9-5a4e3320d9d8", 00:16:40.321 "is_configured": true, 00:16:40.321 "data_offset": 2048, 00:16:40.321 "data_size": 63488 00:16:40.321 } 
00:16:40.321 ] 00:16:40.321 }' 00:16:40.321 08:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:40.321 08:29:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:40.889 08:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:40.889 08:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:40.889 08:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:40.889 08:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:40.889 08:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:40.889 08:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:40.889 08:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:41.148 [2024-07-23 08:29:53.456600] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:41.148 08:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:41.148 08:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:41.148 08:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:41.148 08:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:41.406 08:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:41.406 08:29:53 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:41.406 08:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:41.406 [2024-07-23 08:29:53.889242] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:41.406 [2024-07-23 08:29:53.889346] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:41.664 [2024-07-23 08:29:53.983870] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:41.664 [2024-07-23 08:29:53.983921] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:41.664 [2024-07-23 08:29:53.983933] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000035180 name Existed_Raid, state offline 00:16:41.664 08:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:41.664 08:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:41.664 08:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:41.664 08:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:41.664 08:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:41.664 08:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:41.664 08:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:16:41.664 08:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:41.664 08:29:54 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:41.665 08:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:41.923 BaseBdev2 00:16:41.923 08:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:41.923 08:29:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:41.923 08:29:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:41.923 08:29:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:41.923 08:29:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:41.923 08:29:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:41.923 08:29:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:42.182 08:29:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:42.440 [ 00:16:42.440 { 00:16:42.440 "name": "BaseBdev2", 00:16:42.440 "aliases": [ 00:16:42.440 "c7fd3fc4-63bb-43da-bab9-f6a13a3f981c" 00:16:42.440 ], 00:16:42.440 "product_name": "Malloc disk", 00:16:42.440 "block_size": 512, 00:16:42.440 "num_blocks": 65536, 00:16:42.440 "uuid": "c7fd3fc4-63bb-43da-bab9-f6a13a3f981c", 00:16:42.440 "assigned_rate_limits": { 00:16:42.440 "rw_ios_per_sec": 0, 00:16:42.440 "rw_mbytes_per_sec": 0, 00:16:42.441 "r_mbytes_per_sec": 0, 00:16:42.441 "w_mbytes_per_sec": 0 00:16:42.441 }, 00:16:42.441 "claimed": false, 00:16:42.441 "zoned": 
false, 00:16:42.441 "supported_io_types": { 00:16:42.441 "read": true, 00:16:42.441 "write": true, 00:16:42.441 "unmap": true, 00:16:42.441 "flush": true, 00:16:42.441 "reset": true, 00:16:42.441 "nvme_admin": false, 00:16:42.441 "nvme_io": false, 00:16:42.441 "nvme_io_md": false, 00:16:42.441 "write_zeroes": true, 00:16:42.441 "zcopy": true, 00:16:42.441 "get_zone_info": false, 00:16:42.441 "zone_management": false, 00:16:42.441 "zone_append": false, 00:16:42.441 "compare": false, 00:16:42.441 "compare_and_write": false, 00:16:42.441 "abort": true, 00:16:42.441 "seek_hole": false, 00:16:42.441 "seek_data": false, 00:16:42.441 "copy": true, 00:16:42.441 "nvme_iov_md": false 00:16:42.441 }, 00:16:42.441 "memory_domains": [ 00:16:42.441 { 00:16:42.441 "dma_device_id": "system", 00:16:42.441 "dma_device_type": 1 00:16:42.441 }, 00:16:42.441 { 00:16:42.441 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:42.441 "dma_device_type": 2 00:16:42.441 } 00:16:42.441 ], 00:16:42.441 "driver_specific": {} 00:16:42.441 } 00:16:42.441 ] 00:16:42.441 08:29:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:42.441 08:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:42.441 08:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:42.441 08:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:42.441 BaseBdev3 00:16:42.441 08:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:42.441 08:29:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:42.441 08:29:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:42.441 08:29:54 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:42.441 08:29:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:42.441 08:29:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:42.441 08:29:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:42.699 08:29:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:42.976 [ 00:16:42.977 { 00:16:42.977 "name": "BaseBdev3", 00:16:42.977 "aliases": [ 00:16:42.977 "22c2e093-91ae-4a32-bf17-9df306a68ed4" 00:16:42.977 ], 00:16:42.977 "product_name": "Malloc disk", 00:16:42.977 "block_size": 512, 00:16:42.977 "num_blocks": 65536, 00:16:42.977 "uuid": "22c2e093-91ae-4a32-bf17-9df306a68ed4", 00:16:42.977 "assigned_rate_limits": { 00:16:42.977 "rw_ios_per_sec": 0, 00:16:42.977 "rw_mbytes_per_sec": 0, 00:16:42.977 "r_mbytes_per_sec": 0, 00:16:42.977 "w_mbytes_per_sec": 0 00:16:42.977 }, 00:16:42.977 "claimed": false, 00:16:42.977 "zoned": false, 00:16:42.977 "supported_io_types": { 00:16:42.977 "read": true, 00:16:42.977 "write": true, 00:16:42.977 "unmap": true, 00:16:42.977 "flush": true, 00:16:42.977 "reset": true, 00:16:42.977 "nvme_admin": false, 00:16:42.977 "nvme_io": false, 00:16:42.977 "nvme_io_md": false, 00:16:42.977 "write_zeroes": true, 00:16:42.977 "zcopy": true, 00:16:42.977 "get_zone_info": false, 00:16:42.977 "zone_management": false, 00:16:42.977 "zone_append": false, 00:16:42.977 "compare": false, 00:16:42.977 "compare_and_write": false, 00:16:42.977 "abort": true, 00:16:42.977 "seek_hole": false, 00:16:42.977 "seek_data": false, 00:16:42.977 "copy": true, 00:16:42.977 "nvme_iov_md": 
false 00:16:42.977 }, 00:16:42.977 "memory_domains": [ 00:16:42.977 { 00:16:42.977 "dma_device_id": "system", 00:16:42.977 "dma_device_type": 1 00:16:42.977 }, 00:16:42.977 { 00:16:42.977 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:42.977 "dma_device_type": 2 00:16:42.977 } 00:16:42.977 ], 00:16:42.977 "driver_specific": {} 00:16:42.977 } 00:16:42.977 ] 00:16:42.977 08:29:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:42.977 08:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:42.977 08:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:42.977 08:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:42.977 [2024-07-23 08:29:55.416554] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:42.977 [2024-07-23 08:29:55.416597] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:42.977 [2024-07-23 08:29:55.416630] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:42.977 [2024-07-23 08:29:55.418256] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:42.977 08:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:42.977 08:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:42.977 08:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:42.977 08:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:42.977 08:29:55 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:42.977 08:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:42.977 08:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:42.977 08:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:42.977 08:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:42.977 08:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:42.977 08:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:42.977 08:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:43.252 08:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:43.252 "name": "Existed_Raid", 00:16:43.252 "uuid": "5a7fe7fa-441e-4c3f-abf4-44a808b322fc", 00:16:43.252 "strip_size_kb": 0, 00:16:43.252 "state": "configuring", 00:16:43.252 "raid_level": "raid1", 00:16:43.252 "superblock": true, 00:16:43.252 "num_base_bdevs": 3, 00:16:43.252 "num_base_bdevs_discovered": 2, 00:16:43.252 "num_base_bdevs_operational": 3, 00:16:43.252 "base_bdevs_list": [ 00:16:43.252 { 00:16:43.252 "name": "BaseBdev1", 00:16:43.252 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:43.252 "is_configured": false, 00:16:43.252 "data_offset": 0, 00:16:43.252 "data_size": 0 00:16:43.252 }, 00:16:43.252 { 00:16:43.252 "name": "BaseBdev2", 00:16:43.252 "uuid": "c7fd3fc4-63bb-43da-bab9-f6a13a3f981c", 00:16:43.252 "is_configured": true, 00:16:43.252 "data_offset": 2048, 00:16:43.252 "data_size": 63488 00:16:43.252 }, 00:16:43.252 { 00:16:43.252 "name": "BaseBdev3", 
00:16:43.252 "uuid": "22c2e093-91ae-4a32-bf17-9df306a68ed4", 00:16:43.252 "is_configured": true, 00:16:43.252 "data_offset": 2048, 00:16:43.252 "data_size": 63488 00:16:43.252 } 00:16:43.252 ] 00:16:43.252 }' 00:16:43.252 08:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:43.252 08:29:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:43.819 08:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:43.819 [2024-07-23 08:29:56.230690] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:43.819 08:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:43.819 08:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:43.819 08:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:43.819 08:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:43.819 08:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:43.819 08:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:43.819 08:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:43.819 08:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:43.819 08:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:43.819 08:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:43.819 08:29:56 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:43.819 08:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:44.078 08:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:44.078 "name": "Existed_Raid", 00:16:44.078 "uuid": "5a7fe7fa-441e-4c3f-abf4-44a808b322fc", 00:16:44.078 "strip_size_kb": 0, 00:16:44.078 "state": "configuring", 00:16:44.078 "raid_level": "raid1", 00:16:44.078 "superblock": true, 00:16:44.078 "num_base_bdevs": 3, 00:16:44.078 "num_base_bdevs_discovered": 1, 00:16:44.078 "num_base_bdevs_operational": 3, 00:16:44.078 "base_bdevs_list": [ 00:16:44.078 { 00:16:44.078 "name": "BaseBdev1", 00:16:44.078 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:44.078 "is_configured": false, 00:16:44.078 "data_offset": 0, 00:16:44.078 "data_size": 0 00:16:44.078 }, 00:16:44.078 { 00:16:44.078 "name": null, 00:16:44.078 "uuid": "c7fd3fc4-63bb-43da-bab9-f6a13a3f981c", 00:16:44.078 "is_configured": false, 00:16:44.078 "data_offset": 2048, 00:16:44.078 "data_size": 63488 00:16:44.078 }, 00:16:44.078 { 00:16:44.078 "name": "BaseBdev3", 00:16:44.078 "uuid": "22c2e093-91ae-4a32-bf17-9df306a68ed4", 00:16:44.078 "is_configured": true, 00:16:44.078 "data_offset": 2048, 00:16:44.078 "data_size": 63488 00:16:44.078 } 00:16:44.078 ] 00:16:44.078 }' 00:16:44.078 08:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:44.078 08:29:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:44.645 08:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:44.645 08:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq 
'.[0].base_bdevs_list[1].is_configured' 00:16:44.645 08:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:16:44.645 08:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:44.903 [2024-07-23 08:29:57.277394] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:44.903 BaseBdev1 00:16:44.903 08:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:44.903 08:29:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:44.903 08:29:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:44.903 08:29:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:44.903 08:29:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:44.903 08:29:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:44.903 08:29:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:45.162 08:29:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:45.162 [ 00:16:45.162 { 00:16:45.162 "name": "BaseBdev1", 00:16:45.162 "aliases": [ 00:16:45.162 "6491b52d-5d43-45d0-b7c0-fe7c91f00ca0" 00:16:45.162 ], 00:16:45.162 "product_name": "Malloc disk", 00:16:45.162 "block_size": 512, 00:16:45.162 "num_blocks": 65536, 00:16:45.162 "uuid": "6491b52d-5d43-45d0-b7c0-fe7c91f00ca0", 00:16:45.162 
"assigned_rate_limits": { 00:16:45.162 "rw_ios_per_sec": 0, 00:16:45.162 "rw_mbytes_per_sec": 0, 00:16:45.162 "r_mbytes_per_sec": 0, 00:16:45.162 "w_mbytes_per_sec": 0 00:16:45.162 }, 00:16:45.162 "claimed": true, 00:16:45.162 "claim_type": "exclusive_write", 00:16:45.162 "zoned": false, 00:16:45.162 "supported_io_types": { 00:16:45.162 "read": true, 00:16:45.162 "write": true, 00:16:45.162 "unmap": true, 00:16:45.162 "flush": true, 00:16:45.162 "reset": true, 00:16:45.162 "nvme_admin": false, 00:16:45.162 "nvme_io": false, 00:16:45.162 "nvme_io_md": false, 00:16:45.162 "write_zeroes": true, 00:16:45.162 "zcopy": true, 00:16:45.162 "get_zone_info": false, 00:16:45.162 "zone_management": false, 00:16:45.162 "zone_append": false, 00:16:45.162 "compare": false, 00:16:45.162 "compare_and_write": false, 00:16:45.162 "abort": true, 00:16:45.162 "seek_hole": false, 00:16:45.162 "seek_data": false, 00:16:45.162 "copy": true, 00:16:45.162 "nvme_iov_md": false 00:16:45.162 }, 00:16:45.162 "memory_domains": [ 00:16:45.162 { 00:16:45.162 "dma_device_id": "system", 00:16:45.162 "dma_device_type": 1 00:16:45.162 }, 00:16:45.162 { 00:16:45.162 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:45.162 "dma_device_type": 2 00:16:45.162 } 00:16:45.162 ], 00:16:45.162 "driver_specific": {} 00:16:45.162 } 00:16:45.162 ] 00:16:45.162 08:29:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:45.162 08:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:45.162 08:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:45.162 08:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:45.162 08:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:45.162 08:29:57 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:45.162 08:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:45.162 08:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:45.162 08:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:45.162 08:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:45.162 08:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:45.162 08:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:45.162 08:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:45.421 08:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:45.422 "name": "Existed_Raid", 00:16:45.422 "uuid": "5a7fe7fa-441e-4c3f-abf4-44a808b322fc", 00:16:45.422 "strip_size_kb": 0, 00:16:45.422 "state": "configuring", 00:16:45.422 "raid_level": "raid1", 00:16:45.422 "superblock": true, 00:16:45.422 "num_base_bdevs": 3, 00:16:45.422 "num_base_bdevs_discovered": 2, 00:16:45.422 "num_base_bdevs_operational": 3, 00:16:45.422 "base_bdevs_list": [ 00:16:45.422 { 00:16:45.422 "name": "BaseBdev1", 00:16:45.422 "uuid": "6491b52d-5d43-45d0-b7c0-fe7c91f00ca0", 00:16:45.422 "is_configured": true, 00:16:45.422 "data_offset": 2048, 00:16:45.422 "data_size": 63488 00:16:45.422 }, 00:16:45.422 { 00:16:45.422 "name": null, 00:16:45.422 "uuid": "c7fd3fc4-63bb-43da-bab9-f6a13a3f981c", 00:16:45.422 "is_configured": false, 00:16:45.422 "data_offset": 2048, 00:16:45.422 "data_size": 63488 00:16:45.422 }, 00:16:45.422 { 00:16:45.422 "name": "BaseBdev3", 00:16:45.422 "uuid": 
"22c2e093-91ae-4a32-bf17-9df306a68ed4", 00:16:45.422 "is_configured": true, 00:16:45.422 "data_offset": 2048, 00:16:45.422 "data_size": 63488 00:16:45.422 } 00:16:45.422 ] 00:16:45.422 }' 00:16:45.422 08:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:45.422 08:29:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:45.990 08:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:45.990 08:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:45.990 08:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:45.990 08:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:46.248 [2024-07-23 08:29:58.625016] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:46.248 08:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:46.248 08:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:46.248 08:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:46.248 08:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:46.248 08:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:46.248 08:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:46.248 08:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 
-- # local raid_bdev_info 00:16:46.248 08:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:46.248 08:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:46.248 08:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:46.248 08:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:46.248 08:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:46.506 08:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:46.506 "name": "Existed_Raid", 00:16:46.506 "uuid": "5a7fe7fa-441e-4c3f-abf4-44a808b322fc", 00:16:46.506 "strip_size_kb": 0, 00:16:46.506 "state": "configuring", 00:16:46.506 "raid_level": "raid1", 00:16:46.506 "superblock": true, 00:16:46.506 "num_base_bdevs": 3, 00:16:46.506 "num_base_bdevs_discovered": 1, 00:16:46.506 "num_base_bdevs_operational": 3, 00:16:46.506 "base_bdevs_list": [ 00:16:46.506 { 00:16:46.506 "name": "BaseBdev1", 00:16:46.506 "uuid": "6491b52d-5d43-45d0-b7c0-fe7c91f00ca0", 00:16:46.506 "is_configured": true, 00:16:46.506 "data_offset": 2048, 00:16:46.506 "data_size": 63488 00:16:46.506 }, 00:16:46.506 { 00:16:46.506 "name": null, 00:16:46.506 "uuid": "c7fd3fc4-63bb-43da-bab9-f6a13a3f981c", 00:16:46.506 "is_configured": false, 00:16:46.506 "data_offset": 2048, 00:16:46.506 "data_size": 63488 00:16:46.506 }, 00:16:46.506 { 00:16:46.506 "name": null, 00:16:46.506 "uuid": "22c2e093-91ae-4a32-bf17-9df306a68ed4", 00:16:46.506 "is_configured": false, 00:16:46.506 "data_offset": 2048, 00:16:46.506 "data_size": 63488 00:16:46.506 } 00:16:46.506 ] 00:16:46.506 }' 00:16:46.506 08:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:16:46.506 08:29:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:46.764 08:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:46.764 08:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:47.023 08:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:47.023 08:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:47.282 [2024-07-23 08:29:59.583561] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:47.282 08:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:47.282 08:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:47.282 08:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:47.282 08:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:47.282 08:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:47.282 08:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:47.282 08:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:47.282 08:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:47.282 08:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:16:47.282 08:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:47.282 08:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:47.282 08:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:47.282 08:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:47.282 "name": "Existed_Raid", 00:16:47.282 "uuid": "5a7fe7fa-441e-4c3f-abf4-44a808b322fc", 00:16:47.282 "strip_size_kb": 0, 00:16:47.282 "state": "configuring", 00:16:47.282 "raid_level": "raid1", 00:16:47.282 "superblock": true, 00:16:47.282 "num_base_bdevs": 3, 00:16:47.282 "num_base_bdevs_discovered": 2, 00:16:47.282 "num_base_bdevs_operational": 3, 00:16:47.282 "base_bdevs_list": [ 00:16:47.282 { 00:16:47.282 "name": "BaseBdev1", 00:16:47.282 "uuid": "6491b52d-5d43-45d0-b7c0-fe7c91f00ca0", 00:16:47.282 "is_configured": true, 00:16:47.282 "data_offset": 2048, 00:16:47.282 "data_size": 63488 00:16:47.282 }, 00:16:47.282 { 00:16:47.282 "name": null, 00:16:47.282 "uuid": "c7fd3fc4-63bb-43da-bab9-f6a13a3f981c", 00:16:47.282 "is_configured": false, 00:16:47.282 "data_offset": 2048, 00:16:47.282 "data_size": 63488 00:16:47.282 }, 00:16:47.282 { 00:16:47.282 "name": "BaseBdev3", 00:16:47.282 "uuid": "22c2e093-91ae-4a32-bf17-9df306a68ed4", 00:16:47.282 "is_configured": true, 00:16:47.282 "data_offset": 2048, 00:16:47.282 "data_size": 63488 00:16:47.282 } 00:16:47.282 ] 00:16:47.282 }' 00:16:47.282 08:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:47.282 08:29:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:47.849 08:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:47.849 08:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:48.107 08:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:48.107 08:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:48.107 [2024-07-23 08:30:00.586257] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:48.366 08:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:48.366 08:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:48.366 08:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:48.366 08:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:48.366 08:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:48.366 08:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:48.366 08:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:48.366 08:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:48.366 08:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:48.366 08:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:48.366 08:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:48.366 08:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:48.366 08:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:48.366 "name": "Existed_Raid", 00:16:48.366 "uuid": "5a7fe7fa-441e-4c3f-abf4-44a808b322fc", 00:16:48.366 "strip_size_kb": 0, 00:16:48.366 "state": "configuring", 00:16:48.366 "raid_level": "raid1", 00:16:48.366 "superblock": true, 00:16:48.366 "num_base_bdevs": 3, 00:16:48.366 "num_base_bdevs_discovered": 1, 00:16:48.366 "num_base_bdevs_operational": 3, 00:16:48.366 "base_bdevs_list": [ 00:16:48.366 { 00:16:48.366 "name": null, 00:16:48.366 "uuid": "6491b52d-5d43-45d0-b7c0-fe7c91f00ca0", 00:16:48.366 "is_configured": false, 00:16:48.366 "data_offset": 2048, 00:16:48.366 "data_size": 63488 00:16:48.366 }, 00:16:48.366 { 00:16:48.366 "name": null, 00:16:48.366 "uuid": "c7fd3fc4-63bb-43da-bab9-f6a13a3f981c", 00:16:48.366 "is_configured": false, 00:16:48.366 "data_offset": 2048, 00:16:48.366 "data_size": 63488 00:16:48.366 }, 00:16:48.366 { 00:16:48.366 "name": "BaseBdev3", 00:16:48.366 "uuid": "22c2e093-91ae-4a32-bf17-9df306a68ed4", 00:16:48.366 "is_configured": true, 00:16:48.366 "data_offset": 2048, 00:16:48.366 "data_size": 63488 00:16:48.366 } 00:16:48.366 ] 00:16:48.366 }' 00:16:48.366 08:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:48.366 08:30:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:48.931 08:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:48.931 08:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq 
'.[0].base_bdevs_list[0].is_configured' 00:16:49.190 08:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:49.190 08:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:49.190 [2024-07-23 08:30:01.690575] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:49.190 08:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:49.190 08:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:49.190 08:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:49.190 08:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:49.190 08:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:49.190 08:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:49.190 08:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:49.190 08:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:49.190 08:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:49.190 08:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:49.448 08:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:49.448 08:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] 
| select(.name == "Existed_Raid")' 00:16:49.448 08:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:49.448 "name": "Existed_Raid", 00:16:49.448 "uuid": "5a7fe7fa-441e-4c3f-abf4-44a808b322fc", 00:16:49.448 "strip_size_kb": 0, 00:16:49.448 "state": "configuring", 00:16:49.448 "raid_level": "raid1", 00:16:49.448 "superblock": true, 00:16:49.448 "num_base_bdevs": 3, 00:16:49.448 "num_base_bdevs_discovered": 2, 00:16:49.448 "num_base_bdevs_operational": 3, 00:16:49.448 "base_bdevs_list": [ 00:16:49.448 { 00:16:49.448 "name": null, 00:16:49.448 "uuid": "6491b52d-5d43-45d0-b7c0-fe7c91f00ca0", 00:16:49.448 "is_configured": false, 00:16:49.448 "data_offset": 2048, 00:16:49.448 "data_size": 63488 00:16:49.448 }, 00:16:49.448 { 00:16:49.448 "name": "BaseBdev2", 00:16:49.448 "uuid": "c7fd3fc4-63bb-43da-bab9-f6a13a3f981c", 00:16:49.448 "is_configured": true, 00:16:49.448 "data_offset": 2048, 00:16:49.448 "data_size": 63488 00:16:49.448 }, 00:16:49.448 { 00:16:49.448 "name": "BaseBdev3", 00:16:49.448 "uuid": "22c2e093-91ae-4a32-bf17-9df306a68ed4", 00:16:49.448 "is_configured": true, 00:16:49.448 "data_offset": 2048, 00:16:49.448 "data_size": 63488 00:16:49.448 } 00:16:49.448 ] 00:16:49.448 }' 00:16:49.448 08:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:49.448 08:30:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:50.015 08:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:50.015 08:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:50.015 08:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:50.015 08:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:50.015 08:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:50.272 08:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 6491b52d-5d43-45d0-b7c0-fe7c91f00ca0 00:16:50.530 [2024-07-23 08:30:02.882219] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:50.530 [2024-07-23 08:30:02.882418] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000036980 00:16:50.530 [2024-07-23 08:30:02.882432] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:16:50.530 [2024-07-23 08:30:02.882693] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c200 00:16:50.530 [2024-07-23 08:30:02.882874] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000036980 00:16:50.530 [2024-07-23 08:30:02.882887] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000036980 00:16:50.530 [2024-07-23 08:30:02.883023] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:50.530 NewBaseBdev 00:16:50.530 08:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:50.530 08:30:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:16:50.530 08:30:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:50.530 08:30:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:50.530 08:30:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z 
'' ]] 00:16:50.530 08:30:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:50.530 08:30:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:50.787 08:30:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:50.787 [ 00:16:50.787 { 00:16:50.787 "name": "NewBaseBdev", 00:16:50.787 "aliases": [ 00:16:50.787 "6491b52d-5d43-45d0-b7c0-fe7c91f00ca0" 00:16:50.787 ], 00:16:50.787 "product_name": "Malloc disk", 00:16:50.787 "block_size": 512, 00:16:50.787 "num_blocks": 65536, 00:16:50.787 "uuid": "6491b52d-5d43-45d0-b7c0-fe7c91f00ca0", 00:16:50.787 "assigned_rate_limits": { 00:16:50.787 "rw_ios_per_sec": 0, 00:16:50.787 "rw_mbytes_per_sec": 0, 00:16:50.787 "r_mbytes_per_sec": 0, 00:16:50.787 "w_mbytes_per_sec": 0 00:16:50.787 }, 00:16:50.787 "claimed": true, 00:16:50.787 "claim_type": "exclusive_write", 00:16:50.787 "zoned": false, 00:16:50.787 "supported_io_types": { 00:16:50.787 "read": true, 00:16:50.787 "write": true, 00:16:50.787 "unmap": true, 00:16:50.787 "flush": true, 00:16:50.787 "reset": true, 00:16:50.787 "nvme_admin": false, 00:16:50.787 "nvme_io": false, 00:16:50.787 "nvme_io_md": false, 00:16:50.787 "write_zeroes": true, 00:16:50.787 "zcopy": true, 00:16:50.787 "get_zone_info": false, 00:16:50.787 "zone_management": false, 00:16:50.787 "zone_append": false, 00:16:50.787 "compare": false, 00:16:50.787 "compare_and_write": false, 00:16:50.787 "abort": true, 00:16:50.787 "seek_hole": false, 00:16:50.787 "seek_data": false, 00:16:50.787 "copy": true, 00:16:50.787 "nvme_iov_md": false 00:16:50.787 }, 00:16:50.787 "memory_domains": [ 00:16:50.787 { 00:16:50.787 "dma_device_id": "system", 00:16:50.787 
"dma_device_type": 1 00:16:50.787 }, 00:16:50.787 { 00:16:50.787 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:50.787 "dma_device_type": 2 00:16:50.787 } 00:16:50.787 ], 00:16:50.787 "driver_specific": {} 00:16:50.787 } 00:16:50.787 ] 00:16:50.787 08:30:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:50.787 08:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:16:50.787 08:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:50.787 08:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:50.787 08:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:50.787 08:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:50.787 08:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:50.787 08:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:50.787 08:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:50.787 08:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:50.787 08:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:50.787 08:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:50.787 08:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:51.045 08:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:51.045 "name": 
"Existed_Raid", 00:16:51.045 "uuid": "5a7fe7fa-441e-4c3f-abf4-44a808b322fc", 00:16:51.045 "strip_size_kb": 0, 00:16:51.045 "state": "online", 00:16:51.045 "raid_level": "raid1", 00:16:51.045 "superblock": true, 00:16:51.045 "num_base_bdevs": 3, 00:16:51.045 "num_base_bdevs_discovered": 3, 00:16:51.045 "num_base_bdevs_operational": 3, 00:16:51.045 "base_bdevs_list": [ 00:16:51.045 { 00:16:51.045 "name": "NewBaseBdev", 00:16:51.045 "uuid": "6491b52d-5d43-45d0-b7c0-fe7c91f00ca0", 00:16:51.045 "is_configured": true, 00:16:51.045 "data_offset": 2048, 00:16:51.045 "data_size": 63488 00:16:51.045 }, 00:16:51.045 { 00:16:51.045 "name": "BaseBdev2", 00:16:51.045 "uuid": "c7fd3fc4-63bb-43da-bab9-f6a13a3f981c", 00:16:51.045 "is_configured": true, 00:16:51.045 "data_offset": 2048, 00:16:51.045 "data_size": 63488 00:16:51.045 }, 00:16:51.045 { 00:16:51.045 "name": "BaseBdev3", 00:16:51.045 "uuid": "22c2e093-91ae-4a32-bf17-9df306a68ed4", 00:16:51.045 "is_configured": true, 00:16:51.045 "data_offset": 2048, 00:16:51.045 "data_size": 63488 00:16:51.045 } 00:16:51.045 ] 00:16:51.045 }' 00:16:51.045 08:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:51.045 08:30:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:51.610 08:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:51.610 08:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:51.610 08:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:51.610 08:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:51.610 08:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:51.610 08:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:51.610 
08:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:51.610 08:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:51.610 [2024-07-23 08:30:04.061669] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:51.610 08:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:51.610 "name": "Existed_Raid", 00:16:51.610 "aliases": [ 00:16:51.610 "5a7fe7fa-441e-4c3f-abf4-44a808b322fc" 00:16:51.610 ], 00:16:51.610 "product_name": "Raid Volume", 00:16:51.610 "block_size": 512, 00:16:51.610 "num_blocks": 63488, 00:16:51.610 "uuid": "5a7fe7fa-441e-4c3f-abf4-44a808b322fc", 00:16:51.610 "assigned_rate_limits": { 00:16:51.610 "rw_ios_per_sec": 0, 00:16:51.610 "rw_mbytes_per_sec": 0, 00:16:51.610 "r_mbytes_per_sec": 0, 00:16:51.610 "w_mbytes_per_sec": 0 00:16:51.610 }, 00:16:51.610 "claimed": false, 00:16:51.610 "zoned": false, 00:16:51.610 "supported_io_types": { 00:16:51.610 "read": true, 00:16:51.610 "write": true, 00:16:51.610 "unmap": false, 00:16:51.610 "flush": false, 00:16:51.610 "reset": true, 00:16:51.610 "nvme_admin": false, 00:16:51.610 "nvme_io": false, 00:16:51.610 "nvme_io_md": false, 00:16:51.610 "write_zeroes": true, 00:16:51.610 "zcopy": false, 00:16:51.610 "get_zone_info": false, 00:16:51.610 "zone_management": false, 00:16:51.610 "zone_append": false, 00:16:51.610 "compare": false, 00:16:51.610 "compare_and_write": false, 00:16:51.610 "abort": false, 00:16:51.610 "seek_hole": false, 00:16:51.610 "seek_data": false, 00:16:51.610 "copy": false, 00:16:51.610 "nvme_iov_md": false 00:16:51.610 }, 00:16:51.610 "memory_domains": [ 00:16:51.610 { 00:16:51.610 "dma_device_id": "system", 00:16:51.610 "dma_device_type": 1 00:16:51.610 }, 00:16:51.610 { 00:16:51.610 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:16:51.610 "dma_device_type": 2 00:16:51.610 }, 00:16:51.610 { 00:16:51.610 "dma_device_id": "system", 00:16:51.610 "dma_device_type": 1 00:16:51.610 }, 00:16:51.610 { 00:16:51.610 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:51.610 "dma_device_type": 2 00:16:51.610 }, 00:16:51.610 { 00:16:51.610 "dma_device_id": "system", 00:16:51.610 "dma_device_type": 1 00:16:51.610 }, 00:16:51.610 { 00:16:51.610 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:51.610 "dma_device_type": 2 00:16:51.610 } 00:16:51.610 ], 00:16:51.610 "driver_specific": { 00:16:51.610 "raid": { 00:16:51.610 "uuid": "5a7fe7fa-441e-4c3f-abf4-44a808b322fc", 00:16:51.610 "strip_size_kb": 0, 00:16:51.610 "state": "online", 00:16:51.610 "raid_level": "raid1", 00:16:51.610 "superblock": true, 00:16:51.610 "num_base_bdevs": 3, 00:16:51.610 "num_base_bdevs_discovered": 3, 00:16:51.610 "num_base_bdevs_operational": 3, 00:16:51.610 "base_bdevs_list": [ 00:16:51.610 { 00:16:51.610 "name": "NewBaseBdev", 00:16:51.610 "uuid": "6491b52d-5d43-45d0-b7c0-fe7c91f00ca0", 00:16:51.610 "is_configured": true, 00:16:51.610 "data_offset": 2048, 00:16:51.610 "data_size": 63488 00:16:51.610 }, 00:16:51.610 { 00:16:51.610 "name": "BaseBdev2", 00:16:51.610 "uuid": "c7fd3fc4-63bb-43da-bab9-f6a13a3f981c", 00:16:51.610 "is_configured": true, 00:16:51.610 "data_offset": 2048, 00:16:51.610 "data_size": 63488 00:16:51.610 }, 00:16:51.610 { 00:16:51.610 "name": "BaseBdev3", 00:16:51.610 "uuid": "22c2e093-91ae-4a32-bf17-9df306a68ed4", 00:16:51.610 "is_configured": true, 00:16:51.610 "data_offset": 2048, 00:16:51.610 "data_size": 63488 00:16:51.610 } 00:16:51.610 ] 00:16:51.610 } 00:16:51.610 } 00:16:51.610 }' 00:16:51.610 08:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:51.610 08:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:51.610 BaseBdev2 
00:16:51.610 BaseBdev3' 00:16:51.610 08:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:51.610 08:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:51.610 08:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:51.868 08:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:51.868 "name": "NewBaseBdev", 00:16:51.868 "aliases": [ 00:16:51.868 "6491b52d-5d43-45d0-b7c0-fe7c91f00ca0" 00:16:51.868 ], 00:16:51.868 "product_name": "Malloc disk", 00:16:51.868 "block_size": 512, 00:16:51.868 "num_blocks": 65536, 00:16:51.868 "uuid": "6491b52d-5d43-45d0-b7c0-fe7c91f00ca0", 00:16:51.868 "assigned_rate_limits": { 00:16:51.868 "rw_ios_per_sec": 0, 00:16:51.868 "rw_mbytes_per_sec": 0, 00:16:51.868 "r_mbytes_per_sec": 0, 00:16:51.868 "w_mbytes_per_sec": 0 00:16:51.868 }, 00:16:51.868 "claimed": true, 00:16:51.868 "claim_type": "exclusive_write", 00:16:51.868 "zoned": false, 00:16:51.868 "supported_io_types": { 00:16:51.868 "read": true, 00:16:51.868 "write": true, 00:16:51.868 "unmap": true, 00:16:51.868 "flush": true, 00:16:51.868 "reset": true, 00:16:51.868 "nvme_admin": false, 00:16:51.868 "nvme_io": false, 00:16:51.868 "nvme_io_md": false, 00:16:51.868 "write_zeroes": true, 00:16:51.868 "zcopy": true, 00:16:51.868 "get_zone_info": false, 00:16:51.868 "zone_management": false, 00:16:51.868 "zone_append": false, 00:16:51.868 "compare": false, 00:16:51.868 "compare_and_write": false, 00:16:51.868 "abort": true, 00:16:51.868 "seek_hole": false, 00:16:51.868 "seek_data": false, 00:16:51.868 "copy": true, 00:16:51.868 "nvme_iov_md": false 00:16:51.868 }, 00:16:51.868 "memory_domains": [ 00:16:51.868 { 00:16:51.868 "dma_device_id": "system", 00:16:51.868 "dma_device_type": 1 00:16:51.868 }, 
00:16:51.868 { 00:16:51.868 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:51.868 "dma_device_type": 2 00:16:51.868 } 00:16:51.868 ], 00:16:51.868 "driver_specific": {} 00:16:51.868 }' 00:16:51.868 08:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:51.868 08:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:52.126 08:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:52.127 08:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:52.127 08:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:52.127 08:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:52.127 08:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:52.127 08:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:52.127 08:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:52.127 08:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:52.127 08:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:52.127 08:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:52.127 08:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:52.127 08:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:52.127 08:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:52.384 08:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 
00:16:52.384 "name": "BaseBdev2", 00:16:52.384 "aliases": [ 00:16:52.384 "c7fd3fc4-63bb-43da-bab9-f6a13a3f981c" 00:16:52.384 ], 00:16:52.384 "product_name": "Malloc disk", 00:16:52.384 "block_size": 512, 00:16:52.384 "num_blocks": 65536, 00:16:52.384 "uuid": "c7fd3fc4-63bb-43da-bab9-f6a13a3f981c", 00:16:52.384 "assigned_rate_limits": { 00:16:52.384 "rw_ios_per_sec": 0, 00:16:52.384 "rw_mbytes_per_sec": 0, 00:16:52.384 "r_mbytes_per_sec": 0, 00:16:52.384 "w_mbytes_per_sec": 0 00:16:52.384 }, 00:16:52.384 "claimed": true, 00:16:52.384 "claim_type": "exclusive_write", 00:16:52.384 "zoned": false, 00:16:52.384 "supported_io_types": { 00:16:52.384 "read": true, 00:16:52.384 "write": true, 00:16:52.384 "unmap": true, 00:16:52.384 "flush": true, 00:16:52.384 "reset": true, 00:16:52.384 "nvme_admin": false, 00:16:52.384 "nvme_io": false, 00:16:52.384 "nvme_io_md": false, 00:16:52.384 "write_zeroes": true, 00:16:52.384 "zcopy": true, 00:16:52.384 "get_zone_info": false, 00:16:52.384 "zone_management": false, 00:16:52.384 "zone_append": false, 00:16:52.384 "compare": false, 00:16:52.384 "compare_and_write": false, 00:16:52.384 "abort": true, 00:16:52.385 "seek_hole": false, 00:16:52.385 "seek_data": false, 00:16:52.385 "copy": true, 00:16:52.385 "nvme_iov_md": false 00:16:52.385 }, 00:16:52.385 "memory_domains": [ 00:16:52.385 { 00:16:52.385 "dma_device_id": "system", 00:16:52.385 "dma_device_type": 1 00:16:52.385 }, 00:16:52.385 { 00:16:52.385 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:52.385 "dma_device_type": 2 00:16:52.385 } 00:16:52.385 ], 00:16:52.385 "driver_specific": {} 00:16:52.385 }' 00:16:52.385 08:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:52.385 08:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:52.385 08:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:52.385 08:30:04 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:52.642 08:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:52.642 08:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:52.642 08:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:52.642 08:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:52.642 08:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:52.642 08:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:52.642 08:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:52.642 08:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:52.642 08:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:52.642 08:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:52.642 08:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:52.899 08:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:52.899 "name": "BaseBdev3", 00:16:52.899 "aliases": [ 00:16:52.899 "22c2e093-91ae-4a32-bf17-9df306a68ed4" 00:16:52.899 ], 00:16:52.899 "product_name": "Malloc disk", 00:16:52.899 "block_size": 512, 00:16:52.899 "num_blocks": 65536, 00:16:52.899 "uuid": "22c2e093-91ae-4a32-bf17-9df306a68ed4", 00:16:52.899 "assigned_rate_limits": { 00:16:52.899 "rw_ios_per_sec": 0, 00:16:52.899 "rw_mbytes_per_sec": 0, 00:16:52.899 "r_mbytes_per_sec": 0, 00:16:52.899 "w_mbytes_per_sec": 0 00:16:52.899 }, 00:16:52.899 "claimed": true, 00:16:52.899 "claim_type": "exclusive_write", 
00:16:52.899 "zoned": false, 00:16:52.899 "supported_io_types": { 00:16:52.900 "read": true, 00:16:52.900 "write": true, 00:16:52.900 "unmap": true, 00:16:52.900 "flush": true, 00:16:52.900 "reset": true, 00:16:52.900 "nvme_admin": false, 00:16:52.900 "nvme_io": false, 00:16:52.900 "nvme_io_md": false, 00:16:52.900 "write_zeroes": true, 00:16:52.900 "zcopy": true, 00:16:52.900 "get_zone_info": false, 00:16:52.900 "zone_management": false, 00:16:52.900 "zone_append": false, 00:16:52.900 "compare": false, 00:16:52.900 "compare_and_write": false, 00:16:52.900 "abort": true, 00:16:52.900 "seek_hole": false, 00:16:52.900 "seek_data": false, 00:16:52.900 "copy": true, 00:16:52.900 "nvme_iov_md": false 00:16:52.900 }, 00:16:52.900 "memory_domains": [ 00:16:52.900 { 00:16:52.900 "dma_device_id": "system", 00:16:52.900 "dma_device_type": 1 00:16:52.900 }, 00:16:52.900 { 00:16:52.900 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:52.900 "dma_device_type": 2 00:16:52.900 } 00:16:52.900 ], 00:16:52.900 "driver_specific": {} 00:16:52.900 }' 00:16:52.900 08:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:52.900 08:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:52.900 08:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:52.900 08:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:53.158 08:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:53.158 08:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:53.158 08:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:53.158 08:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:53.158 08:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 
00:16:53.158 08:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:53.158 08:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:53.158 08:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:53.158 08:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:53.416 [2024-07-23 08:30:05.773911] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:53.416 [2024-07-23 08:30:05.773942] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:53.416 [2024-07-23 08:30:05.774014] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:53.416 [2024-07-23 08:30:05.774285] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:53.416 [2024-07-23 08:30:05.774296] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036980 name Existed_Raid, state offline 00:16:53.416 08:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1459009 00:16:53.416 08:30:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1459009 ']' 00:16:53.416 08:30:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1459009 00:16:53.416 08:30:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:16:53.416 08:30:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:53.416 08:30:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1459009 00:16:53.416 08:30:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:16:53.417 08:30:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:53.417 08:30:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1459009' 00:16:53.417 killing process with pid 1459009 00:16:53.417 08:30:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1459009 00:16:53.417 [2024-07-23 08:30:05.832693] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:53.417 08:30:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1459009 00:16:53.675 [2024-07-23 08:30:06.066952] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:55.049 08:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:16:55.049 00:16:55.049 real 0m23.309s 00:16:55.049 user 0m41.547s 00:16:55.049 sys 0m3.594s 00:16:55.049 08:30:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:55.049 08:30:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:55.049 ************************************ 00:16:55.049 END TEST raid_state_function_test_sb 00:16:55.049 ************************************ 00:16:55.049 08:30:07 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:55.049 08:30:07 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 3 00:16:55.049 08:30:07 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:16:55.049 08:30:07 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:55.049 08:30:07 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:55.049 ************************************ 00:16:55.049 START TEST raid_superblock_test 00:16:55.049 ************************************ 00:16:55.049 08:30:07 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@1123 -- # raid_superblock_test raid1 3 00:16:55.049 08:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:16:55.049 08:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:16:55.049 08:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:16:55.049 08:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:16:55.049 08:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:16:55.049 08:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:16:55.049 08:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:16:55.049 08:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:16:55.049 08:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:16:55.049 08:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:16:55.049 08:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:16:55.049 08:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:16:55.049 08:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:16:55.049 08:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:16:55.049 08:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:16:55.050 08:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1463880 00:16:55.050 08:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1463880 /var/tmp/spdk-raid.sock 00:16:55.050 08:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:16:55.050 08:30:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1463880 ']' 00:16:55.050 08:30:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:55.050 08:30:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:55.050 08:30:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:55.050 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:55.050 08:30:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:55.050 08:30:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:55.050 [2024-07-23 08:30:07.452023] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:16:55.050 [2024-07-23 08:30:07.452110] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1463880 ] 00:16:55.307 [2024-07-23 08:30:07.576441] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:55.307 [2024-07-23 08:30:07.822469] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:55.873 [2024-07-23 08:30:08.084656] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:55.873 [2024-07-23 08:30:08.084690] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:55.873 08:30:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:55.873 08:30:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:16:55.873 08:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:16:55.873 08:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:55.873 08:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:16:55.873 08:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:16:55.873 08:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:16:55.873 08:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:55.873 08:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:55.873 08:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:55.873 08:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:16:56.130 malloc1 00:16:56.131 08:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:56.131 [2024-07-23 08:30:08.610788] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:56.131 [2024-07-23 08:30:08.610839] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:56.131 [2024-07-23 08:30:08.610861] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034880 00:16:56.131 [2024-07-23 08:30:08.610872] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:56.131 [2024-07-23 08:30:08.612811] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:56.131 [2024-07-23 08:30:08.612840] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:56.131 pt1 00:16:56.131 08:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:56.131 08:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:56.131 08:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:16:56.131 08:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:16:56.131 08:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:16:56.131 08:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:56.131 08:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:56.131 08:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:56.131 08:30:08 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:16:56.388 malloc2 00:16:56.388 08:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:56.646 [2024-07-23 08:30:08.991970] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:56.646 [2024-07-23 08:30:08.992022] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:56.646 [2024-07-23 08:30:08.992043] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035480 00:16:56.646 [2024-07-23 08:30:08.992052] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:56.646 [2024-07-23 08:30:08.994111] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:56.646 [2024-07-23 08:30:08.994139] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:56.646 pt2 00:16:56.646 08:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:56.646 08:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:56.646 08:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:16:56.646 08:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:16:56.646 08:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:16:56.646 08:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:56.646 08:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:56.646 
08:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:56.646 08:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:16:56.903 malloc3 00:16:56.903 08:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:56.903 [2024-07-23 08:30:09.381266] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:56.903 [2024-07-23 08:30:09.381320] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:56.904 [2024-07-23 08:30:09.381342] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036080 00:16:56.904 [2024-07-23 08:30:09.381351] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:56.904 [2024-07-23 08:30:09.383302] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:56.904 [2024-07-23 08:30:09.383331] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:56.904 pt3 00:16:56.904 08:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:56.904 08:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:56.904 08:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:16:57.161 [2024-07-23 08:30:09.549755] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:57.161 [2024-07-23 08:30:09.551414] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is 
claimed 00:16:57.161 [2024-07-23 08:30:09.551478] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:57.161 [2024-07-23 08:30:09.551680] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000036680 00:16:57.161 [2024-07-23 08:30:09.551695] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:16:57.161 [2024-07-23 08:30:09.551968] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bec0 00:16:57.161 [2024-07-23 08:30:09.552176] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000036680 00:16:57.161 [2024-07-23 08:30:09.552186] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000036680 00:16:57.161 [2024-07-23 08:30:09.552353] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:57.161 08:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:16:57.161 08:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:57.161 08:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:57.161 08:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:57.161 08:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:57.161 08:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:57.161 08:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:57.161 08:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:57.161 08:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:57.161 08:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # 
local tmp 00:16:57.161 08:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:57.161 08:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:57.438 08:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:57.439 "name": "raid_bdev1", 00:16:57.439 "uuid": "2570bfdf-395e-4bdd-8a10-4d78c041fa09", 00:16:57.439 "strip_size_kb": 0, 00:16:57.439 "state": "online", 00:16:57.439 "raid_level": "raid1", 00:16:57.439 "superblock": true, 00:16:57.439 "num_base_bdevs": 3, 00:16:57.439 "num_base_bdevs_discovered": 3, 00:16:57.439 "num_base_bdevs_operational": 3, 00:16:57.439 "base_bdevs_list": [ 00:16:57.439 { 00:16:57.439 "name": "pt1", 00:16:57.439 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:57.439 "is_configured": true, 00:16:57.439 "data_offset": 2048, 00:16:57.439 "data_size": 63488 00:16:57.439 }, 00:16:57.439 { 00:16:57.439 "name": "pt2", 00:16:57.439 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:57.439 "is_configured": true, 00:16:57.439 "data_offset": 2048, 00:16:57.439 "data_size": 63488 00:16:57.439 }, 00:16:57.439 { 00:16:57.439 "name": "pt3", 00:16:57.439 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:57.439 "is_configured": true, 00:16:57.439 "data_offset": 2048, 00:16:57.439 "data_size": 63488 00:16:57.439 } 00:16:57.439 ] 00:16:57.439 }' 00:16:57.439 08:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:57.439 08:30:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:57.745 08:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:16:57.745 08:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:16:57.745 08:30:10 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:57.745 08:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:57.745 08:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:57.745 08:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:57.745 08:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:57.745 08:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:58.003 [2024-07-23 08:30:10.388170] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:58.003 08:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:58.003 "name": "raid_bdev1", 00:16:58.003 "aliases": [ 00:16:58.003 "2570bfdf-395e-4bdd-8a10-4d78c041fa09" 00:16:58.003 ], 00:16:58.003 "product_name": "Raid Volume", 00:16:58.003 "block_size": 512, 00:16:58.003 "num_blocks": 63488, 00:16:58.003 "uuid": "2570bfdf-395e-4bdd-8a10-4d78c041fa09", 00:16:58.003 "assigned_rate_limits": { 00:16:58.003 "rw_ios_per_sec": 0, 00:16:58.003 "rw_mbytes_per_sec": 0, 00:16:58.003 "r_mbytes_per_sec": 0, 00:16:58.003 "w_mbytes_per_sec": 0 00:16:58.003 }, 00:16:58.003 "claimed": false, 00:16:58.003 "zoned": false, 00:16:58.003 "supported_io_types": { 00:16:58.003 "read": true, 00:16:58.003 "write": true, 00:16:58.003 "unmap": false, 00:16:58.003 "flush": false, 00:16:58.003 "reset": true, 00:16:58.003 "nvme_admin": false, 00:16:58.003 "nvme_io": false, 00:16:58.003 "nvme_io_md": false, 00:16:58.003 "write_zeroes": true, 00:16:58.003 "zcopy": false, 00:16:58.003 "get_zone_info": false, 00:16:58.003 "zone_management": false, 00:16:58.003 "zone_append": false, 00:16:58.003 "compare": false, 00:16:58.003 "compare_and_write": false, 00:16:58.003 
"abort": false, 00:16:58.003 "seek_hole": false, 00:16:58.003 "seek_data": false, 00:16:58.003 "copy": false, 00:16:58.003 "nvme_iov_md": false 00:16:58.003 }, 00:16:58.003 "memory_domains": [ 00:16:58.003 { 00:16:58.003 "dma_device_id": "system", 00:16:58.003 "dma_device_type": 1 00:16:58.003 }, 00:16:58.003 { 00:16:58.003 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:58.003 "dma_device_type": 2 00:16:58.003 }, 00:16:58.003 { 00:16:58.003 "dma_device_id": "system", 00:16:58.003 "dma_device_type": 1 00:16:58.003 }, 00:16:58.003 { 00:16:58.003 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:58.003 "dma_device_type": 2 00:16:58.003 }, 00:16:58.003 { 00:16:58.003 "dma_device_id": "system", 00:16:58.003 "dma_device_type": 1 00:16:58.003 }, 00:16:58.003 { 00:16:58.003 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:58.003 "dma_device_type": 2 00:16:58.003 } 00:16:58.003 ], 00:16:58.003 "driver_specific": { 00:16:58.003 "raid": { 00:16:58.003 "uuid": "2570bfdf-395e-4bdd-8a10-4d78c041fa09", 00:16:58.003 "strip_size_kb": 0, 00:16:58.003 "state": "online", 00:16:58.003 "raid_level": "raid1", 00:16:58.003 "superblock": true, 00:16:58.003 "num_base_bdevs": 3, 00:16:58.003 "num_base_bdevs_discovered": 3, 00:16:58.003 "num_base_bdevs_operational": 3, 00:16:58.003 "base_bdevs_list": [ 00:16:58.003 { 00:16:58.003 "name": "pt1", 00:16:58.003 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:58.003 "is_configured": true, 00:16:58.003 "data_offset": 2048, 00:16:58.003 "data_size": 63488 00:16:58.003 }, 00:16:58.003 { 00:16:58.003 "name": "pt2", 00:16:58.003 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:58.003 "is_configured": true, 00:16:58.003 "data_offset": 2048, 00:16:58.003 "data_size": 63488 00:16:58.003 }, 00:16:58.003 { 00:16:58.003 "name": "pt3", 00:16:58.003 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:58.003 "is_configured": true, 00:16:58.003 "data_offset": 2048, 00:16:58.003 "data_size": 63488 00:16:58.003 } 00:16:58.003 ] 00:16:58.003 } 
00:16:58.003 } 00:16:58.003 }' 00:16:58.004 08:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:58.004 08:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:16:58.004 pt2 00:16:58.004 pt3' 00:16:58.004 08:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:58.004 08:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:58.004 08:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:58.261 08:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:58.261 "name": "pt1", 00:16:58.261 "aliases": [ 00:16:58.261 "00000000-0000-0000-0000-000000000001" 00:16:58.261 ], 00:16:58.261 "product_name": "passthru", 00:16:58.261 "block_size": 512, 00:16:58.261 "num_blocks": 65536, 00:16:58.261 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:58.261 "assigned_rate_limits": { 00:16:58.261 "rw_ios_per_sec": 0, 00:16:58.261 "rw_mbytes_per_sec": 0, 00:16:58.261 "r_mbytes_per_sec": 0, 00:16:58.261 "w_mbytes_per_sec": 0 00:16:58.261 }, 00:16:58.261 "claimed": true, 00:16:58.261 "claim_type": "exclusive_write", 00:16:58.261 "zoned": false, 00:16:58.261 "supported_io_types": { 00:16:58.261 "read": true, 00:16:58.261 "write": true, 00:16:58.261 "unmap": true, 00:16:58.261 "flush": true, 00:16:58.261 "reset": true, 00:16:58.261 "nvme_admin": false, 00:16:58.261 "nvme_io": false, 00:16:58.261 "nvme_io_md": false, 00:16:58.261 "write_zeroes": true, 00:16:58.261 "zcopy": true, 00:16:58.261 "get_zone_info": false, 00:16:58.261 "zone_management": false, 00:16:58.261 "zone_append": false, 00:16:58.261 "compare": false, 00:16:58.261 "compare_and_write": false, 00:16:58.261 "abort": true, 00:16:58.261 
"seek_hole": false, 00:16:58.261 "seek_data": false, 00:16:58.261 "copy": true, 00:16:58.261 "nvme_iov_md": false 00:16:58.261 }, 00:16:58.261 "memory_domains": [ 00:16:58.261 { 00:16:58.261 "dma_device_id": "system", 00:16:58.261 "dma_device_type": 1 00:16:58.261 }, 00:16:58.261 { 00:16:58.261 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:58.261 "dma_device_type": 2 00:16:58.261 } 00:16:58.261 ], 00:16:58.261 "driver_specific": { 00:16:58.261 "passthru": { 00:16:58.261 "name": "pt1", 00:16:58.261 "base_bdev_name": "malloc1" 00:16:58.261 } 00:16:58.261 } 00:16:58.261 }' 00:16:58.261 08:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:58.261 08:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:58.261 08:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:58.261 08:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:58.261 08:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:58.261 08:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:58.518 08:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:58.518 08:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:58.518 08:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:58.518 08:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:58.518 08:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:58.518 08:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:58.518 08:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:58.518 08:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:58.518 08:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:58.776 08:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:58.776 "name": "pt2", 00:16:58.776 "aliases": [ 00:16:58.776 "00000000-0000-0000-0000-000000000002" 00:16:58.776 ], 00:16:58.776 "product_name": "passthru", 00:16:58.776 "block_size": 512, 00:16:58.776 "num_blocks": 65536, 00:16:58.776 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:58.776 "assigned_rate_limits": { 00:16:58.776 "rw_ios_per_sec": 0, 00:16:58.776 "rw_mbytes_per_sec": 0, 00:16:58.776 "r_mbytes_per_sec": 0, 00:16:58.776 "w_mbytes_per_sec": 0 00:16:58.776 }, 00:16:58.776 "claimed": true, 00:16:58.776 "claim_type": "exclusive_write", 00:16:58.776 "zoned": false, 00:16:58.776 "supported_io_types": { 00:16:58.776 "read": true, 00:16:58.776 "write": true, 00:16:58.776 "unmap": true, 00:16:58.776 "flush": true, 00:16:58.776 "reset": true, 00:16:58.776 "nvme_admin": false, 00:16:58.776 "nvme_io": false, 00:16:58.776 "nvme_io_md": false, 00:16:58.776 "write_zeroes": true, 00:16:58.776 "zcopy": true, 00:16:58.776 "get_zone_info": false, 00:16:58.776 "zone_management": false, 00:16:58.776 "zone_append": false, 00:16:58.776 "compare": false, 00:16:58.776 "compare_and_write": false, 00:16:58.776 "abort": true, 00:16:58.776 "seek_hole": false, 00:16:58.776 "seek_data": false, 00:16:58.776 "copy": true, 00:16:58.776 "nvme_iov_md": false 00:16:58.776 }, 00:16:58.776 "memory_domains": [ 00:16:58.776 { 00:16:58.776 "dma_device_id": "system", 00:16:58.776 "dma_device_type": 1 00:16:58.776 }, 00:16:58.776 { 00:16:58.776 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:58.776 "dma_device_type": 2 00:16:58.776 } 00:16:58.776 ], 00:16:58.776 "driver_specific": { 00:16:58.776 "passthru": { 00:16:58.776 "name": "pt2", 00:16:58.776 "base_bdev_name": "malloc2" 
00:16:58.776 } 00:16:58.776 } 00:16:58.776 }' 00:16:58.776 08:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:58.776 08:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:58.776 08:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:58.776 08:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:58.776 08:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:58.776 08:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:58.776 08:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:58.776 08:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:59.034 08:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:59.034 08:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:59.034 08:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:59.034 08:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:59.034 08:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:59.034 08:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:16:59.034 08:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:59.291 08:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:59.291 "name": "pt3", 00:16:59.291 "aliases": [ 00:16:59.291 "00000000-0000-0000-0000-000000000003" 00:16:59.291 ], 00:16:59.291 "product_name": "passthru", 00:16:59.291 "block_size": 512, 00:16:59.291 "num_blocks": 65536, 00:16:59.291 "uuid": 
"00000000-0000-0000-0000-000000000003", 00:16:59.291 "assigned_rate_limits": { 00:16:59.291 "rw_ios_per_sec": 0, 00:16:59.291 "rw_mbytes_per_sec": 0, 00:16:59.291 "r_mbytes_per_sec": 0, 00:16:59.291 "w_mbytes_per_sec": 0 00:16:59.291 }, 00:16:59.291 "claimed": true, 00:16:59.291 "claim_type": "exclusive_write", 00:16:59.291 "zoned": false, 00:16:59.291 "supported_io_types": { 00:16:59.291 "read": true, 00:16:59.291 "write": true, 00:16:59.291 "unmap": true, 00:16:59.291 "flush": true, 00:16:59.291 "reset": true, 00:16:59.291 "nvme_admin": false, 00:16:59.291 "nvme_io": false, 00:16:59.291 "nvme_io_md": false, 00:16:59.291 "write_zeroes": true, 00:16:59.291 "zcopy": true, 00:16:59.291 "get_zone_info": false, 00:16:59.291 "zone_management": false, 00:16:59.291 "zone_append": false, 00:16:59.291 "compare": false, 00:16:59.291 "compare_and_write": false, 00:16:59.291 "abort": true, 00:16:59.291 "seek_hole": false, 00:16:59.291 "seek_data": false, 00:16:59.291 "copy": true, 00:16:59.291 "nvme_iov_md": false 00:16:59.291 }, 00:16:59.291 "memory_domains": [ 00:16:59.291 { 00:16:59.291 "dma_device_id": "system", 00:16:59.291 "dma_device_type": 1 00:16:59.291 }, 00:16:59.291 { 00:16:59.291 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:59.291 "dma_device_type": 2 00:16:59.291 } 00:16:59.291 ], 00:16:59.291 "driver_specific": { 00:16:59.291 "passthru": { 00:16:59.292 "name": "pt3", 00:16:59.292 "base_bdev_name": "malloc3" 00:16:59.292 } 00:16:59.292 } 00:16:59.292 }' 00:16:59.292 08:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:59.292 08:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:59.292 08:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:59.292 08:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:59.292 08:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:59.292 08:30:11 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:59.292 08:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:59.292 08:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:59.292 08:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:59.292 08:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:59.549 08:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:59.549 08:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:59.549 08:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:16:59.549 08:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:59.549 [2024-07-23 08:30:12.012465] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:59.549 08:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=2570bfdf-395e-4bdd-8a10-4d78c041fa09 00:16:59.549 08:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 2570bfdf-395e-4bdd-8a10-4d78c041fa09 ']' 00:16:59.550 08:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:59.807 [2024-07-23 08:30:12.172597] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:59.807 [2024-07-23 08:30:12.172639] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:59.807 [2024-07-23 08:30:12.172711] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:59.807 [2024-07-23 08:30:12.172776] bdev_raid.c: 
463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:59.807 [2024-07-23 08:30:12.172786] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036680 name raid_bdev1, state offline 00:16:59.807 08:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:59.807 08:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:17:00.064 08:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:17:00.064 08:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:17:00.064 08:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:00.064 08:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:00.064 08:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:00.064 08:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:00.322 08:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:00.322 08:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:00.580 08:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:17:00.580 08:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | 
any' 00:17:00.580 08:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:17:00.580 08:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:00.580 08:30:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:17:00.580 08:30:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:00.580 08:30:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:00.580 08:30:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:00.580 08:30:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:00.580 08:30:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:00.580 08:30:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:00.580 08:30:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:00.580 08:30:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:00.580 08:30:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:17:00.580 08:30:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:00.838 [2024-07-23 08:30:13.187242] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:17:00.838 [2024-07-23 08:30:13.188848] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:17:00.838 [2024-07-23 08:30:13.188898] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:17:00.838 [2024-07-23 08:30:13.188950] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:17:00.838 [2024-07-23 08:30:13.188994] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:17:00.838 [2024-07-23 08:30:13.189013] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:17:00.838 [2024-07-23 08:30:13.189028] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:00.838 [2024-07-23 08:30:13.189038] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036c80 name raid_bdev1, state configuring 00:17:00.838 request: 00:17:00.838 { 00:17:00.838 "name": "raid_bdev1", 00:17:00.838 "raid_level": "raid1", 00:17:00.838 "base_bdevs": [ 00:17:00.838 "malloc1", 00:17:00.838 "malloc2", 00:17:00.838 "malloc3" 00:17:00.838 ], 00:17:00.838 "superblock": false, 00:17:00.838 "method": "bdev_raid_create", 00:17:00.838 "req_id": 1 00:17:00.838 } 00:17:00.838 Got JSON-RPC error response 00:17:00.838 response: 00:17:00.838 { 00:17:00.838 "code": -17, 00:17:00.838 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:17:00.838 } 00:17:00.838 08:30:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:17:00.838 08:30:13 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@659 -- # (( es > 128 )) 00:17:00.838 08:30:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:17:00.838 08:30:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:17:00.838 08:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:00.838 08:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:17:01.095 08:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:17:01.095 08:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:17:01.095 08:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:01.095 [2024-07-23 08:30:13.528098] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:01.095 [2024-07-23 08:30:13.528152] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:01.095 [2024-07-23 08:30:13.528172] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000037280 00:17:01.095 [2024-07-23 08:30:13.528181] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:01.095 [2024-07-23 08:30:13.530196] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:01.095 [2024-07-23 08:30:13.530224] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:01.095 [2024-07-23 08:30:13.530315] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:17:01.095 [2024-07-23 08:30:13.530375] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:01.095 pt1 00:17:01.096 
08:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:17:01.096 08:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:01.096 08:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:01.096 08:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:01.096 08:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:01.096 08:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:01.096 08:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:01.096 08:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:01.096 08:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:01.096 08:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:01.096 08:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:01.096 08:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:01.353 08:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:01.353 "name": "raid_bdev1", 00:17:01.353 "uuid": "2570bfdf-395e-4bdd-8a10-4d78c041fa09", 00:17:01.353 "strip_size_kb": 0, 00:17:01.353 "state": "configuring", 00:17:01.353 "raid_level": "raid1", 00:17:01.353 "superblock": true, 00:17:01.353 "num_base_bdevs": 3, 00:17:01.353 "num_base_bdevs_discovered": 1, 00:17:01.353 "num_base_bdevs_operational": 3, 00:17:01.353 "base_bdevs_list": [ 00:17:01.353 { 00:17:01.353 "name": "pt1", 00:17:01.353 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:17:01.353 "is_configured": true, 00:17:01.353 "data_offset": 2048, 00:17:01.353 "data_size": 63488 00:17:01.353 }, 00:17:01.353 { 00:17:01.353 "name": null, 00:17:01.353 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:01.353 "is_configured": false, 00:17:01.353 "data_offset": 2048, 00:17:01.353 "data_size": 63488 00:17:01.353 }, 00:17:01.353 { 00:17:01.353 "name": null, 00:17:01.353 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:01.353 "is_configured": false, 00:17:01.353 "data_offset": 2048, 00:17:01.353 "data_size": 63488 00:17:01.353 } 00:17:01.353 ] 00:17:01.353 }' 00:17:01.353 08:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:01.353 08:30:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:01.917 08:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:17:01.917 08:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:01.917 [2024-07-23 08:30:14.358279] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:01.917 [2024-07-23 08:30:14.358331] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:01.917 [2024-07-23 08:30:14.358351] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000037b80 00:17:01.917 [2024-07-23 08:30:14.358359] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:01.917 [2024-07-23 08:30:14.358820] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:01.917 [2024-07-23 08:30:14.358838] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:01.917 [2024-07-23 08:30:14.358911] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid 
superblock found on bdev pt2 00:17:01.917 [2024-07-23 08:30:14.358935] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:01.917 pt2 00:17:01.917 08:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:02.175 [2024-07-23 08:30:14.526773] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:17:02.175 08:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:17:02.175 08:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:02.175 08:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:02.175 08:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:02.175 08:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:02.175 08:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:02.175 08:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:02.175 08:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:02.175 08:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:02.175 08:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:02.175 08:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:02.175 08:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:02.432 08:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:17:02.432 "name": "raid_bdev1", 00:17:02.432 "uuid": "2570bfdf-395e-4bdd-8a10-4d78c041fa09", 00:17:02.432 "strip_size_kb": 0, 00:17:02.432 "state": "configuring", 00:17:02.432 "raid_level": "raid1", 00:17:02.433 "superblock": true, 00:17:02.433 "num_base_bdevs": 3, 00:17:02.433 "num_base_bdevs_discovered": 1, 00:17:02.433 "num_base_bdevs_operational": 3, 00:17:02.433 "base_bdevs_list": [ 00:17:02.433 { 00:17:02.433 "name": "pt1", 00:17:02.433 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:02.433 "is_configured": true, 00:17:02.433 "data_offset": 2048, 00:17:02.433 "data_size": 63488 00:17:02.433 }, 00:17:02.433 { 00:17:02.433 "name": null, 00:17:02.433 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:02.433 "is_configured": false, 00:17:02.433 "data_offset": 2048, 00:17:02.433 "data_size": 63488 00:17:02.433 }, 00:17:02.433 { 00:17:02.433 "name": null, 00:17:02.433 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:02.433 "is_configured": false, 00:17:02.433 "data_offset": 2048, 00:17:02.433 "data_size": 63488 00:17:02.433 } 00:17:02.433 ] 00:17:02.433 }' 00:17:02.433 08:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:02.433 08:30:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:02.690 08:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:17:02.690 08:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:02.690 08:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:02.947 [2024-07-23 08:30:15.328855] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:02.947 [2024-07-23 08:30:15.328914] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 
00:17:02.947 [2024-07-23 08:30:15.328930] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000037e80 00:17:02.947 [2024-07-23 08:30:15.328941] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:02.947 [2024-07-23 08:30:15.329413] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:02.947 [2024-07-23 08:30:15.329437] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:02.947 [2024-07-23 08:30:15.329511] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:02.947 [2024-07-23 08:30:15.329537] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:02.947 pt2 00:17:02.947 08:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:02.947 08:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:02.947 08:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:03.204 [2024-07-23 08:30:15.501306] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:03.204 [2024-07-23 08:30:15.501359] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:03.204 [2024-07-23 08:30:15.501378] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000038180 00:17:03.204 [2024-07-23 08:30:15.501389] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:03.204 [2024-07-23 08:30:15.501835] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:03.204 [2024-07-23 08:30:15.501857] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:03.204 [2024-07-23 08:30:15.501945] 
bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:03.204 [2024-07-23 08:30:15.501974] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:03.204 [2024-07-23 08:30:15.502134] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000037880 00:17:03.204 [2024-07-23 08:30:15.502146] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:03.204 [2024-07-23 08:30:15.502374] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bf90 00:17:03.204 [2024-07-23 08:30:15.502559] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000037880 00:17:03.204 [2024-07-23 08:30:15.502569] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000037880 00:17:03.204 [2024-07-23 08:30:15.502723] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:03.204 pt3 00:17:03.204 08:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:03.204 08:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:03.204 08:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:03.204 08:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:03.204 08:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:03.204 08:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:03.205 08:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:03.205 08:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:03.205 08:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:03.205 
08:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:03.205 08:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:03.205 08:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:03.205 08:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:03.205 08:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:03.205 08:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:03.205 "name": "raid_bdev1", 00:17:03.205 "uuid": "2570bfdf-395e-4bdd-8a10-4d78c041fa09", 00:17:03.205 "strip_size_kb": 0, 00:17:03.205 "state": "online", 00:17:03.205 "raid_level": "raid1", 00:17:03.205 "superblock": true, 00:17:03.205 "num_base_bdevs": 3, 00:17:03.205 "num_base_bdevs_discovered": 3, 00:17:03.205 "num_base_bdevs_operational": 3, 00:17:03.205 "base_bdevs_list": [ 00:17:03.205 { 00:17:03.205 "name": "pt1", 00:17:03.205 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:03.205 "is_configured": true, 00:17:03.205 "data_offset": 2048, 00:17:03.205 "data_size": 63488 00:17:03.205 }, 00:17:03.205 { 00:17:03.205 "name": "pt2", 00:17:03.205 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:03.205 "is_configured": true, 00:17:03.205 "data_offset": 2048, 00:17:03.205 "data_size": 63488 00:17:03.205 }, 00:17:03.205 { 00:17:03.205 "name": "pt3", 00:17:03.205 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:03.205 "is_configured": true, 00:17:03.205 "data_offset": 2048, 00:17:03.205 "data_size": 63488 00:17:03.205 } 00:17:03.205 ] 00:17:03.205 }' 00:17:03.205 08:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:03.205 08:30:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # 
set +x 00:17:03.770 08:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:17:03.770 08:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:03.770 08:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:03.770 08:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:03.770 08:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:03.770 08:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:03.770 08:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:03.770 08:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:04.027 [2024-07-23 08:30:16.359802] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:04.027 08:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:04.027 "name": "raid_bdev1", 00:17:04.027 "aliases": [ 00:17:04.027 "2570bfdf-395e-4bdd-8a10-4d78c041fa09" 00:17:04.027 ], 00:17:04.027 "product_name": "Raid Volume", 00:17:04.027 "block_size": 512, 00:17:04.027 "num_blocks": 63488, 00:17:04.027 "uuid": "2570bfdf-395e-4bdd-8a10-4d78c041fa09", 00:17:04.027 "assigned_rate_limits": { 00:17:04.027 "rw_ios_per_sec": 0, 00:17:04.027 "rw_mbytes_per_sec": 0, 00:17:04.027 "r_mbytes_per_sec": 0, 00:17:04.027 "w_mbytes_per_sec": 0 00:17:04.027 }, 00:17:04.027 "claimed": false, 00:17:04.027 "zoned": false, 00:17:04.027 "supported_io_types": { 00:17:04.027 "read": true, 00:17:04.027 "write": true, 00:17:04.027 "unmap": false, 00:17:04.027 "flush": false, 00:17:04.027 "reset": true, 00:17:04.027 "nvme_admin": false, 00:17:04.027 "nvme_io": false, 00:17:04.027 "nvme_io_md": 
false, 00:17:04.027 "write_zeroes": true, 00:17:04.027 "zcopy": false, 00:17:04.027 "get_zone_info": false, 00:17:04.027 "zone_management": false, 00:17:04.027 "zone_append": false, 00:17:04.027 "compare": false, 00:17:04.027 "compare_and_write": false, 00:17:04.027 "abort": false, 00:17:04.027 "seek_hole": false, 00:17:04.027 "seek_data": false, 00:17:04.027 "copy": false, 00:17:04.027 "nvme_iov_md": false 00:17:04.027 }, 00:17:04.027 "memory_domains": [ 00:17:04.027 { 00:17:04.027 "dma_device_id": "system", 00:17:04.027 "dma_device_type": 1 00:17:04.027 }, 00:17:04.027 { 00:17:04.027 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:04.027 "dma_device_type": 2 00:17:04.027 }, 00:17:04.027 { 00:17:04.027 "dma_device_id": "system", 00:17:04.027 "dma_device_type": 1 00:17:04.027 }, 00:17:04.027 { 00:17:04.027 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:04.027 "dma_device_type": 2 00:17:04.027 }, 00:17:04.027 { 00:17:04.027 "dma_device_id": "system", 00:17:04.027 "dma_device_type": 1 00:17:04.027 }, 00:17:04.027 { 00:17:04.027 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:04.027 "dma_device_type": 2 00:17:04.027 } 00:17:04.027 ], 00:17:04.027 "driver_specific": { 00:17:04.027 "raid": { 00:17:04.027 "uuid": "2570bfdf-395e-4bdd-8a10-4d78c041fa09", 00:17:04.027 "strip_size_kb": 0, 00:17:04.027 "state": "online", 00:17:04.027 "raid_level": "raid1", 00:17:04.027 "superblock": true, 00:17:04.027 "num_base_bdevs": 3, 00:17:04.027 "num_base_bdevs_discovered": 3, 00:17:04.027 "num_base_bdevs_operational": 3, 00:17:04.027 "base_bdevs_list": [ 00:17:04.027 { 00:17:04.027 "name": "pt1", 00:17:04.027 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:04.027 "is_configured": true, 00:17:04.027 "data_offset": 2048, 00:17:04.027 "data_size": 63488 00:17:04.027 }, 00:17:04.027 { 00:17:04.028 "name": "pt2", 00:17:04.028 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:04.028 "is_configured": true, 00:17:04.028 "data_offset": 2048, 00:17:04.028 "data_size": 63488 
00:17:04.028 }, 00:17:04.028 { 00:17:04.028 "name": "pt3", 00:17:04.028 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:04.028 "is_configured": true, 00:17:04.028 "data_offset": 2048, 00:17:04.028 "data_size": 63488 00:17:04.028 } 00:17:04.028 ] 00:17:04.028 } 00:17:04.028 } 00:17:04.028 }' 00:17:04.028 08:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:04.028 08:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:04.028 pt2 00:17:04.028 pt3' 00:17:04.028 08:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:04.028 08:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:04.028 08:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:04.285 08:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:04.285 "name": "pt1", 00:17:04.285 "aliases": [ 00:17:04.285 "00000000-0000-0000-0000-000000000001" 00:17:04.285 ], 00:17:04.285 "product_name": "passthru", 00:17:04.285 "block_size": 512, 00:17:04.285 "num_blocks": 65536, 00:17:04.285 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:04.285 "assigned_rate_limits": { 00:17:04.285 "rw_ios_per_sec": 0, 00:17:04.285 "rw_mbytes_per_sec": 0, 00:17:04.285 "r_mbytes_per_sec": 0, 00:17:04.285 "w_mbytes_per_sec": 0 00:17:04.285 }, 00:17:04.285 "claimed": true, 00:17:04.285 "claim_type": "exclusive_write", 00:17:04.285 "zoned": false, 00:17:04.285 "supported_io_types": { 00:17:04.285 "read": true, 00:17:04.285 "write": true, 00:17:04.285 "unmap": true, 00:17:04.285 "flush": true, 00:17:04.285 "reset": true, 00:17:04.285 "nvme_admin": false, 00:17:04.285 "nvme_io": false, 00:17:04.285 "nvme_io_md": false, 00:17:04.285 
"write_zeroes": true, 00:17:04.285 "zcopy": true, 00:17:04.285 "get_zone_info": false, 00:17:04.285 "zone_management": false, 00:17:04.285 "zone_append": false, 00:17:04.285 "compare": false, 00:17:04.285 "compare_and_write": false, 00:17:04.285 "abort": true, 00:17:04.285 "seek_hole": false, 00:17:04.285 "seek_data": false, 00:17:04.285 "copy": true, 00:17:04.285 "nvme_iov_md": false 00:17:04.285 }, 00:17:04.285 "memory_domains": [ 00:17:04.285 { 00:17:04.285 "dma_device_id": "system", 00:17:04.285 "dma_device_type": 1 00:17:04.285 }, 00:17:04.285 { 00:17:04.285 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:04.285 "dma_device_type": 2 00:17:04.285 } 00:17:04.285 ], 00:17:04.285 "driver_specific": { 00:17:04.285 "passthru": { 00:17:04.285 "name": "pt1", 00:17:04.285 "base_bdev_name": "malloc1" 00:17:04.285 } 00:17:04.285 } 00:17:04.285 }' 00:17:04.285 08:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:04.285 08:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:04.285 08:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:04.285 08:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:04.285 08:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:04.285 08:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:04.285 08:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:04.285 08:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:04.543 08:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:04.543 08:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:04.543 08:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:04.543 08:30:16 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:04.543 08:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:04.543 08:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:04.543 08:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:04.800 08:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:04.800 "name": "pt2", 00:17:04.800 "aliases": [ 00:17:04.800 "00000000-0000-0000-0000-000000000002" 00:17:04.800 ], 00:17:04.800 "product_name": "passthru", 00:17:04.800 "block_size": 512, 00:17:04.800 "num_blocks": 65536, 00:17:04.800 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:04.800 "assigned_rate_limits": { 00:17:04.800 "rw_ios_per_sec": 0, 00:17:04.800 "rw_mbytes_per_sec": 0, 00:17:04.800 "r_mbytes_per_sec": 0, 00:17:04.800 "w_mbytes_per_sec": 0 00:17:04.800 }, 00:17:04.800 "claimed": true, 00:17:04.800 "claim_type": "exclusive_write", 00:17:04.800 "zoned": false, 00:17:04.800 "supported_io_types": { 00:17:04.800 "read": true, 00:17:04.800 "write": true, 00:17:04.800 "unmap": true, 00:17:04.800 "flush": true, 00:17:04.800 "reset": true, 00:17:04.800 "nvme_admin": false, 00:17:04.800 "nvme_io": false, 00:17:04.800 "nvme_io_md": false, 00:17:04.800 "write_zeroes": true, 00:17:04.800 "zcopy": true, 00:17:04.800 "get_zone_info": false, 00:17:04.800 "zone_management": false, 00:17:04.800 "zone_append": false, 00:17:04.800 "compare": false, 00:17:04.800 "compare_and_write": false, 00:17:04.800 "abort": true, 00:17:04.800 "seek_hole": false, 00:17:04.800 "seek_data": false, 00:17:04.800 "copy": true, 00:17:04.800 "nvme_iov_md": false 00:17:04.800 }, 00:17:04.800 "memory_domains": [ 00:17:04.800 { 00:17:04.800 "dma_device_id": "system", 00:17:04.800 "dma_device_type": 1 00:17:04.800 }, 00:17:04.800 { 00:17:04.800 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:04.800 "dma_device_type": 2 00:17:04.800 } 00:17:04.800 ], 00:17:04.800 "driver_specific": { 00:17:04.800 "passthru": { 00:17:04.800 "name": "pt2", 00:17:04.800 "base_bdev_name": "malloc2" 00:17:04.800 } 00:17:04.800 } 00:17:04.800 }' 00:17:04.800 08:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:04.800 08:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:04.800 08:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:04.800 08:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:04.800 08:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:04.800 08:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:04.800 08:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:04.800 08:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:05.058 08:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:05.058 08:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:05.058 08:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:05.058 08:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:05.058 08:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:05.058 08:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:05.058 08:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:05.316 08:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:05.316 "name": "pt3", 
00:17:05.316 "aliases": [ 00:17:05.316 "00000000-0000-0000-0000-000000000003" 00:17:05.316 ], 00:17:05.316 "product_name": "passthru", 00:17:05.316 "block_size": 512, 00:17:05.316 "num_blocks": 65536, 00:17:05.316 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:05.316 "assigned_rate_limits": { 00:17:05.316 "rw_ios_per_sec": 0, 00:17:05.316 "rw_mbytes_per_sec": 0, 00:17:05.316 "r_mbytes_per_sec": 0, 00:17:05.316 "w_mbytes_per_sec": 0 00:17:05.316 }, 00:17:05.316 "claimed": true, 00:17:05.316 "claim_type": "exclusive_write", 00:17:05.316 "zoned": false, 00:17:05.316 "supported_io_types": { 00:17:05.316 "read": true, 00:17:05.316 "write": true, 00:17:05.316 "unmap": true, 00:17:05.316 "flush": true, 00:17:05.316 "reset": true, 00:17:05.316 "nvme_admin": false, 00:17:05.316 "nvme_io": false, 00:17:05.316 "nvme_io_md": false, 00:17:05.316 "write_zeroes": true, 00:17:05.316 "zcopy": true, 00:17:05.316 "get_zone_info": false, 00:17:05.316 "zone_management": false, 00:17:05.316 "zone_append": false, 00:17:05.316 "compare": false, 00:17:05.316 "compare_and_write": false, 00:17:05.316 "abort": true, 00:17:05.316 "seek_hole": false, 00:17:05.316 "seek_data": false, 00:17:05.316 "copy": true, 00:17:05.316 "nvme_iov_md": false 00:17:05.316 }, 00:17:05.316 "memory_domains": [ 00:17:05.316 { 00:17:05.316 "dma_device_id": "system", 00:17:05.316 "dma_device_type": 1 00:17:05.316 }, 00:17:05.316 { 00:17:05.316 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:05.316 "dma_device_type": 2 00:17:05.316 } 00:17:05.316 ], 00:17:05.316 "driver_specific": { 00:17:05.316 "passthru": { 00:17:05.316 "name": "pt3", 00:17:05.316 "base_bdev_name": "malloc3" 00:17:05.316 } 00:17:05.316 } 00:17:05.316 }' 00:17:05.316 08:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:05.316 08:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:05.316 08:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 
]] 00:17:05.316 08:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:05.316 08:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:05.317 08:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:05.317 08:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:05.317 08:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:05.317 08:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:05.574 08:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:05.574 08:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:05.574 08:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:05.574 08:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:05.574 08:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:17:05.574 [2024-07-23 08:30:18.072348] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:05.574 08:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 2570bfdf-395e-4bdd-8a10-4d78c041fa09 '!=' 2570bfdf-395e-4bdd-8a10-4d78c041fa09 ']' 00:17:05.574 08:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:17:05.574 08:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:05.574 08:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:17:05.574 08:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:05.832 
[2024-07-23 08:30:18.236544] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:17:05.832 08:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:05.832 08:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:05.832 08:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:05.832 08:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:05.832 08:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:05.832 08:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:05.832 08:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:05.832 08:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:05.832 08:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:05.832 08:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:05.832 08:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:05.832 08:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:06.090 08:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:06.090 "name": "raid_bdev1", 00:17:06.090 "uuid": "2570bfdf-395e-4bdd-8a10-4d78c041fa09", 00:17:06.090 "strip_size_kb": 0, 00:17:06.090 "state": "online", 00:17:06.090 "raid_level": "raid1", 00:17:06.090 "superblock": true, 00:17:06.090 "num_base_bdevs": 3, 00:17:06.090 "num_base_bdevs_discovered": 2, 00:17:06.090 "num_base_bdevs_operational": 2, 00:17:06.090 
"base_bdevs_list": [ 00:17:06.090 { 00:17:06.090 "name": null, 00:17:06.090 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:06.090 "is_configured": false, 00:17:06.090 "data_offset": 2048, 00:17:06.090 "data_size": 63488 00:17:06.090 }, 00:17:06.090 { 00:17:06.090 "name": "pt2", 00:17:06.090 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:06.090 "is_configured": true, 00:17:06.090 "data_offset": 2048, 00:17:06.090 "data_size": 63488 00:17:06.090 }, 00:17:06.090 { 00:17:06.090 "name": "pt3", 00:17:06.090 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:06.090 "is_configured": true, 00:17:06.090 "data_offset": 2048, 00:17:06.090 "data_size": 63488 00:17:06.090 } 00:17:06.090 ] 00:17:06.090 }' 00:17:06.090 08:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:06.090 08:30:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:06.655 08:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:06.655 [2024-07-23 08:30:19.050765] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:06.655 [2024-07-23 08:30:19.050803] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:06.655 [2024-07-23 08:30:19.050871] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:06.655 [2024-07-23 08:30:19.050923] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:06.655 [2024-07-23 08:30:19.050935] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000037880 name raid_bdev1, state offline 00:17:06.655 08:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:06.655 
08:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:17:06.913 08:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:17:06.913 08:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:17:06.913 08:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:17:06.913 08:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:17:06.913 08:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:06.913 08:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:17:06.913 08:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:17:06.913 08:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:07.170 08:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:17:07.170 08:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:17:07.170 08:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:17:07.170 08:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:17:07.170 08:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:07.427 [2024-07-23 08:30:19.708470] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:07.427 [2024-07-23 08:30:19.708527] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:07.427 [2024-07-23 
08:30:19.708544] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000038480 00:17:07.427 [2024-07-23 08:30:19.708554] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:07.427 [2024-07-23 08:30:19.710532] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:07.427 [2024-07-23 08:30:19.710563] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:07.427 [2024-07-23 08:30:19.710647] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:07.427 [2024-07-23 08:30:19.710696] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:07.427 pt2 00:17:07.427 08:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:17:07.427 08:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:07.427 08:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:07.427 08:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:07.427 08:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:07.427 08:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:07.427 08:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:07.427 08:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:07.427 08:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:07.427 08:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:07.427 08:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:07.427 08:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:07.427 08:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:07.427 "name": "raid_bdev1", 00:17:07.427 "uuid": "2570bfdf-395e-4bdd-8a10-4d78c041fa09", 00:17:07.427 "strip_size_kb": 0, 00:17:07.427 "state": "configuring", 00:17:07.427 "raid_level": "raid1", 00:17:07.427 "superblock": true, 00:17:07.427 "num_base_bdevs": 3, 00:17:07.427 "num_base_bdevs_discovered": 1, 00:17:07.427 "num_base_bdevs_operational": 2, 00:17:07.427 "base_bdevs_list": [ 00:17:07.427 { 00:17:07.427 "name": null, 00:17:07.427 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:07.427 "is_configured": false, 00:17:07.427 "data_offset": 2048, 00:17:07.427 "data_size": 63488 00:17:07.427 }, 00:17:07.427 { 00:17:07.427 "name": "pt2", 00:17:07.427 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:07.428 "is_configured": true, 00:17:07.428 "data_offset": 2048, 00:17:07.428 "data_size": 63488 00:17:07.428 }, 00:17:07.428 { 00:17:07.428 "name": null, 00:17:07.428 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:07.428 "is_configured": false, 00:17:07.428 "data_offset": 2048, 00:17:07.428 "data_size": 63488 00:17:07.428 } 00:17:07.428 ] 00:17:07.428 }' 00:17:07.428 08:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:07.428 08:30:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:07.993 08:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:17:07.993 08:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:17:07.993 08:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=2 00:17:07.993 08:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:08.251 [2024-07-23 08:30:20.558724] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:08.251 [2024-07-23 08:30:20.558784] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:08.251 [2024-07-23 08:30:20.558803] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000038d80 00:17:08.251 [2024-07-23 08:30:20.558813] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:08.251 [2024-07-23 08:30:20.559264] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:08.251 [2024-07-23 08:30:20.559285] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:08.251 [2024-07-23 08:30:20.559360] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:08.251 [2024-07-23 08:30:20.559385] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:08.251 [2024-07-23 08:30:20.559518] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000038a80 00:17:08.251 [2024-07-23 08:30:20.559532] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:08.251 [2024-07-23 08:30:20.559768] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c060 00:17:08.251 [2024-07-23 08:30:20.559954] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000038a80 00:17:08.251 [2024-07-23 08:30:20.559964] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000038a80 00:17:08.251 [2024-07-23 08:30:20.560106] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:08.251 pt3 00:17:08.251 08:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 
2 00:17:08.251 08:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:08.251 08:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:08.251 08:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:08.251 08:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:08.251 08:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:08.251 08:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:08.251 08:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:08.251 08:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:08.251 08:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:08.251 08:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:08.251 08:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:08.251 08:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:08.251 "name": "raid_bdev1", 00:17:08.251 "uuid": "2570bfdf-395e-4bdd-8a10-4d78c041fa09", 00:17:08.251 "strip_size_kb": 0, 00:17:08.251 "state": "online", 00:17:08.251 "raid_level": "raid1", 00:17:08.251 "superblock": true, 00:17:08.251 "num_base_bdevs": 3, 00:17:08.251 "num_base_bdevs_discovered": 2, 00:17:08.251 "num_base_bdevs_operational": 2, 00:17:08.251 "base_bdevs_list": [ 00:17:08.251 { 00:17:08.251 "name": null, 00:17:08.251 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:08.251 "is_configured": false, 00:17:08.251 "data_offset": 2048, 00:17:08.251 "data_size": 63488 
00:17:08.251 }, 00:17:08.251 { 00:17:08.251 "name": "pt2", 00:17:08.251 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:08.251 "is_configured": true, 00:17:08.251 "data_offset": 2048, 00:17:08.251 "data_size": 63488 00:17:08.251 }, 00:17:08.251 { 00:17:08.251 "name": "pt3", 00:17:08.251 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:08.251 "is_configured": true, 00:17:08.251 "data_offset": 2048, 00:17:08.251 "data_size": 63488 00:17:08.251 } 00:17:08.251 ] 00:17:08.251 }' 00:17:08.251 08:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:08.251 08:30:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:08.817 08:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:09.074 [2024-07-23 08:30:21.396904] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:09.075 [2024-07-23 08:30:21.396934] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:09.075 [2024-07-23 08:30:21.397003] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:09.075 [2024-07-23 08:30:21.397060] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:09.075 [2024-07-23 08:30:21.397070] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000038a80 name raid_bdev1, state offline 00:17:09.075 08:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:09.075 08:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:17:09.075 08:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:17:09.075 08:30:21 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:17:09.075 08:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 3 -gt 2 ']' 00:17:09.075 08:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=2 00:17:09.075 08:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:09.332 08:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:09.589 [2024-07-23 08:30:21.906221] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:09.589 [2024-07-23 08:30:21.906279] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:09.589 [2024-07-23 08:30:21.906301] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000039080 00:17:09.589 [2024-07-23 08:30:21.906311] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:09.589 [2024-07-23 08:30:21.908294] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:09.589 [2024-07-23 08:30:21.908325] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:09.589 [2024-07-23 08:30:21.908409] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:17:09.589 [2024-07-23 08:30:21.908450] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:09.590 [2024-07-23 08:30:21.908599] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:17:09.590 [2024-07-23 08:30:21.908620] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:09.590 [2024-07-23 08:30:21.908637] bdev_raid.c: 
378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000039680 name raid_bdev1, state configuring 00:17:09.590 [2024-07-23 08:30:21.908703] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:09.590 pt1 00:17:09.590 08:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 3 -gt 2 ']' 00:17:09.590 08:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:17:09.590 08:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:09.590 08:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:09.590 08:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:09.590 08:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:09.590 08:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:09.590 08:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:09.590 08:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:09.590 08:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:09.590 08:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:09.590 08:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:09.590 08:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:09.590 08:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:09.590 "name": "raid_bdev1", 00:17:09.590 "uuid": "2570bfdf-395e-4bdd-8a10-4d78c041fa09", 00:17:09.590 "strip_size_kb": 
0, 00:17:09.590 "state": "configuring", 00:17:09.590 "raid_level": "raid1", 00:17:09.590 "superblock": true, 00:17:09.590 "num_base_bdevs": 3, 00:17:09.590 "num_base_bdevs_discovered": 1, 00:17:09.590 "num_base_bdevs_operational": 2, 00:17:09.590 "base_bdevs_list": [ 00:17:09.590 { 00:17:09.590 "name": null, 00:17:09.590 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:09.590 "is_configured": false, 00:17:09.590 "data_offset": 2048, 00:17:09.590 "data_size": 63488 00:17:09.590 }, 00:17:09.590 { 00:17:09.590 "name": "pt2", 00:17:09.590 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:09.590 "is_configured": true, 00:17:09.590 "data_offset": 2048, 00:17:09.590 "data_size": 63488 00:17:09.590 }, 00:17:09.590 { 00:17:09.590 "name": null, 00:17:09.590 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:09.590 "is_configured": false, 00:17:09.590 "data_offset": 2048, 00:17:09.590 "data_size": 63488 00:17:09.590 } 00:17:09.590 ] 00:17:09.590 }' 00:17:09.590 08:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:09.590 08:30:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:10.237 08:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:17:10.237 08:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:17:10.237 08:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:17:10.237 08:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:10.494 [2024-07-23 08:30:22.888824] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:10.494 [2024-07-23 
08:30:22.888877] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:10.494 [2024-07-23 08:30:22.888896] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000039c80 00:17:10.494 [2024-07-23 08:30:22.888904] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:10.494 [2024-07-23 08:30:22.889328] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:10.494 [2024-07-23 08:30:22.889348] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:10.495 [2024-07-23 08:30:22.889416] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:10.495 [2024-07-23 08:30:22.889436] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:10.495 [2024-07-23 08:30:22.889566] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000039980 00:17:10.495 [2024-07-23 08:30:22.889575] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:10.495 [2024-07-23 08:30:22.889818] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c130 00:17:10.495 [2024-07-23 08:30:22.890015] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000039980 00:17:10.495 [2024-07-23 08:30:22.890027] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000039980 00:17:10.495 [2024-07-23 08:30:22.890164] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:10.495 pt3 00:17:10.495 08:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:10.495 08:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:10.495 08:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:10.495 
08:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:10.495 08:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:10.495 08:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:10.495 08:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:10.495 08:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:10.495 08:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:10.495 08:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:10.495 08:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:10.495 08:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:10.753 08:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:10.753 "name": "raid_bdev1", 00:17:10.753 "uuid": "2570bfdf-395e-4bdd-8a10-4d78c041fa09", 00:17:10.753 "strip_size_kb": 0, 00:17:10.753 "state": "online", 00:17:10.753 "raid_level": "raid1", 00:17:10.753 "superblock": true, 00:17:10.753 "num_base_bdevs": 3, 00:17:10.753 "num_base_bdevs_discovered": 2, 00:17:10.753 "num_base_bdevs_operational": 2, 00:17:10.753 "base_bdevs_list": [ 00:17:10.753 { 00:17:10.753 "name": null, 00:17:10.753 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:10.753 "is_configured": false, 00:17:10.753 "data_offset": 2048, 00:17:10.753 "data_size": 63488 00:17:10.753 }, 00:17:10.753 { 00:17:10.753 "name": "pt2", 00:17:10.753 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:10.753 "is_configured": true, 00:17:10.753 "data_offset": 2048, 00:17:10.753 "data_size": 63488 00:17:10.753 }, 00:17:10.753 
{ 00:17:10.753 "name": "pt3", 00:17:10.753 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:10.753 "is_configured": true, 00:17:10.753 "data_offset": 2048, 00:17:10.753 "data_size": 63488 00:17:10.753 } 00:17:10.753 ] 00:17:10.753 }' 00:17:10.753 08:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:10.753 08:30:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:11.318 08:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:17:11.318 08:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:17:11.318 08:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:17:11.318 08:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:11.318 08:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:17:11.576 [2024-07-23 08:30:23.919817] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:11.576 08:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 2570bfdf-395e-4bdd-8a10-4d78c041fa09 '!=' 2570bfdf-395e-4bdd-8a10-4d78c041fa09 ']' 00:17:11.576 08:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1463880 00:17:11.576 08:30:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1463880 ']' 00:17:11.576 08:30:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1463880 00:17:11.576 08:30:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:17:11.576 08:30:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 
00:17:11.576 08:30:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1463880 00:17:11.576 08:30:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:11.576 08:30:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:11.576 08:30:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1463880' 00:17:11.576 killing process with pid 1463880 00:17:11.576 08:30:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1463880 00:17:11.576 [2024-07-23 08:30:23.969505] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:11.576 [2024-07-23 08:30:23.969589] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:11.576 [2024-07-23 08:30:23.969654] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:11.576 [2024-07-23 08:30:23.969667] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000039980 name raid_bdev1, state offline 00:17:11.576 08:30:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1463880 00:17:11.833 [2024-07-23 08:30:24.238356] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:13.208 08:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:17:13.208 00:17:13.208 real 0m18.134s 00:17:13.208 user 0m32.190s 00:17:13.208 sys 0m2.730s 00:17:13.208 08:30:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:13.208 08:30:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:13.208 ************************************ 00:17:13.208 END TEST raid_superblock_test 00:17:13.208 ************************************ 00:17:13.208 08:30:25 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:13.208 
08:30:25 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 3 read 00:17:13.208 08:30:25 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:13.208 08:30:25 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:13.208 08:30:25 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:13.208 ************************************ 00:17:13.208 START TEST raid_read_error_test 00:17:13.208 ************************************ 00:17:13.208 08:30:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 read 00:17:13.208 08:30:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:17:13.208 08:30:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:17:13.208 08:30:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:17:13.208 08:30:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:17:13.208 08:30:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:13.208 08:30:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:17:13.208 08:30:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:13.208 08:30:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:13.208 08:30:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:17:13.208 08:30:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:13.208 08:30:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:13.208 08:30:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:17:13.208 08:30:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:13.208 08:30:25 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:13.208 08:30:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:17:13.208 08:30:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:17:13.208 08:30:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:17:13.208 08:30:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:17:13.208 08:30:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:17:13.208 08:30:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:17:13.208 08:30:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:17:13.208 08:30:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:17:13.208 08:30:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:17:13.208 08:30:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:17:13.208 08:30:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.KoaTUQsMi5 00:17:13.208 08:30:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1467766 00:17:13.208 08:30:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1467766 /var/tmp/spdk-raid.sock 00:17:13.208 08:30:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:13.208 08:30:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1467766 ']' 00:17:13.208 08:30:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:13.208 08:30:25 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:13.208 08:30:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:13.208 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:13.208 08:30:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:13.208 08:30:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:13.208 [2024-07-23 08:30:25.655357] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:17:13.208 [2024-07-23 08:30:25.655450] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1467766 ] 00:17:13.466 [2024-07-23 08:30:25.778025] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:13.724 [2024-07-23 08:30:25.996418] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:13.982 [2024-07-23 08:30:26.243757] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:13.982 [2024-07-23 08:30:26.243794] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:13.982 08:30:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:13.982 08:30:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:17:13.982 08:30:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:13.982 08:30:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:14.240 
BaseBdev1_malloc 00:17:14.240 08:30:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:14.498 true 00:17:14.498 08:30:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:14.498 [2024-07-23 08:30:26.957548] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:14.498 [2024-07-23 08:30:26.957604] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:14.498 [2024-07-23 08:30:26.957632] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034b80 00:17:14.498 [2024-07-23 08:30:26.957643] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:14.498 [2024-07-23 08:30:26.959648] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:14.498 [2024-07-23 08:30:26.959679] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:14.498 BaseBdev1 00:17:14.498 08:30:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:14.498 08:30:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:14.756 BaseBdev2_malloc 00:17:14.756 08:30:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:15.013 true 00:17:15.013 08:30:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:15.013 [2024-07-23 08:30:27.486970] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:15.013 [2024-07-23 08:30:27.487019] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:15.013 [2024-07-23 08:30:27.487037] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035a80 00:17:15.013 [2024-07-23 08:30:27.487050] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:15.013 [2024-07-23 08:30:27.489022] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:15.013 [2024-07-23 08:30:27.489053] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:15.013 BaseBdev2 00:17:15.013 08:30:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:15.013 08:30:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:17:15.271 BaseBdev3_malloc 00:17:15.271 08:30:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:17:15.529 true 00:17:15.529 08:30:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:17:15.529 [2024-07-23 08:30:28.017967] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:17:15.529 [2024-07-23 08:30:28.018017] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:15.529 [2024-07-23 08:30:28.018037] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device 
created at: 0x0x616000036980 00:17:15.529 [2024-07-23 08:30:28.018046] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:15.529 [2024-07-23 08:30:28.019930] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:15.530 [2024-07-23 08:30:28.019960] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:17:15.530 BaseBdev3 00:17:15.530 08:30:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:17:15.788 [2024-07-23 08:30:28.182439] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:15.788 [2024-07-23 08:30:28.184016] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:15.788 [2024-07-23 08:30:28.184084] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:15.788 [2024-07-23 08:30:28.184296] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000036f80 00:17:15.788 [2024-07-23 08:30:28.184308] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:15.788 [2024-07-23 08:30:28.184552] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bec0 00:17:15.788 [2024-07-23 08:30:28.184753] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000036f80 00:17:15.788 [2024-07-23 08:30:28.184769] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000036f80 00:17:15.788 [2024-07-23 08:30:28.184919] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:15.788 08:30:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:15.788 08:30:28 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:15.788 08:30:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:15.788 08:30:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:15.788 08:30:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:15.788 08:30:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:15.788 08:30:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:15.788 08:30:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:15.788 08:30:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:15.788 08:30:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:15.788 08:30:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:15.788 08:30:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:16.046 08:30:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:16.046 "name": "raid_bdev1", 00:17:16.046 "uuid": "be4803f4-2fdb-49f3-8c7f-a2e3f30406e6", 00:17:16.046 "strip_size_kb": 0, 00:17:16.046 "state": "online", 00:17:16.046 "raid_level": "raid1", 00:17:16.046 "superblock": true, 00:17:16.046 "num_base_bdevs": 3, 00:17:16.046 "num_base_bdevs_discovered": 3, 00:17:16.046 "num_base_bdevs_operational": 3, 00:17:16.046 "base_bdevs_list": [ 00:17:16.046 { 00:17:16.046 "name": "BaseBdev1", 00:17:16.046 "uuid": "1bca5201-b8cf-5e1c-b04a-08e8dc57fce0", 00:17:16.046 "is_configured": true, 00:17:16.046 "data_offset": 2048, 00:17:16.046 "data_size": 63488 00:17:16.046 }, 
00:17:16.046 { 00:17:16.046 "name": "BaseBdev2", 00:17:16.046 "uuid": "e4a6259f-a873-572b-95cc-2a080e71e1fb", 00:17:16.046 "is_configured": true, 00:17:16.046 "data_offset": 2048, 00:17:16.046 "data_size": 63488 00:17:16.046 }, 00:17:16.046 { 00:17:16.046 "name": "BaseBdev3", 00:17:16.046 "uuid": "fcdd5534-d85c-563f-a4c2-8d33bb4b1e31", 00:17:16.046 "is_configured": true, 00:17:16.046 "data_offset": 2048, 00:17:16.046 "data_size": 63488 00:17:16.046 } 00:17:16.046 ] 00:17:16.046 }' 00:17:16.046 08:30:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:16.046 08:30:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:16.304 08:30:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:17:16.304 08:30:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:16.562 [2024-07-23 08:30:28.897750] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c060 00:17:17.496 08:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:17:17.496 08:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:17:17.496 08:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:17:17.496 08:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:17:17.496 08:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:17:17.496 08:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:17.496 08:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=raid_bdev1 00:17:17.496 08:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:17.496 08:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:17.496 08:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:17.496 08:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:17.496 08:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:17.496 08:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:17.496 08:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:17.496 08:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:17.497 08:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:17.497 08:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:17.755 08:30:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:17.755 "name": "raid_bdev1", 00:17:17.755 "uuid": "be4803f4-2fdb-49f3-8c7f-a2e3f30406e6", 00:17:17.755 "strip_size_kb": 0, 00:17:17.755 "state": "online", 00:17:17.755 "raid_level": "raid1", 00:17:17.755 "superblock": true, 00:17:17.755 "num_base_bdevs": 3, 00:17:17.755 "num_base_bdevs_discovered": 3, 00:17:17.755 "num_base_bdevs_operational": 3, 00:17:17.755 "base_bdevs_list": [ 00:17:17.755 { 00:17:17.755 "name": "BaseBdev1", 00:17:17.755 "uuid": "1bca5201-b8cf-5e1c-b04a-08e8dc57fce0", 00:17:17.755 "is_configured": true, 00:17:17.755 "data_offset": 2048, 00:17:17.755 "data_size": 63488 00:17:17.755 }, 00:17:17.755 { 00:17:17.755 "name": "BaseBdev2", 00:17:17.755 "uuid": 
"e4a6259f-a873-572b-95cc-2a080e71e1fb", 00:17:17.755 "is_configured": true, 00:17:17.755 "data_offset": 2048, 00:17:17.755 "data_size": 63488 00:17:17.755 }, 00:17:17.755 { 00:17:17.755 "name": "BaseBdev3", 00:17:17.755 "uuid": "fcdd5534-d85c-563f-a4c2-8d33bb4b1e31", 00:17:17.755 "is_configured": true, 00:17:17.755 "data_offset": 2048, 00:17:17.755 "data_size": 63488 00:17:17.755 } 00:17:17.755 ] 00:17:17.755 }' 00:17:17.755 08:30:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:17.755 08:30:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:18.321 08:30:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:18.579 [2024-07-23 08:30:30.841386] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:18.579 [2024-07-23 08:30:30.841418] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:18.579 [2024-07-23 08:30:30.843885] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:18.580 [2024-07-23 08:30:30.843927] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:18.580 [2024-07-23 08:30:30.844026] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:18.580 [2024-07-23 08:30:30.844036] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036f80 name raid_bdev1, state offline 00:17:18.580 0 00:17:18.580 08:30:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1467766 00:17:18.580 08:30:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1467766 ']' 00:17:18.580 08:30:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1467766 00:17:18.580 08:30:30 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@953 -- # uname 00:17:18.580 08:30:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:18.580 08:30:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1467766 00:17:18.580 08:30:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:18.580 08:30:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:18.580 08:30:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1467766' 00:17:18.580 killing process with pid 1467766 00:17:18.580 08:30:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1467766 00:17:18.580 [2024-07-23 08:30:30.899707] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:18.580 08:30:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1467766 00:17:18.580 [2024-07-23 08:30:31.063057] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:19.954 08:30:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.KoaTUQsMi5 00:17:19.954 08:30:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:17:19.954 08:30:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:17:19.954 08:30:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:17:19.954 08:30:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:17:19.954 08:30:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:19.954 08:30:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:17:19.954 08:30:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:17:19.954 00:17:19.954 real 0m6.866s 00:17:19.954 user 0m9.809s 00:17:19.954 sys 
0m0.854s 00:17:19.954 08:30:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:19.954 08:30:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:19.954 ************************************ 00:17:19.954 END TEST raid_read_error_test 00:17:19.954 ************************************ 00:17:19.954 08:30:32 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:19.954 08:30:32 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 3 write 00:17:19.954 08:30:32 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:19.954 08:30:32 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:19.954 08:30:32 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:20.212 ************************************ 00:17:20.212 START TEST raid_write_error_test 00:17:20.212 ************************************ 00:17:20.212 08:30:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 write 00:17:20.212 08:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:17:20.212 08:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:17:20.212 08:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:17:20.212 08:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:17:20.212 08:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:20.212 08:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:17:20.212 08:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:20.212 08:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:20.212 08:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # 
echo BaseBdev2 00:17:20.212 08:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:20.212 08:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:20.212 08:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:17:20.212 08:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:20.212 08:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:20.212 08:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:17:20.212 08:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:17:20.212 08:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:17:20.212 08:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:17:20.212 08:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:17:20.212 08:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:17:20.212 08:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:17:20.212 08:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:17:20.212 08:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:17:20.212 08:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:17:20.212 08:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.2QTNKeRggm 00:17:20.212 08:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1469180 00:17:20.212 08:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1469180 /var/tmp/spdk-raid.sock 00:17:20.212 08:30:32 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:20.212 08:30:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1469180 ']' 00:17:20.212 08:30:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:20.212 08:30:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:20.212 08:30:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:20.212 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:20.212 08:30:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:20.212 08:30:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:20.212 [2024-07-23 08:30:32.602960] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:17:20.212 [2024-07-23 08:30:32.603063] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1469180 ] 00:17:20.212 [2024-07-23 08:30:32.729078] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:20.470 [2024-07-23 08:30:32.959990] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:20.728 [2024-07-23 08:30:33.219414] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:20.728 [2024-07-23 08:30:33.219457] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:20.987 08:30:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:20.987 08:30:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:17:20.987 08:30:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:20.987 08:30:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:21.245 BaseBdev1_malloc 00:17:21.245 08:30:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:21.245 true 00:17:21.503 08:30:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:21.503 [2024-07-23 08:30:33.947738] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:21.503 [2024-07-23 08:30:33.947830] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:17:21.503 [2024-07-23 08:30:33.947854] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034b80 00:17:21.503 [2024-07-23 08:30:33.947865] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:21.503 [2024-07-23 08:30:33.949914] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:21.503 [2024-07-23 08:30:33.949947] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:21.503 BaseBdev1 00:17:21.503 08:30:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:21.503 08:30:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:21.760 BaseBdev2_malloc 00:17:21.760 08:30:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:22.018 true 00:17:22.018 08:30:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:22.018 [2024-07-23 08:30:34.498100] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:22.018 [2024-07-23 08:30:34.498166] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:22.018 [2024-07-23 08:30:34.498204] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035a80 00:17:22.018 [2024-07-23 08:30:34.498217] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:22.018 [2024-07-23 08:30:34.500274] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:22.018 [2024-07-23 
08:30:34.500304] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:22.018 BaseBdev2 00:17:22.018 08:30:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:22.018 08:30:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:17:22.277 BaseBdev3_malloc 00:17:22.277 08:30:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:17:22.535 true 00:17:22.535 08:30:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:17:22.535 [2024-07-23 08:30:35.050990] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:17:22.535 [2024-07-23 08:30:35.051059] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:22.535 [2024-07-23 08:30:35.051082] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036980 00:17:22.535 [2024-07-23 08:30:35.051094] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:22.535 [2024-07-23 08:30:35.053164] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:22.535 [2024-07-23 08:30:35.053196] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:17:22.794 BaseBdev3 00:17:22.794 08:30:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:17:22.794 [2024-07-23 
08:30:35.243554] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:22.794 [2024-07-23 08:30:35.245246] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:22.794 [2024-07-23 08:30:35.245321] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:22.794 [2024-07-23 08:30:35.245563] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000036f80 00:17:22.794 [2024-07-23 08:30:35.245576] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:22.794 [2024-07-23 08:30:35.245857] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bec0 00:17:22.794 [2024-07-23 08:30:35.246070] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000036f80 00:17:22.794 [2024-07-23 08:30:35.246086] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000036f80 00:17:22.794 [2024-07-23 08:30:35.246253] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:22.794 08:30:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:22.794 08:30:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:22.794 08:30:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:22.794 08:30:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:22.794 08:30:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:22.794 08:30:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:22.794 08:30:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:22.794 08:30:35 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:22.794 08:30:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:22.794 08:30:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:22.794 08:30:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:22.794 08:30:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:23.053 08:30:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:23.053 "name": "raid_bdev1", 00:17:23.053 "uuid": "06a38944-e239-4f55-9260-b047fc165266", 00:17:23.053 "strip_size_kb": 0, 00:17:23.053 "state": "online", 00:17:23.053 "raid_level": "raid1", 00:17:23.053 "superblock": true, 00:17:23.053 "num_base_bdevs": 3, 00:17:23.053 "num_base_bdevs_discovered": 3, 00:17:23.053 "num_base_bdevs_operational": 3, 00:17:23.053 "base_bdevs_list": [ 00:17:23.053 { 00:17:23.053 "name": "BaseBdev1", 00:17:23.053 "uuid": "84435735-3370-5d5a-8748-d04144795d75", 00:17:23.053 "is_configured": true, 00:17:23.053 "data_offset": 2048, 00:17:23.053 "data_size": 63488 00:17:23.053 }, 00:17:23.053 { 00:17:23.053 "name": "BaseBdev2", 00:17:23.053 "uuid": "62f765b8-bf58-5537-8085-9e322951fddc", 00:17:23.053 "is_configured": true, 00:17:23.053 "data_offset": 2048, 00:17:23.053 "data_size": 63488 00:17:23.053 }, 00:17:23.053 { 00:17:23.053 "name": "BaseBdev3", 00:17:23.053 "uuid": "02b3c8d6-ffc6-5d43-ac09-5ae541eed979", 00:17:23.053 "is_configured": true, 00:17:23.053 "data_offset": 2048, 00:17:23.053 "data_size": 63488 00:17:23.053 } 00:17:23.053 ] 00:17:23.053 }' 00:17:23.053 08:30:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:23.053 08:30:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:23.658 
08:30:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:17:23.658 08:30:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:23.658 [2024-07-23 08:30:35.999051] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c060 00:17:24.594 08:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:17:24.594 [2024-07-23 08:30:37.076928] bdev_raid.c:2247:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:17:24.594 [2024-07-23 08:30:37.076982] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:24.594 [2024-07-23 08:30:37.077210] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x60d00000c060 00:17:24.594 08:30:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:17:24.594 08:30:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:17:24.594 08:30:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:17:24.594 08:30:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=2 00:17:24.594 08:30:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:24.594 08:30:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:24.594 08:30:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:24.594 08:30:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:24.595 08:30:37 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:24.595 08:30:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:24.595 08:30:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:24.595 08:30:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:24.595 08:30:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:24.595 08:30:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:24.595 08:30:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:24.595 08:30:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:24.853 08:30:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:24.853 "name": "raid_bdev1", 00:17:24.853 "uuid": "06a38944-e239-4f55-9260-b047fc165266", 00:17:24.853 "strip_size_kb": 0, 00:17:24.853 "state": "online", 00:17:24.853 "raid_level": "raid1", 00:17:24.853 "superblock": true, 00:17:24.853 "num_base_bdevs": 3, 00:17:24.853 "num_base_bdevs_discovered": 2, 00:17:24.853 "num_base_bdevs_operational": 2, 00:17:24.853 "base_bdevs_list": [ 00:17:24.853 { 00:17:24.853 "name": null, 00:17:24.853 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:24.853 "is_configured": false, 00:17:24.853 "data_offset": 2048, 00:17:24.853 "data_size": 63488 00:17:24.853 }, 00:17:24.853 { 00:17:24.853 "name": "BaseBdev2", 00:17:24.853 "uuid": "62f765b8-bf58-5537-8085-9e322951fddc", 00:17:24.853 "is_configured": true, 00:17:24.853 "data_offset": 2048, 00:17:24.853 "data_size": 63488 00:17:24.853 }, 00:17:24.853 { 00:17:24.853 "name": "BaseBdev3", 00:17:24.853 "uuid": "02b3c8d6-ffc6-5d43-ac09-5ae541eed979", 
00:17:24.853 "is_configured": true, 00:17:24.853 "data_offset": 2048, 00:17:24.853 "data_size": 63488 00:17:24.853 } 00:17:24.853 ] 00:17:24.853 }' 00:17:24.853 08:30:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:24.853 08:30:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:25.419 08:30:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:25.677 [2024-07-23 08:30:37.960955] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:25.677 [2024-07-23 08:30:37.961000] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:25.677 [2024-07-23 08:30:37.963366] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:25.677 [2024-07-23 08:30:37.963405] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:25.677 [2024-07-23 08:30:37.963480] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:25.677 [2024-07-23 08:30:37.963493] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036f80 name raid_bdev1, state offline 00:17:25.677 0 00:17:25.677 08:30:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1469180 00:17:25.677 08:30:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1469180 ']' 00:17:25.677 08:30:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1469180 00:17:25.677 08:30:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:17:25.677 08:30:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:25.677 08:30:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 
1469180 00:17:25.677 08:30:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:25.677 08:30:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:25.677 08:30:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1469180' 00:17:25.677 killing process with pid 1469180 00:17:25.677 08:30:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1469180 00:17:25.677 [2024-07-23 08:30:38.019774] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:25.677 08:30:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1469180 00:17:25.677 [2024-07-23 08:30:38.186707] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:27.051 08:30:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.2QTNKeRggm 00:17:27.051 08:30:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:17:27.051 08:30:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:17:27.051 08:30:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:17:27.051 08:30:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:17:27.051 08:30:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:27.051 08:30:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:17:27.051 08:30:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:17:27.051 00:17:27.051 real 0m6.998s 00:17:27.051 user 0m9.946s 00:17:27.051 sys 0m1.019s 00:17:27.051 08:30:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:27.051 08:30:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:27.051 
************************************ 00:17:27.051 END TEST raid_write_error_test 00:17:27.051 ************************************ 00:17:27.051 08:30:39 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:27.051 08:30:39 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:17:27.051 08:30:39 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:17:27.051 08:30:39 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 4 false 00:17:27.051 08:30:39 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:27.051 08:30:39 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:27.051 08:30:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:27.309 ************************************ 00:17:27.309 START TEST raid_state_function_test 00:17:27.309 ************************************ 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 false 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= 
num_base_bdevs )) 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 
00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1470536 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1470536' 00:17:27.309 Process raid pid: 1470536 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1470536 /var/tmp/spdk-raid.sock 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1470536 ']' 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:27.309 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:27.309 08:30:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:27.309 [2024-07-23 08:30:39.658564] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:17:27.309 [2024-07-23 08:30:39.658655] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:27.309 [2024-07-23 08:30:39.783078] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:27.567 [2024-07-23 08:30:39.997224] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:27.825 [2024-07-23 08:30:40.276744] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:27.825 [2024-07-23 08:30:40.276771] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:28.082 08:30:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:28.082 08:30:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:17:28.082 08:30:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:28.082 [2024-07-23 08:30:40.583899] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:28.082 [2024-07-23 08:30:40.583951] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:28.082 [2024-07-23 08:30:40.583961] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:28.082 [2024-07-23 08:30:40.583972] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:28.082 [2024-07-23 08:30:40.583979] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:28.082 [2024-07-23 08:30:40.583989] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:17:28.082 [2024-07-23 08:30:40.583996] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:28.082 [2024-07-23 08:30:40.584005] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:28.082 08:30:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:28.082 08:30:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:28.339 08:30:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:28.339 08:30:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:28.339 08:30:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:28.339 08:30:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:28.339 08:30:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:28.339 08:30:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:28.339 08:30:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:28.339 08:30:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:28.339 08:30:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:28.339 08:30:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:28.339 08:30:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:28.339 "name": "Existed_Raid", 00:17:28.339 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:28.339 "strip_size_kb": 64, 
00:17:28.339 "state": "configuring", 00:17:28.339 "raid_level": "raid0", 00:17:28.339 "superblock": false, 00:17:28.339 "num_base_bdevs": 4, 00:17:28.339 "num_base_bdevs_discovered": 0, 00:17:28.339 "num_base_bdevs_operational": 4, 00:17:28.339 "base_bdevs_list": [ 00:17:28.339 { 00:17:28.339 "name": "BaseBdev1", 00:17:28.339 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:28.339 "is_configured": false, 00:17:28.339 "data_offset": 0, 00:17:28.339 "data_size": 0 00:17:28.339 }, 00:17:28.339 { 00:17:28.339 "name": "BaseBdev2", 00:17:28.339 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:28.339 "is_configured": false, 00:17:28.339 "data_offset": 0, 00:17:28.339 "data_size": 0 00:17:28.339 }, 00:17:28.339 { 00:17:28.339 "name": "BaseBdev3", 00:17:28.339 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:28.339 "is_configured": false, 00:17:28.339 "data_offset": 0, 00:17:28.339 "data_size": 0 00:17:28.339 }, 00:17:28.339 { 00:17:28.339 "name": "BaseBdev4", 00:17:28.339 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:28.339 "is_configured": false, 00:17:28.339 "data_offset": 0, 00:17:28.339 "data_size": 0 00:17:28.339 } 00:17:28.339 ] 00:17:28.339 }' 00:17:28.339 08:30:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:28.339 08:30:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:28.905 08:30:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:29.163 [2024-07-23 08:30:41.446040] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:29.163 [2024-07-23 08:30:41.446073] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034580 name Existed_Raid, state configuring 00:17:29.163 08:30:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:29.163 [2024-07-23 08:30:41.618523] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:29.163 [2024-07-23 08:30:41.618567] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:29.163 [2024-07-23 08:30:41.618575] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:29.163 [2024-07-23 08:30:41.618601] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:29.163 [2024-07-23 08:30:41.618616] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:29.163 [2024-07-23 08:30:41.618626] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:29.163 [2024-07-23 08:30:41.618633] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:29.163 [2024-07-23 08:30:41.618643] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:29.163 08:30:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:29.421 [2024-07-23 08:30:41.834976] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:29.421 BaseBdev1 00:17:29.421 08:30:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:17:29.421 08:30:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:29.421 08:30:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:29.421 08:30:41 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@899 -- # local i 00:17:29.421 08:30:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:29.421 08:30:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:29.421 08:30:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:29.678 08:30:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:29.678 [ 00:17:29.678 { 00:17:29.678 "name": "BaseBdev1", 00:17:29.678 "aliases": [ 00:17:29.678 "8bcccba8-80a5-4dbb-9e3f-962869fa31a3" 00:17:29.678 ], 00:17:29.678 "product_name": "Malloc disk", 00:17:29.678 "block_size": 512, 00:17:29.678 "num_blocks": 65536, 00:17:29.678 "uuid": "8bcccba8-80a5-4dbb-9e3f-962869fa31a3", 00:17:29.678 "assigned_rate_limits": { 00:17:29.678 "rw_ios_per_sec": 0, 00:17:29.678 "rw_mbytes_per_sec": 0, 00:17:29.678 "r_mbytes_per_sec": 0, 00:17:29.678 "w_mbytes_per_sec": 0 00:17:29.678 }, 00:17:29.678 "claimed": true, 00:17:29.678 "claim_type": "exclusive_write", 00:17:29.678 "zoned": false, 00:17:29.678 "supported_io_types": { 00:17:29.678 "read": true, 00:17:29.678 "write": true, 00:17:29.678 "unmap": true, 00:17:29.678 "flush": true, 00:17:29.678 "reset": true, 00:17:29.678 "nvme_admin": false, 00:17:29.678 "nvme_io": false, 00:17:29.678 "nvme_io_md": false, 00:17:29.678 "write_zeroes": true, 00:17:29.678 "zcopy": true, 00:17:29.678 "get_zone_info": false, 00:17:29.678 "zone_management": false, 00:17:29.678 "zone_append": false, 00:17:29.678 "compare": false, 00:17:29.678 "compare_and_write": false, 00:17:29.678 "abort": true, 00:17:29.678 "seek_hole": false, 00:17:29.678 "seek_data": false, 00:17:29.678 "copy": true, 00:17:29.678 "nvme_iov_md": false 
00:17:29.678 }, 00:17:29.678 "memory_domains": [ 00:17:29.678 { 00:17:29.678 "dma_device_id": "system", 00:17:29.678 "dma_device_type": 1 00:17:29.678 }, 00:17:29.678 { 00:17:29.678 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:29.678 "dma_device_type": 2 00:17:29.678 } 00:17:29.678 ], 00:17:29.678 "driver_specific": {} 00:17:29.678 } 00:17:29.678 ] 00:17:29.937 08:30:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:29.937 08:30:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:29.937 08:30:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:29.937 08:30:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:29.937 08:30:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:29.937 08:30:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:29.937 08:30:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:29.937 08:30:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:29.937 08:30:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:29.937 08:30:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:29.937 08:30:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:29.937 08:30:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:29.937 08:30:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:29.937 08:30:42 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:29.937 "name": "Existed_Raid", 00:17:29.937 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:29.937 "strip_size_kb": 64, 00:17:29.937 "state": "configuring", 00:17:29.937 "raid_level": "raid0", 00:17:29.937 "superblock": false, 00:17:29.937 "num_base_bdevs": 4, 00:17:29.937 "num_base_bdevs_discovered": 1, 00:17:29.937 "num_base_bdevs_operational": 4, 00:17:29.937 "base_bdevs_list": [ 00:17:29.937 { 00:17:29.937 "name": "BaseBdev1", 00:17:29.937 "uuid": "8bcccba8-80a5-4dbb-9e3f-962869fa31a3", 00:17:29.937 "is_configured": true, 00:17:29.937 "data_offset": 0, 00:17:29.937 "data_size": 65536 00:17:29.937 }, 00:17:29.937 { 00:17:29.937 "name": "BaseBdev2", 00:17:29.937 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:29.937 "is_configured": false, 00:17:29.937 "data_offset": 0, 00:17:29.937 "data_size": 0 00:17:29.937 }, 00:17:29.937 { 00:17:29.937 "name": "BaseBdev3", 00:17:29.937 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:29.937 "is_configured": false, 00:17:29.937 "data_offset": 0, 00:17:29.937 "data_size": 0 00:17:29.937 }, 00:17:29.937 { 00:17:29.937 "name": "BaseBdev4", 00:17:29.937 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:29.937 "is_configured": false, 00:17:29.937 "data_offset": 0, 00:17:29.937 "data_size": 0 00:17:29.937 } 00:17:29.937 ] 00:17:29.937 }' 00:17:29.937 08:30:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:29.937 08:30:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:30.503 08:30:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:30.761 [2024-07-23 08:30:43.026161] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:30.761 [2024-07-23 08:30:43.026209] bdev_raid.c: 
378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034880 name Existed_Raid, state configuring 00:17:30.761 08:30:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:30.761 [2024-07-23 08:30:43.194649] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:30.761 [2024-07-23 08:30:43.196276] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:30.761 [2024-07-23 08:30:43.196312] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:30.761 [2024-07-23 08:30:43.196322] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:30.761 [2024-07-23 08:30:43.196331] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:30.761 [2024-07-23 08:30:43.196338] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:30.761 [2024-07-23 08:30:43.196349] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:30.761 08:30:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:17:30.761 08:30:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:30.761 08:30:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:30.761 08:30:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:30.761 08:30:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:30.761 08:30:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
00:17:30.761 08:30:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:30.761 08:30:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:30.762 08:30:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:30.762 08:30:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:30.762 08:30:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:30.762 08:30:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:30.762 08:30:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:30.762 08:30:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:31.019 08:30:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:31.019 "name": "Existed_Raid", 00:17:31.019 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:31.019 "strip_size_kb": 64, 00:17:31.019 "state": "configuring", 00:17:31.019 "raid_level": "raid0", 00:17:31.019 "superblock": false, 00:17:31.019 "num_base_bdevs": 4, 00:17:31.019 "num_base_bdevs_discovered": 1, 00:17:31.019 "num_base_bdevs_operational": 4, 00:17:31.019 "base_bdevs_list": [ 00:17:31.019 { 00:17:31.019 "name": "BaseBdev1", 00:17:31.019 "uuid": "8bcccba8-80a5-4dbb-9e3f-962869fa31a3", 00:17:31.019 "is_configured": true, 00:17:31.019 "data_offset": 0, 00:17:31.019 "data_size": 65536 00:17:31.019 }, 00:17:31.019 { 00:17:31.019 "name": "BaseBdev2", 00:17:31.019 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:31.019 "is_configured": false, 00:17:31.019 "data_offset": 0, 00:17:31.019 "data_size": 0 00:17:31.019 }, 00:17:31.019 { 00:17:31.019 "name": "BaseBdev3", 
00:17:31.019 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:31.019 "is_configured": false, 00:17:31.019 "data_offset": 0, 00:17:31.019 "data_size": 0 00:17:31.019 }, 00:17:31.019 { 00:17:31.019 "name": "BaseBdev4", 00:17:31.019 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:31.019 "is_configured": false, 00:17:31.019 "data_offset": 0, 00:17:31.019 "data_size": 0 00:17:31.019 } 00:17:31.019 ] 00:17:31.019 }' 00:17:31.019 08:30:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:31.019 08:30:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:31.585 08:30:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:31.585 [2024-07-23 08:30:44.000439] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:31.585 BaseBdev2 00:17:31.585 08:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:17:31.585 08:30:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:31.585 08:30:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:31.585 08:30:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:31.585 08:30:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:31.585 08:30:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:31.585 08:30:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:31.843 08:30:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:31.843 [ 00:17:31.843 { 00:17:31.843 "name": "BaseBdev2", 00:17:31.843 "aliases": [ 00:17:31.843 "ba7aff0f-910c-46d5-9382-dbcf1f48dc51" 00:17:31.843 ], 00:17:31.843 "product_name": "Malloc disk", 00:17:31.843 "block_size": 512, 00:17:31.843 "num_blocks": 65536, 00:17:31.843 "uuid": "ba7aff0f-910c-46d5-9382-dbcf1f48dc51", 00:17:31.843 "assigned_rate_limits": { 00:17:31.843 "rw_ios_per_sec": 0, 00:17:31.843 "rw_mbytes_per_sec": 0, 00:17:31.843 "r_mbytes_per_sec": 0, 00:17:31.843 "w_mbytes_per_sec": 0 00:17:31.843 }, 00:17:31.843 "claimed": true, 00:17:31.843 "claim_type": "exclusive_write", 00:17:31.843 "zoned": false, 00:17:31.843 "supported_io_types": { 00:17:31.843 "read": true, 00:17:31.843 "write": true, 00:17:31.843 "unmap": true, 00:17:31.843 "flush": true, 00:17:31.843 "reset": true, 00:17:31.843 "nvme_admin": false, 00:17:31.843 "nvme_io": false, 00:17:31.843 "nvme_io_md": false, 00:17:31.843 "write_zeroes": true, 00:17:31.843 "zcopy": true, 00:17:31.843 "get_zone_info": false, 00:17:31.843 "zone_management": false, 00:17:31.843 "zone_append": false, 00:17:31.843 "compare": false, 00:17:31.843 "compare_and_write": false, 00:17:31.843 "abort": true, 00:17:31.843 "seek_hole": false, 00:17:31.843 "seek_data": false, 00:17:31.843 "copy": true, 00:17:31.843 "nvme_iov_md": false 00:17:31.843 }, 00:17:31.843 "memory_domains": [ 00:17:31.843 { 00:17:31.843 "dma_device_id": "system", 00:17:31.843 "dma_device_type": 1 00:17:31.843 }, 00:17:31.843 { 00:17:31.843 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:31.843 "dma_device_type": 2 00:17:31.843 } 00:17:31.843 ], 00:17:31.843 "driver_specific": {} 00:17:31.843 } 00:17:31.843 ] 00:17:31.843 08:30:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:31.843 08:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 
00:17:31.843 08:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:31.843 08:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:31.843 08:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:31.843 08:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:31.843 08:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:31.843 08:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:31.843 08:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:31.843 08:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:31.843 08:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:31.843 08:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:31.843 08:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:31.843 08:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:31.843 08:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:32.101 08:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:32.101 "name": "Existed_Raid", 00:17:32.101 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:32.101 "strip_size_kb": 64, 00:17:32.101 "state": "configuring", 00:17:32.101 "raid_level": "raid0", 00:17:32.101 "superblock": false, 00:17:32.101 "num_base_bdevs": 4, 00:17:32.101 
"num_base_bdevs_discovered": 2, 00:17:32.101 "num_base_bdevs_operational": 4, 00:17:32.101 "base_bdevs_list": [ 00:17:32.101 { 00:17:32.101 "name": "BaseBdev1", 00:17:32.101 "uuid": "8bcccba8-80a5-4dbb-9e3f-962869fa31a3", 00:17:32.101 "is_configured": true, 00:17:32.101 "data_offset": 0, 00:17:32.101 "data_size": 65536 00:17:32.101 }, 00:17:32.101 { 00:17:32.101 "name": "BaseBdev2", 00:17:32.101 "uuid": "ba7aff0f-910c-46d5-9382-dbcf1f48dc51", 00:17:32.101 "is_configured": true, 00:17:32.101 "data_offset": 0, 00:17:32.101 "data_size": 65536 00:17:32.101 }, 00:17:32.101 { 00:17:32.101 "name": "BaseBdev3", 00:17:32.101 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:32.101 "is_configured": false, 00:17:32.101 "data_offset": 0, 00:17:32.101 "data_size": 0 00:17:32.101 }, 00:17:32.101 { 00:17:32.101 "name": "BaseBdev4", 00:17:32.101 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:32.101 "is_configured": false, 00:17:32.101 "data_offset": 0, 00:17:32.101 "data_size": 0 00:17:32.101 } 00:17:32.101 ] 00:17:32.101 }' 00:17:32.101 08:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:32.101 08:30:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:32.666 08:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:32.666 [2024-07-23 08:30:45.181821] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:32.666 BaseBdev3 00:17:32.923 08:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:32.923 08:30:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:32.923 08:30:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:32.923 08:30:45 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:32.923 08:30:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:32.923 08:30:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:32.923 08:30:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:32.923 08:30:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:33.181 [ 00:17:33.181 { 00:17:33.181 "name": "BaseBdev3", 00:17:33.181 "aliases": [ 00:17:33.181 "c52d68f4-2890-4fdd-8cdb-33320caab66b" 00:17:33.181 ], 00:17:33.181 "product_name": "Malloc disk", 00:17:33.181 "block_size": 512, 00:17:33.181 "num_blocks": 65536, 00:17:33.181 "uuid": "c52d68f4-2890-4fdd-8cdb-33320caab66b", 00:17:33.181 "assigned_rate_limits": { 00:17:33.181 "rw_ios_per_sec": 0, 00:17:33.181 "rw_mbytes_per_sec": 0, 00:17:33.181 "r_mbytes_per_sec": 0, 00:17:33.181 "w_mbytes_per_sec": 0 00:17:33.181 }, 00:17:33.181 "claimed": true, 00:17:33.181 "claim_type": "exclusive_write", 00:17:33.181 "zoned": false, 00:17:33.181 "supported_io_types": { 00:17:33.181 "read": true, 00:17:33.181 "write": true, 00:17:33.181 "unmap": true, 00:17:33.181 "flush": true, 00:17:33.181 "reset": true, 00:17:33.181 "nvme_admin": false, 00:17:33.181 "nvme_io": false, 00:17:33.181 "nvme_io_md": false, 00:17:33.181 "write_zeroes": true, 00:17:33.181 "zcopy": true, 00:17:33.181 "get_zone_info": false, 00:17:33.181 "zone_management": false, 00:17:33.181 "zone_append": false, 00:17:33.181 "compare": false, 00:17:33.181 "compare_and_write": false, 00:17:33.181 "abort": true, 00:17:33.181 "seek_hole": false, 00:17:33.181 "seek_data": false, 00:17:33.181 "copy": 
true, 00:17:33.181 "nvme_iov_md": false 00:17:33.181 }, 00:17:33.181 "memory_domains": [ 00:17:33.181 { 00:17:33.181 "dma_device_id": "system", 00:17:33.181 "dma_device_type": 1 00:17:33.181 }, 00:17:33.181 { 00:17:33.181 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:33.181 "dma_device_type": 2 00:17:33.181 } 00:17:33.181 ], 00:17:33.181 "driver_specific": {} 00:17:33.181 } 00:17:33.181 ] 00:17:33.181 08:30:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:33.181 08:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:33.181 08:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:33.181 08:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:33.181 08:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:33.181 08:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:33.181 08:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:33.181 08:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:33.181 08:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:33.181 08:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:33.181 08:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:33.181 08:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:33.181 08:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:33.181 08:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:33.181 08:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:33.439 08:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:33.440 "name": "Existed_Raid", 00:17:33.440 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:33.440 "strip_size_kb": 64, 00:17:33.440 "state": "configuring", 00:17:33.440 "raid_level": "raid0", 00:17:33.440 "superblock": false, 00:17:33.440 "num_base_bdevs": 4, 00:17:33.440 "num_base_bdevs_discovered": 3, 00:17:33.440 "num_base_bdevs_operational": 4, 00:17:33.440 "base_bdevs_list": [ 00:17:33.440 { 00:17:33.440 "name": "BaseBdev1", 00:17:33.440 "uuid": "8bcccba8-80a5-4dbb-9e3f-962869fa31a3", 00:17:33.440 "is_configured": true, 00:17:33.440 "data_offset": 0, 00:17:33.440 "data_size": 65536 00:17:33.440 }, 00:17:33.440 { 00:17:33.440 "name": "BaseBdev2", 00:17:33.440 "uuid": "ba7aff0f-910c-46d5-9382-dbcf1f48dc51", 00:17:33.440 "is_configured": true, 00:17:33.440 "data_offset": 0, 00:17:33.440 "data_size": 65536 00:17:33.440 }, 00:17:33.440 { 00:17:33.440 "name": "BaseBdev3", 00:17:33.440 "uuid": "c52d68f4-2890-4fdd-8cdb-33320caab66b", 00:17:33.440 "is_configured": true, 00:17:33.440 "data_offset": 0, 00:17:33.440 "data_size": 65536 00:17:33.440 }, 00:17:33.440 { 00:17:33.440 "name": "BaseBdev4", 00:17:33.440 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:33.440 "is_configured": false, 00:17:33.440 "data_offset": 0, 00:17:33.440 "data_size": 0 00:17:33.440 } 00:17:33.440 ] 00:17:33.440 }' 00:17:33.440 08:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:33.440 08:30:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:34.005 08:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:34.005 [2024-07-23 08:30:46.414823] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:34.005 [2024-07-23 08:30:46.414864] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000035180 00:17:34.005 [2024-07-23 08:30:46.414873] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:17:34.005 [2024-07-23 08:30:46.415101] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bf90 00:17:34.005 [2024-07-23 08:30:46.415281] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000035180 00:17:34.005 [2024-07-23 08:30:46.415293] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000035180 00:17:34.005 [2024-07-23 08:30:46.415545] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:34.005 BaseBdev4 00:17:34.005 08:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:17:34.005 08:30:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:17:34.005 08:30:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:34.005 08:30:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:34.005 08:30:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:34.005 08:30:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:34.005 08:30:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:34.263 08:30:46 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:34.263 [ 00:17:34.263 { 00:17:34.263 "name": "BaseBdev4", 00:17:34.263 "aliases": [ 00:17:34.263 "720de797-2614-4107-b7bf-bdedc29ce9ad" 00:17:34.263 ], 00:17:34.263 "product_name": "Malloc disk", 00:17:34.263 "block_size": 512, 00:17:34.263 "num_blocks": 65536, 00:17:34.263 "uuid": "720de797-2614-4107-b7bf-bdedc29ce9ad", 00:17:34.263 "assigned_rate_limits": { 00:17:34.263 "rw_ios_per_sec": 0, 00:17:34.263 "rw_mbytes_per_sec": 0, 00:17:34.263 "r_mbytes_per_sec": 0, 00:17:34.263 "w_mbytes_per_sec": 0 00:17:34.263 }, 00:17:34.263 "claimed": true, 00:17:34.263 "claim_type": "exclusive_write", 00:17:34.263 "zoned": false, 00:17:34.263 "supported_io_types": { 00:17:34.263 "read": true, 00:17:34.263 "write": true, 00:17:34.263 "unmap": true, 00:17:34.263 "flush": true, 00:17:34.263 "reset": true, 00:17:34.263 "nvme_admin": false, 00:17:34.263 "nvme_io": false, 00:17:34.263 "nvme_io_md": false, 00:17:34.263 "write_zeroes": true, 00:17:34.263 "zcopy": true, 00:17:34.263 "get_zone_info": false, 00:17:34.263 "zone_management": false, 00:17:34.263 "zone_append": false, 00:17:34.263 "compare": false, 00:17:34.263 "compare_and_write": false, 00:17:34.263 "abort": true, 00:17:34.263 "seek_hole": false, 00:17:34.263 "seek_data": false, 00:17:34.263 "copy": true, 00:17:34.263 "nvme_iov_md": false 00:17:34.263 }, 00:17:34.263 "memory_domains": [ 00:17:34.263 { 00:17:34.263 "dma_device_id": "system", 00:17:34.263 "dma_device_type": 1 00:17:34.263 }, 00:17:34.263 { 00:17:34.263 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:34.263 "dma_device_type": 2 00:17:34.263 } 00:17:34.263 ], 00:17:34.263 "driver_specific": {} 00:17:34.263 } 00:17:34.263 ] 00:17:34.263 08:30:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:34.263 08:30:46 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:34.263 08:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:34.263 08:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:17:34.263 08:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:34.263 08:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:34.263 08:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:34.263 08:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:34.263 08:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:34.263 08:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:34.263 08:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:34.263 08:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:34.263 08:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:34.263 08:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:34.263 08:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:34.521 08:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:34.521 "name": "Existed_Raid", 00:17:34.521 "uuid": "3915f0ef-0cea-4deb-9d41-9a5275faf7ec", 00:17:34.521 "strip_size_kb": 64, 00:17:34.521 "state": "online", 00:17:34.521 "raid_level": "raid0", 00:17:34.521 "superblock": false, 00:17:34.521 
"num_base_bdevs": 4, 00:17:34.521 "num_base_bdevs_discovered": 4, 00:17:34.521 "num_base_bdevs_operational": 4, 00:17:34.521 "base_bdevs_list": [ 00:17:34.521 { 00:17:34.521 "name": "BaseBdev1", 00:17:34.521 "uuid": "8bcccba8-80a5-4dbb-9e3f-962869fa31a3", 00:17:34.521 "is_configured": true, 00:17:34.521 "data_offset": 0, 00:17:34.521 "data_size": 65536 00:17:34.521 }, 00:17:34.521 { 00:17:34.521 "name": "BaseBdev2", 00:17:34.521 "uuid": "ba7aff0f-910c-46d5-9382-dbcf1f48dc51", 00:17:34.521 "is_configured": true, 00:17:34.521 "data_offset": 0, 00:17:34.521 "data_size": 65536 00:17:34.521 }, 00:17:34.521 { 00:17:34.521 "name": "BaseBdev3", 00:17:34.521 "uuid": "c52d68f4-2890-4fdd-8cdb-33320caab66b", 00:17:34.521 "is_configured": true, 00:17:34.521 "data_offset": 0, 00:17:34.521 "data_size": 65536 00:17:34.521 }, 00:17:34.521 { 00:17:34.521 "name": "BaseBdev4", 00:17:34.521 "uuid": "720de797-2614-4107-b7bf-bdedc29ce9ad", 00:17:34.521 "is_configured": true, 00:17:34.521 "data_offset": 0, 00:17:34.521 "data_size": 65536 00:17:34.521 } 00:17:34.521 ] 00:17:34.521 }' 00:17:34.521 08:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:34.521 08:30:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:35.088 08:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:17:35.088 08:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:35.088 08:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:35.088 08:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:35.088 08:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:35.088 08:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:35.088 08:30:47 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:35.088 08:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:35.088 [2024-07-23 08:30:47.534203] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:35.088 08:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:35.088 "name": "Existed_Raid", 00:17:35.088 "aliases": [ 00:17:35.088 "3915f0ef-0cea-4deb-9d41-9a5275faf7ec" 00:17:35.088 ], 00:17:35.088 "product_name": "Raid Volume", 00:17:35.088 "block_size": 512, 00:17:35.088 "num_blocks": 262144, 00:17:35.088 "uuid": "3915f0ef-0cea-4deb-9d41-9a5275faf7ec", 00:17:35.088 "assigned_rate_limits": { 00:17:35.088 "rw_ios_per_sec": 0, 00:17:35.088 "rw_mbytes_per_sec": 0, 00:17:35.088 "r_mbytes_per_sec": 0, 00:17:35.088 "w_mbytes_per_sec": 0 00:17:35.088 }, 00:17:35.088 "claimed": false, 00:17:35.088 "zoned": false, 00:17:35.088 "supported_io_types": { 00:17:35.088 "read": true, 00:17:35.088 "write": true, 00:17:35.088 "unmap": true, 00:17:35.088 "flush": true, 00:17:35.088 "reset": true, 00:17:35.088 "nvme_admin": false, 00:17:35.088 "nvme_io": false, 00:17:35.088 "nvme_io_md": false, 00:17:35.088 "write_zeroes": true, 00:17:35.088 "zcopy": false, 00:17:35.088 "get_zone_info": false, 00:17:35.088 "zone_management": false, 00:17:35.088 "zone_append": false, 00:17:35.088 "compare": false, 00:17:35.088 "compare_and_write": false, 00:17:35.088 "abort": false, 00:17:35.088 "seek_hole": false, 00:17:35.088 "seek_data": false, 00:17:35.088 "copy": false, 00:17:35.088 "nvme_iov_md": false 00:17:35.088 }, 00:17:35.088 "memory_domains": [ 00:17:35.088 { 00:17:35.088 "dma_device_id": "system", 00:17:35.088 "dma_device_type": 1 00:17:35.088 }, 00:17:35.088 { 00:17:35.088 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:35.088 
"dma_device_type": 2 00:17:35.088 }, 00:17:35.088 { 00:17:35.088 "dma_device_id": "system", 00:17:35.088 "dma_device_type": 1 00:17:35.088 }, 00:17:35.088 { 00:17:35.088 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:35.088 "dma_device_type": 2 00:17:35.088 }, 00:17:35.088 { 00:17:35.088 "dma_device_id": "system", 00:17:35.088 "dma_device_type": 1 00:17:35.088 }, 00:17:35.088 { 00:17:35.088 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:35.088 "dma_device_type": 2 00:17:35.088 }, 00:17:35.088 { 00:17:35.088 "dma_device_id": "system", 00:17:35.088 "dma_device_type": 1 00:17:35.088 }, 00:17:35.088 { 00:17:35.089 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:35.089 "dma_device_type": 2 00:17:35.089 } 00:17:35.089 ], 00:17:35.089 "driver_specific": { 00:17:35.089 "raid": { 00:17:35.089 "uuid": "3915f0ef-0cea-4deb-9d41-9a5275faf7ec", 00:17:35.089 "strip_size_kb": 64, 00:17:35.089 "state": "online", 00:17:35.089 "raid_level": "raid0", 00:17:35.089 "superblock": false, 00:17:35.089 "num_base_bdevs": 4, 00:17:35.089 "num_base_bdevs_discovered": 4, 00:17:35.089 "num_base_bdevs_operational": 4, 00:17:35.089 "base_bdevs_list": [ 00:17:35.089 { 00:17:35.089 "name": "BaseBdev1", 00:17:35.089 "uuid": "8bcccba8-80a5-4dbb-9e3f-962869fa31a3", 00:17:35.089 "is_configured": true, 00:17:35.089 "data_offset": 0, 00:17:35.089 "data_size": 65536 00:17:35.089 }, 00:17:35.089 { 00:17:35.089 "name": "BaseBdev2", 00:17:35.089 "uuid": "ba7aff0f-910c-46d5-9382-dbcf1f48dc51", 00:17:35.089 "is_configured": true, 00:17:35.089 "data_offset": 0, 00:17:35.089 "data_size": 65536 00:17:35.089 }, 00:17:35.089 { 00:17:35.089 "name": "BaseBdev3", 00:17:35.089 "uuid": "c52d68f4-2890-4fdd-8cdb-33320caab66b", 00:17:35.089 "is_configured": true, 00:17:35.089 "data_offset": 0, 00:17:35.089 "data_size": 65536 00:17:35.089 }, 00:17:35.089 { 00:17:35.089 "name": "BaseBdev4", 00:17:35.089 "uuid": "720de797-2614-4107-b7bf-bdedc29ce9ad", 00:17:35.089 "is_configured": true, 00:17:35.089 "data_offset": 0, 
00:17:35.089 "data_size": 65536 00:17:35.089 } 00:17:35.089 ] 00:17:35.089 } 00:17:35.089 } 00:17:35.089 }' 00:17:35.089 08:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:35.089 08:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:35.089 BaseBdev2 00:17:35.089 BaseBdev3 00:17:35.089 BaseBdev4' 00:17:35.089 08:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:35.089 08:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:35.089 08:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:35.347 08:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:35.347 "name": "BaseBdev1", 00:17:35.347 "aliases": [ 00:17:35.347 "8bcccba8-80a5-4dbb-9e3f-962869fa31a3" 00:17:35.347 ], 00:17:35.347 "product_name": "Malloc disk", 00:17:35.347 "block_size": 512, 00:17:35.347 "num_blocks": 65536, 00:17:35.347 "uuid": "8bcccba8-80a5-4dbb-9e3f-962869fa31a3", 00:17:35.347 "assigned_rate_limits": { 00:17:35.347 "rw_ios_per_sec": 0, 00:17:35.347 "rw_mbytes_per_sec": 0, 00:17:35.347 "r_mbytes_per_sec": 0, 00:17:35.347 "w_mbytes_per_sec": 0 00:17:35.347 }, 00:17:35.347 "claimed": true, 00:17:35.347 "claim_type": "exclusive_write", 00:17:35.347 "zoned": false, 00:17:35.347 "supported_io_types": { 00:17:35.347 "read": true, 00:17:35.347 "write": true, 00:17:35.347 "unmap": true, 00:17:35.347 "flush": true, 00:17:35.347 "reset": true, 00:17:35.347 "nvme_admin": false, 00:17:35.347 "nvme_io": false, 00:17:35.347 "nvme_io_md": false, 00:17:35.347 "write_zeroes": true, 00:17:35.347 "zcopy": true, 00:17:35.347 "get_zone_info": false, 00:17:35.347 "zone_management": 
false, 00:17:35.347 "zone_append": false, 00:17:35.347 "compare": false, 00:17:35.347 "compare_and_write": false, 00:17:35.347 "abort": true, 00:17:35.347 "seek_hole": false, 00:17:35.347 "seek_data": false, 00:17:35.347 "copy": true, 00:17:35.347 "nvme_iov_md": false 00:17:35.347 }, 00:17:35.347 "memory_domains": [ 00:17:35.347 { 00:17:35.347 "dma_device_id": "system", 00:17:35.347 "dma_device_type": 1 00:17:35.347 }, 00:17:35.347 { 00:17:35.347 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:35.347 "dma_device_type": 2 00:17:35.347 } 00:17:35.347 ], 00:17:35.347 "driver_specific": {} 00:17:35.347 }' 00:17:35.347 08:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:35.347 08:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:35.347 08:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:35.347 08:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:35.605 08:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:35.605 08:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:35.605 08:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:35.605 08:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:35.605 08:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:35.605 08:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:35.606 08:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:35.606 08:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:35.606 08:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:35.606 08:30:48 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:35.606 08:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:35.863 08:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:35.863 "name": "BaseBdev2", 00:17:35.863 "aliases": [ 00:17:35.863 "ba7aff0f-910c-46d5-9382-dbcf1f48dc51" 00:17:35.863 ], 00:17:35.863 "product_name": "Malloc disk", 00:17:35.863 "block_size": 512, 00:17:35.863 "num_blocks": 65536, 00:17:35.863 "uuid": "ba7aff0f-910c-46d5-9382-dbcf1f48dc51", 00:17:35.863 "assigned_rate_limits": { 00:17:35.863 "rw_ios_per_sec": 0, 00:17:35.863 "rw_mbytes_per_sec": 0, 00:17:35.863 "r_mbytes_per_sec": 0, 00:17:35.863 "w_mbytes_per_sec": 0 00:17:35.863 }, 00:17:35.863 "claimed": true, 00:17:35.863 "claim_type": "exclusive_write", 00:17:35.863 "zoned": false, 00:17:35.863 "supported_io_types": { 00:17:35.863 "read": true, 00:17:35.863 "write": true, 00:17:35.863 "unmap": true, 00:17:35.863 "flush": true, 00:17:35.863 "reset": true, 00:17:35.863 "nvme_admin": false, 00:17:35.863 "nvme_io": false, 00:17:35.863 "nvme_io_md": false, 00:17:35.863 "write_zeroes": true, 00:17:35.863 "zcopy": true, 00:17:35.863 "get_zone_info": false, 00:17:35.863 "zone_management": false, 00:17:35.863 "zone_append": false, 00:17:35.863 "compare": false, 00:17:35.863 "compare_and_write": false, 00:17:35.863 "abort": true, 00:17:35.863 "seek_hole": false, 00:17:35.863 "seek_data": false, 00:17:35.863 "copy": true, 00:17:35.863 "nvme_iov_md": false 00:17:35.863 }, 00:17:35.863 "memory_domains": [ 00:17:35.863 { 00:17:35.863 "dma_device_id": "system", 00:17:35.863 "dma_device_type": 1 00:17:35.863 }, 00:17:35.863 { 00:17:35.863 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:35.863 "dma_device_type": 2 00:17:35.863 } 00:17:35.863 ], 00:17:35.863 "driver_specific": {} 00:17:35.863 
}' 00:17:35.863 08:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:35.863 08:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:35.863 08:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:35.863 08:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:35.863 08:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:36.121 08:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:36.121 08:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:36.121 08:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:36.121 08:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:36.121 08:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:36.121 08:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:36.121 08:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:36.121 08:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:36.121 08:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:36.121 08:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:36.379 08:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:36.379 "name": "BaseBdev3", 00:17:36.379 "aliases": [ 00:17:36.379 "c52d68f4-2890-4fdd-8cdb-33320caab66b" 00:17:36.379 ], 00:17:36.379 "product_name": "Malloc disk", 00:17:36.379 "block_size": 512, 00:17:36.379 "num_blocks": 65536, 
00:17:36.379 "uuid": "c52d68f4-2890-4fdd-8cdb-33320caab66b", 00:17:36.379 "assigned_rate_limits": { 00:17:36.379 "rw_ios_per_sec": 0, 00:17:36.379 "rw_mbytes_per_sec": 0, 00:17:36.379 "r_mbytes_per_sec": 0, 00:17:36.379 "w_mbytes_per_sec": 0 00:17:36.379 }, 00:17:36.379 "claimed": true, 00:17:36.379 "claim_type": "exclusive_write", 00:17:36.379 "zoned": false, 00:17:36.379 "supported_io_types": { 00:17:36.379 "read": true, 00:17:36.379 "write": true, 00:17:36.379 "unmap": true, 00:17:36.379 "flush": true, 00:17:36.379 "reset": true, 00:17:36.379 "nvme_admin": false, 00:17:36.379 "nvme_io": false, 00:17:36.379 "nvme_io_md": false, 00:17:36.379 "write_zeroes": true, 00:17:36.379 "zcopy": true, 00:17:36.379 "get_zone_info": false, 00:17:36.379 "zone_management": false, 00:17:36.379 "zone_append": false, 00:17:36.379 "compare": false, 00:17:36.379 "compare_and_write": false, 00:17:36.379 "abort": true, 00:17:36.379 "seek_hole": false, 00:17:36.379 "seek_data": false, 00:17:36.379 "copy": true, 00:17:36.379 "nvme_iov_md": false 00:17:36.379 }, 00:17:36.379 "memory_domains": [ 00:17:36.379 { 00:17:36.379 "dma_device_id": "system", 00:17:36.379 "dma_device_type": 1 00:17:36.379 }, 00:17:36.379 { 00:17:36.379 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:36.379 "dma_device_type": 2 00:17:36.379 } 00:17:36.379 ], 00:17:36.379 "driver_specific": {} 00:17:36.379 }' 00:17:36.379 08:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:36.379 08:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:36.379 08:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:36.379 08:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:36.380 08:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:36.637 08:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:17:36.638 08:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:36.638 08:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:36.638 08:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:36.638 08:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:36.638 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:36.638 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:36.638 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:36.638 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:36.638 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:36.927 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:36.928 "name": "BaseBdev4", 00:17:36.928 "aliases": [ 00:17:36.928 "720de797-2614-4107-b7bf-bdedc29ce9ad" 00:17:36.928 ], 00:17:36.928 "product_name": "Malloc disk", 00:17:36.928 "block_size": 512, 00:17:36.928 "num_blocks": 65536, 00:17:36.928 "uuid": "720de797-2614-4107-b7bf-bdedc29ce9ad", 00:17:36.928 "assigned_rate_limits": { 00:17:36.928 "rw_ios_per_sec": 0, 00:17:36.928 "rw_mbytes_per_sec": 0, 00:17:36.928 "r_mbytes_per_sec": 0, 00:17:36.928 "w_mbytes_per_sec": 0 00:17:36.928 }, 00:17:36.928 "claimed": true, 00:17:36.928 "claim_type": "exclusive_write", 00:17:36.928 "zoned": false, 00:17:36.928 "supported_io_types": { 00:17:36.928 "read": true, 00:17:36.928 "write": true, 00:17:36.928 "unmap": true, 00:17:36.928 "flush": true, 00:17:36.928 "reset": true, 00:17:36.928 "nvme_admin": false, 00:17:36.928 "nvme_io": false, 00:17:36.928 
"nvme_io_md": false, 00:17:36.928 "write_zeroes": true, 00:17:36.928 "zcopy": true, 00:17:36.928 "get_zone_info": false, 00:17:36.928 "zone_management": false, 00:17:36.928 "zone_append": false, 00:17:36.928 "compare": false, 00:17:36.928 "compare_and_write": false, 00:17:36.928 "abort": true, 00:17:36.928 "seek_hole": false, 00:17:36.928 "seek_data": false, 00:17:36.928 "copy": true, 00:17:36.928 "nvme_iov_md": false 00:17:36.928 }, 00:17:36.928 "memory_domains": [ 00:17:36.928 { 00:17:36.928 "dma_device_id": "system", 00:17:36.928 "dma_device_type": 1 00:17:36.928 }, 00:17:36.928 { 00:17:36.928 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:36.928 "dma_device_type": 2 00:17:36.928 } 00:17:36.928 ], 00:17:36.928 "driver_specific": {} 00:17:36.928 }' 00:17:36.928 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:36.928 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:36.928 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:36.928 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:36.928 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:36.928 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:36.928 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:37.190 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:37.190 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:37.190 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:37.190 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:37.190 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 
00:17:37.190 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:37.449 [2024-07-23 08:30:49.711709] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:37.449 [2024-07-23 08:30:49.711738] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:37.449 [2024-07-23 08:30:49.711787] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:37.449 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:17:37.449 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:17:37.449 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:37.449 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:37.449 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:17:37.449 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:17:37.449 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:37.449 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:17:37.449 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:37.449 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:37.449 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:37.449 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:37.449 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:17:37.449 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:37.449 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:37.449 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:37.449 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:37.449 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:37.449 "name": "Existed_Raid", 00:17:37.449 "uuid": "3915f0ef-0cea-4deb-9d41-9a5275faf7ec", 00:17:37.449 "strip_size_kb": 64, 00:17:37.449 "state": "offline", 00:17:37.449 "raid_level": "raid0", 00:17:37.449 "superblock": false, 00:17:37.449 "num_base_bdevs": 4, 00:17:37.449 "num_base_bdevs_discovered": 3, 00:17:37.449 "num_base_bdevs_operational": 3, 00:17:37.449 "base_bdevs_list": [ 00:17:37.449 { 00:17:37.449 "name": null, 00:17:37.449 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:37.449 "is_configured": false, 00:17:37.449 "data_offset": 0, 00:17:37.449 "data_size": 65536 00:17:37.449 }, 00:17:37.449 { 00:17:37.449 "name": "BaseBdev2", 00:17:37.449 "uuid": "ba7aff0f-910c-46d5-9382-dbcf1f48dc51", 00:17:37.449 "is_configured": true, 00:17:37.449 "data_offset": 0, 00:17:37.449 "data_size": 65536 00:17:37.449 }, 00:17:37.449 { 00:17:37.449 "name": "BaseBdev3", 00:17:37.449 "uuid": "c52d68f4-2890-4fdd-8cdb-33320caab66b", 00:17:37.449 "is_configured": true, 00:17:37.449 "data_offset": 0, 00:17:37.449 "data_size": 65536 00:17:37.449 }, 00:17:37.449 { 00:17:37.449 "name": "BaseBdev4", 00:17:37.449 "uuid": "720de797-2614-4107-b7bf-bdedc29ce9ad", 00:17:37.449 "is_configured": true, 00:17:37.449 "data_offset": 0, 00:17:37.449 "data_size": 65536 00:17:37.449 } 00:17:37.449 ] 00:17:37.449 }' 
00:17:37.449 08:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:37.449 08:30:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:38.016 08:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:17:38.016 08:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:38.016 08:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:38.016 08:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:38.274 08:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:38.274 08:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:38.274 08:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:38.274 [2024-07-23 08:30:50.767279] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:38.533 08:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:38.533 08:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:38.533 08:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:38.533 08:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:38.533 08:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:38.533 08:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' 
Existed_Raid '!=' Existed_Raid ']' 00:17:38.533 08:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:38.792 [2024-07-23 08:30:51.206324] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:39.050 08:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:39.050 08:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:39.050 08:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:39.050 08:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:39.050 08:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:39.050 08:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:39.050 08:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:17:39.309 [2024-07-23 08:30:51.674072] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:17:39.309 [2024-07-23 08:30:51.674125] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000035180 name Existed_Raid, state offline 00:17:39.309 08:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:39.309 08:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:39.309 08:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:39.309 
08:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:17:39.568 08:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:17:39.568 08:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:17:39.568 08:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:17:39.568 08:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:17:39.568 08:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:39.568 08:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:39.826 BaseBdev2 00:17:39.826 08:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:17:39.826 08:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:39.826 08:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:39.826 08:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:39.826 08:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:39.826 08:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:39.826 08:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:39.826 08:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:40.085 [ 00:17:40.085 { 00:17:40.085 
"name": "BaseBdev2", 00:17:40.085 "aliases": [ 00:17:40.085 "89c8c0be-d213-4240-9000-8b1c68235e14" 00:17:40.085 ], 00:17:40.085 "product_name": "Malloc disk", 00:17:40.085 "block_size": 512, 00:17:40.085 "num_blocks": 65536, 00:17:40.085 "uuid": "89c8c0be-d213-4240-9000-8b1c68235e14", 00:17:40.085 "assigned_rate_limits": { 00:17:40.085 "rw_ios_per_sec": 0, 00:17:40.085 "rw_mbytes_per_sec": 0, 00:17:40.085 "r_mbytes_per_sec": 0, 00:17:40.085 "w_mbytes_per_sec": 0 00:17:40.085 }, 00:17:40.085 "claimed": false, 00:17:40.085 "zoned": false, 00:17:40.085 "supported_io_types": { 00:17:40.085 "read": true, 00:17:40.085 "write": true, 00:17:40.085 "unmap": true, 00:17:40.085 "flush": true, 00:17:40.085 "reset": true, 00:17:40.085 "nvme_admin": false, 00:17:40.085 "nvme_io": false, 00:17:40.085 "nvme_io_md": false, 00:17:40.085 "write_zeroes": true, 00:17:40.085 "zcopy": true, 00:17:40.085 "get_zone_info": false, 00:17:40.085 "zone_management": false, 00:17:40.085 "zone_append": false, 00:17:40.085 "compare": false, 00:17:40.085 "compare_and_write": false, 00:17:40.085 "abort": true, 00:17:40.085 "seek_hole": false, 00:17:40.085 "seek_data": false, 00:17:40.085 "copy": true, 00:17:40.085 "nvme_iov_md": false 00:17:40.085 }, 00:17:40.085 "memory_domains": [ 00:17:40.085 { 00:17:40.085 "dma_device_id": "system", 00:17:40.085 "dma_device_type": 1 00:17:40.085 }, 00:17:40.085 { 00:17:40.085 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:40.085 "dma_device_type": 2 00:17:40.085 } 00:17:40.085 ], 00:17:40.085 "driver_specific": {} 00:17:40.085 } 00:17:40.085 ] 00:17:40.085 08:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:40.085 08:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:40.085 08:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:40.085 08:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:40.343 BaseBdev3 00:17:40.343 08:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:17:40.343 08:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:40.343 08:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:40.343 08:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:40.343 08:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:40.343 08:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:40.344 08:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:40.344 08:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:40.601 [ 00:17:40.601 { 00:17:40.601 "name": "BaseBdev3", 00:17:40.601 "aliases": [ 00:17:40.601 "b890f75f-90e2-4f2a-8907-b2b185da2eb6" 00:17:40.601 ], 00:17:40.601 "product_name": "Malloc disk", 00:17:40.601 "block_size": 512, 00:17:40.601 "num_blocks": 65536, 00:17:40.601 "uuid": "b890f75f-90e2-4f2a-8907-b2b185da2eb6", 00:17:40.601 "assigned_rate_limits": { 00:17:40.601 "rw_ios_per_sec": 0, 00:17:40.601 "rw_mbytes_per_sec": 0, 00:17:40.601 "r_mbytes_per_sec": 0, 00:17:40.601 "w_mbytes_per_sec": 0 00:17:40.601 }, 00:17:40.601 "claimed": false, 00:17:40.601 "zoned": false, 00:17:40.601 "supported_io_types": { 00:17:40.601 "read": true, 00:17:40.601 "write": true, 00:17:40.601 "unmap": true, 00:17:40.601 "flush": true, 00:17:40.601 
"reset": true, 00:17:40.601 "nvme_admin": false, 00:17:40.601 "nvme_io": false, 00:17:40.601 "nvme_io_md": false, 00:17:40.601 "write_zeroes": true, 00:17:40.601 "zcopy": true, 00:17:40.601 "get_zone_info": false, 00:17:40.601 "zone_management": false, 00:17:40.601 "zone_append": false, 00:17:40.601 "compare": false, 00:17:40.601 "compare_and_write": false, 00:17:40.601 "abort": true, 00:17:40.601 "seek_hole": false, 00:17:40.601 "seek_data": false, 00:17:40.601 "copy": true, 00:17:40.601 "nvme_iov_md": false 00:17:40.601 }, 00:17:40.601 "memory_domains": [ 00:17:40.601 { 00:17:40.601 "dma_device_id": "system", 00:17:40.601 "dma_device_type": 1 00:17:40.601 }, 00:17:40.601 { 00:17:40.601 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:40.601 "dma_device_type": 2 00:17:40.601 } 00:17:40.601 ], 00:17:40.601 "driver_specific": {} 00:17:40.601 } 00:17:40.601 ] 00:17:40.601 08:30:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:40.601 08:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:40.601 08:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:40.601 08:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:40.859 BaseBdev4 00:17:40.859 08:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:17:40.859 08:30:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:17:40.859 08:30:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:40.859 08:30:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:40.859 08:30:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:40.859 08:30:53 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:40.859 08:30:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:41.117 08:30:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:41.117 [ 00:17:41.117 { 00:17:41.117 "name": "BaseBdev4", 00:17:41.117 "aliases": [ 00:17:41.117 "b2d30b95-2895-469d-a6cd-abe7ccf074d7" 00:17:41.117 ], 00:17:41.117 "product_name": "Malloc disk", 00:17:41.117 "block_size": 512, 00:17:41.117 "num_blocks": 65536, 00:17:41.117 "uuid": "b2d30b95-2895-469d-a6cd-abe7ccf074d7", 00:17:41.117 "assigned_rate_limits": { 00:17:41.117 "rw_ios_per_sec": 0, 00:17:41.117 "rw_mbytes_per_sec": 0, 00:17:41.117 "r_mbytes_per_sec": 0, 00:17:41.117 "w_mbytes_per_sec": 0 00:17:41.117 }, 00:17:41.117 "claimed": false, 00:17:41.117 "zoned": false, 00:17:41.117 "supported_io_types": { 00:17:41.117 "read": true, 00:17:41.117 "write": true, 00:17:41.117 "unmap": true, 00:17:41.117 "flush": true, 00:17:41.117 "reset": true, 00:17:41.117 "nvme_admin": false, 00:17:41.117 "nvme_io": false, 00:17:41.117 "nvme_io_md": false, 00:17:41.117 "write_zeroes": true, 00:17:41.117 "zcopy": true, 00:17:41.117 "get_zone_info": false, 00:17:41.117 "zone_management": false, 00:17:41.117 "zone_append": false, 00:17:41.117 "compare": false, 00:17:41.117 "compare_and_write": false, 00:17:41.117 "abort": true, 00:17:41.117 "seek_hole": false, 00:17:41.117 "seek_data": false, 00:17:41.117 "copy": true, 00:17:41.117 "nvme_iov_md": false 00:17:41.117 }, 00:17:41.117 "memory_domains": [ 00:17:41.117 { 00:17:41.117 "dma_device_id": "system", 00:17:41.117 "dma_device_type": 1 00:17:41.117 }, 00:17:41.117 { 00:17:41.117 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:17:41.117 "dma_device_type": 2 00:17:41.117 } 00:17:41.118 ], 00:17:41.118 "driver_specific": {} 00:17:41.118 } 00:17:41.118 ] 00:17:41.118 08:30:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:41.118 08:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:41.118 08:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:41.118 08:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:41.376 [2024-07-23 08:30:53.717892] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:41.376 [2024-07-23 08:30:53.717930] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:41.376 [2024-07-23 08:30:53.717970] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:41.376 [2024-07-23 08:30:53.719531] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:41.376 [2024-07-23 08:30:53.719577] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:41.376 08:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:41.376 08:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:41.376 08:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:41.376 08:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:41.376 08:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:41.376 
08:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:41.376 08:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:41.376 08:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:41.376 08:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:41.376 08:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:41.376 08:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:41.376 08:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:41.634 08:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:41.634 "name": "Existed_Raid", 00:17:41.634 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:41.634 "strip_size_kb": 64, 00:17:41.634 "state": "configuring", 00:17:41.634 "raid_level": "raid0", 00:17:41.634 "superblock": false, 00:17:41.634 "num_base_bdevs": 4, 00:17:41.634 "num_base_bdevs_discovered": 3, 00:17:41.634 "num_base_bdevs_operational": 4, 00:17:41.634 "base_bdevs_list": [ 00:17:41.634 { 00:17:41.634 "name": "BaseBdev1", 00:17:41.634 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:41.634 "is_configured": false, 00:17:41.634 "data_offset": 0, 00:17:41.634 "data_size": 0 00:17:41.634 }, 00:17:41.634 { 00:17:41.634 "name": "BaseBdev2", 00:17:41.634 "uuid": "89c8c0be-d213-4240-9000-8b1c68235e14", 00:17:41.634 "is_configured": true, 00:17:41.634 "data_offset": 0, 00:17:41.634 "data_size": 65536 00:17:41.634 }, 00:17:41.634 { 00:17:41.634 "name": "BaseBdev3", 00:17:41.634 "uuid": "b890f75f-90e2-4f2a-8907-b2b185da2eb6", 00:17:41.634 "is_configured": true, 00:17:41.634 "data_offset": 
0, 00:17:41.634 "data_size": 65536 00:17:41.634 }, 00:17:41.634 { 00:17:41.634 "name": "BaseBdev4", 00:17:41.634 "uuid": "b2d30b95-2895-469d-a6cd-abe7ccf074d7", 00:17:41.634 "is_configured": true, 00:17:41.634 "data_offset": 0, 00:17:41.634 "data_size": 65536 00:17:41.634 } 00:17:41.634 ] 00:17:41.634 }' 00:17:41.634 08:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:41.634 08:30:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:41.891 08:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:17:42.150 [2024-07-23 08:30:54.503957] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:42.150 08:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:42.150 08:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:42.150 08:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:42.150 08:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:42.150 08:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:42.150 08:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:42.150 08:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:42.150 08:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:42.150 08:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:42.150 08:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:17:42.150 08:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:42.150 08:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:42.408 08:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:42.408 "name": "Existed_Raid", 00:17:42.408 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:42.408 "strip_size_kb": 64, 00:17:42.408 "state": "configuring", 00:17:42.408 "raid_level": "raid0", 00:17:42.408 "superblock": false, 00:17:42.408 "num_base_bdevs": 4, 00:17:42.408 "num_base_bdevs_discovered": 2, 00:17:42.408 "num_base_bdevs_operational": 4, 00:17:42.408 "base_bdevs_list": [ 00:17:42.408 { 00:17:42.408 "name": "BaseBdev1", 00:17:42.408 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:42.408 "is_configured": false, 00:17:42.408 "data_offset": 0, 00:17:42.408 "data_size": 0 00:17:42.408 }, 00:17:42.408 { 00:17:42.408 "name": null, 00:17:42.408 "uuid": "89c8c0be-d213-4240-9000-8b1c68235e14", 00:17:42.408 "is_configured": false, 00:17:42.408 "data_offset": 0, 00:17:42.408 "data_size": 65536 00:17:42.408 }, 00:17:42.408 { 00:17:42.408 "name": "BaseBdev3", 00:17:42.408 "uuid": "b890f75f-90e2-4f2a-8907-b2b185da2eb6", 00:17:42.408 "is_configured": true, 00:17:42.408 "data_offset": 0, 00:17:42.408 "data_size": 65536 00:17:42.408 }, 00:17:42.408 { 00:17:42.408 "name": "BaseBdev4", 00:17:42.408 "uuid": "b2d30b95-2895-469d-a6cd-abe7ccf074d7", 00:17:42.408 "is_configured": true, 00:17:42.408 "data_offset": 0, 00:17:42.408 "data_size": 65536 00:17:42.408 } 00:17:42.408 ] 00:17:42.408 }' 00:17:42.408 08:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:42.408 08:30:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:42.666 08:30:55 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:42.666 08:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:42.924 08:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:17:42.925 08:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:43.183 [2024-07-23 08:30:55.527564] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:43.183 BaseBdev1 00:17:43.183 08:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:17:43.183 08:30:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:43.183 08:30:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:43.183 08:30:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:43.183 08:30:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:43.183 08:30:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:43.183 08:30:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:43.442 08:30:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:43.442 [ 00:17:43.442 { 00:17:43.442 "name": "BaseBdev1", 00:17:43.442 "aliases": [ 00:17:43.442 
"caa684d0-e39c-4c46-b314-461bbe38bb86" 00:17:43.442 ], 00:17:43.442 "product_name": "Malloc disk", 00:17:43.442 "block_size": 512, 00:17:43.442 "num_blocks": 65536, 00:17:43.442 "uuid": "caa684d0-e39c-4c46-b314-461bbe38bb86", 00:17:43.442 "assigned_rate_limits": { 00:17:43.442 "rw_ios_per_sec": 0, 00:17:43.442 "rw_mbytes_per_sec": 0, 00:17:43.442 "r_mbytes_per_sec": 0, 00:17:43.442 "w_mbytes_per_sec": 0 00:17:43.442 }, 00:17:43.442 "claimed": true, 00:17:43.442 "claim_type": "exclusive_write", 00:17:43.442 "zoned": false, 00:17:43.442 "supported_io_types": { 00:17:43.442 "read": true, 00:17:43.442 "write": true, 00:17:43.442 "unmap": true, 00:17:43.442 "flush": true, 00:17:43.442 "reset": true, 00:17:43.442 "nvme_admin": false, 00:17:43.442 "nvme_io": false, 00:17:43.442 "nvme_io_md": false, 00:17:43.442 "write_zeroes": true, 00:17:43.442 "zcopy": true, 00:17:43.442 "get_zone_info": false, 00:17:43.442 "zone_management": false, 00:17:43.442 "zone_append": false, 00:17:43.442 "compare": false, 00:17:43.442 "compare_and_write": false, 00:17:43.442 "abort": true, 00:17:43.442 "seek_hole": false, 00:17:43.442 "seek_data": false, 00:17:43.442 "copy": true, 00:17:43.442 "nvme_iov_md": false 00:17:43.442 }, 00:17:43.442 "memory_domains": [ 00:17:43.442 { 00:17:43.442 "dma_device_id": "system", 00:17:43.442 "dma_device_type": 1 00:17:43.442 }, 00:17:43.442 { 00:17:43.442 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:43.442 "dma_device_type": 2 00:17:43.442 } 00:17:43.442 ], 00:17:43.442 "driver_specific": {} 00:17:43.442 } 00:17:43.442 ] 00:17:43.442 08:30:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:43.442 08:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:43.442 08:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:43.442 08:30:55 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:43.442 08:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:43.442 08:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:43.442 08:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:43.442 08:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:43.442 08:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:43.442 08:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:43.442 08:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:43.442 08:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:43.442 08:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:43.700 08:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:43.700 "name": "Existed_Raid", 00:17:43.700 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:43.700 "strip_size_kb": 64, 00:17:43.700 "state": "configuring", 00:17:43.700 "raid_level": "raid0", 00:17:43.700 "superblock": false, 00:17:43.700 "num_base_bdevs": 4, 00:17:43.700 "num_base_bdevs_discovered": 3, 00:17:43.700 "num_base_bdevs_operational": 4, 00:17:43.700 "base_bdevs_list": [ 00:17:43.700 { 00:17:43.700 "name": "BaseBdev1", 00:17:43.700 "uuid": "caa684d0-e39c-4c46-b314-461bbe38bb86", 00:17:43.700 "is_configured": true, 00:17:43.700 "data_offset": 0, 00:17:43.700 "data_size": 65536 00:17:43.700 }, 00:17:43.700 { 00:17:43.700 "name": null, 00:17:43.700 "uuid": "89c8c0be-d213-4240-9000-8b1c68235e14", 
00:17:43.700 "is_configured": false, 00:17:43.700 "data_offset": 0, 00:17:43.700 "data_size": 65536 00:17:43.700 }, 00:17:43.700 { 00:17:43.700 "name": "BaseBdev3", 00:17:43.700 "uuid": "b890f75f-90e2-4f2a-8907-b2b185da2eb6", 00:17:43.701 "is_configured": true, 00:17:43.701 "data_offset": 0, 00:17:43.701 "data_size": 65536 00:17:43.701 }, 00:17:43.701 { 00:17:43.701 "name": "BaseBdev4", 00:17:43.701 "uuid": "b2d30b95-2895-469d-a6cd-abe7ccf074d7", 00:17:43.701 "is_configured": true, 00:17:43.701 "data_offset": 0, 00:17:43.701 "data_size": 65536 00:17:43.701 } 00:17:43.701 ] 00:17:43.701 }' 00:17:43.701 08:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:43.701 08:30:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:44.267 08:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:44.267 08:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:44.267 08:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:17:44.267 08:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:44.526 [2024-07-23 08:30:56.803105] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:44.526 08:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:44.526 08:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:44.526 08:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:44.526 08:30:56 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:44.526 08:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:44.526 08:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:44.526 08:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:44.526 08:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:44.526 08:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:44.526 08:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:44.526 08:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:44.526 08:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:44.526 08:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:44.526 "name": "Existed_Raid", 00:17:44.526 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:44.526 "strip_size_kb": 64, 00:17:44.526 "state": "configuring", 00:17:44.526 "raid_level": "raid0", 00:17:44.526 "superblock": false, 00:17:44.526 "num_base_bdevs": 4, 00:17:44.526 "num_base_bdevs_discovered": 2, 00:17:44.526 "num_base_bdevs_operational": 4, 00:17:44.526 "base_bdevs_list": [ 00:17:44.526 { 00:17:44.526 "name": "BaseBdev1", 00:17:44.526 "uuid": "caa684d0-e39c-4c46-b314-461bbe38bb86", 00:17:44.526 "is_configured": true, 00:17:44.526 "data_offset": 0, 00:17:44.526 "data_size": 65536 00:17:44.526 }, 00:17:44.526 { 00:17:44.526 "name": null, 00:17:44.526 "uuid": "89c8c0be-d213-4240-9000-8b1c68235e14", 00:17:44.526 "is_configured": false, 00:17:44.526 "data_offset": 0, 00:17:44.526 
"data_size": 65536 00:17:44.526 }, 00:17:44.526 { 00:17:44.526 "name": null, 00:17:44.526 "uuid": "b890f75f-90e2-4f2a-8907-b2b185da2eb6", 00:17:44.526 "is_configured": false, 00:17:44.526 "data_offset": 0, 00:17:44.526 "data_size": 65536 00:17:44.526 }, 00:17:44.526 { 00:17:44.526 "name": "BaseBdev4", 00:17:44.526 "uuid": "b2d30b95-2895-469d-a6cd-abe7ccf074d7", 00:17:44.526 "is_configured": true, 00:17:44.526 "data_offset": 0, 00:17:44.526 "data_size": 65536 00:17:44.526 } 00:17:44.526 ] 00:17:44.526 }' 00:17:44.526 08:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:44.526 08:30:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:45.093 08:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:45.093 08:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:45.351 08:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:17:45.351 08:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:45.351 [2024-07-23 08:30:57.789767] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:45.351 08:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:45.351 08:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:45.351 08:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:45.351 08:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid0 00:17:45.351 08:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:45.351 08:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:45.351 08:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:45.351 08:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:45.351 08:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:45.351 08:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:45.351 08:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:45.351 08:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:45.609 08:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:45.609 "name": "Existed_Raid", 00:17:45.609 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:45.609 "strip_size_kb": 64, 00:17:45.609 "state": "configuring", 00:17:45.609 "raid_level": "raid0", 00:17:45.609 "superblock": false, 00:17:45.609 "num_base_bdevs": 4, 00:17:45.609 "num_base_bdevs_discovered": 3, 00:17:45.609 "num_base_bdevs_operational": 4, 00:17:45.609 "base_bdevs_list": [ 00:17:45.609 { 00:17:45.609 "name": "BaseBdev1", 00:17:45.609 "uuid": "caa684d0-e39c-4c46-b314-461bbe38bb86", 00:17:45.609 "is_configured": true, 00:17:45.609 "data_offset": 0, 00:17:45.609 "data_size": 65536 00:17:45.609 }, 00:17:45.609 { 00:17:45.609 "name": null, 00:17:45.609 "uuid": "89c8c0be-d213-4240-9000-8b1c68235e14", 00:17:45.609 "is_configured": false, 00:17:45.609 "data_offset": 0, 00:17:45.609 "data_size": 65536 00:17:45.609 }, 00:17:45.609 { 00:17:45.609 "name": 
"BaseBdev3", 00:17:45.609 "uuid": "b890f75f-90e2-4f2a-8907-b2b185da2eb6", 00:17:45.609 "is_configured": true, 00:17:45.609 "data_offset": 0, 00:17:45.609 "data_size": 65536 00:17:45.609 }, 00:17:45.609 { 00:17:45.609 "name": "BaseBdev4", 00:17:45.609 "uuid": "b2d30b95-2895-469d-a6cd-abe7ccf074d7", 00:17:45.609 "is_configured": true, 00:17:45.609 "data_offset": 0, 00:17:45.609 "data_size": 65536 00:17:45.609 } 00:17:45.609 ] 00:17:45.609 }' 00:17:45.609 08:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:45.609 08:30:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:46.174 08:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:46.174 08:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:46.174 08:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:17:46.174 08:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:46.432 [2024-07-23 08:30:58.800399] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:46.432 08:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:46.432 08:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:46.432 08:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:46.432 08:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:46.432 08:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:17:46.433 08:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:46.433 08:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:46.433 08:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:46.433 08:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:46.433 08:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:46.433 08:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:46.433 08:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:46.690 08:30:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:46.690 "name": "Existed_Raid", 00:17:46.690 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:46.690 "strip_size_kb": 64, 00:17:46.690 "state": "configuring", 00:17:46.690 "raid_level": "raid0", 00:17:46.690 "superblock": false, 00:17:46.690 "num_base_bdevs": 4, 00:17:46.690 "num_base_bdevs_discovered": 2, 00:17:46.690 "num_base_bdevs_operational": 4, 00:17:46.690 "base_bdevs_list": [ 00:17:46.690 { 00:17:46.690 "name": null, 00:17:46.690 "uuid": "caa684d0-e39c-4c46-b314-461bbe38bb86", 00:17:46.690 "is_configured": false, 00:17:46.690 "data_offset": 0, 00:17:46.690 "data_size": 65536 00:17:46.690 }, 00:17:46.690 { 00:17:46.690 "name": null, 00:17:46.690 "uuid": "89c8c0be-d213-4240-9000-8b1c68235e14", 00:17:46.690 "is_configured": false, 00:17:46.690 "data_offset": 0, 00:17:46.690 "data_size": 65536 00:17:46.690 }, 00:17:46.690 { 00:17:46.690 "name": "BaseBdev3", 00:17:46.690 "uuid": "b890f75f-90e2-4f2a-8907-b2b185da2eb6", 00:17:46.690 "is_configured": true, 
00:17:46.690 "data_offset": 0, 00:17:46.690 "data_size": 65536 00:17:46.690 }, 00:17:46.691 { 00:17:46.691 "name": "BaseBdev4", 00:17:46.691 "uuid": "b2d30b95-2895-469d-a6cd-abe7ccf074d7", 00:17:46.691 "is_configured": true, 00:17:46.691 "data_offset": 0, 00:17:46.691 "data_size": 65536 00:17:46.691 } 00:17:46.691 ] 00:17:46.691 }' 00:17:46.691 08:30:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:46.691 08:30:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:47.255 08:30:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:47.255 08:30:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:47.255 08:30:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:17:47.255 08:30:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:47.513 [2024-07-23 08:30:59.915373] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:47.513 08:30:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:47.513 08:30:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:47.513 08:30:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:47.513 08:30:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:47.513 08:30:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:47.513 08:30:59 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:47.513 08:30:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:47.513 08:30:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:47.513 08:30:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:47.513 08:30:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:47.513 08:30:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:47.513 08:30:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:47.770 08:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:47.770 "name": "Existed_Raid", 00:17:47.770 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:47.770 "strip_size_kb": 64, 00:17:47.770 "state": "configuring", 00:17:47.770 "raid_level": "raid0", 00:17:47.770 "superblock": false, 00:17:47.770 "num_base_bdevs": 4, 00:17:47.770 "num_base_bdevs_discovered": 3, 00:17:47.770 "num_base_bdevs_operational": 4, 00:17:47.770 "base_bdevs_list": [ 00:17:47.771 { 00:17:47.771 "name": null, 00:17:47.771 "uuid": "caa684d0-e39c-4c46-b314-461bbe38bb86", 00:17:47.771 "is_configured": false, 00:17:47.771 "data_offset": 0, 00:17:47.771 "data_size": 65536 00:17:47.771 }, 00:17:47.771 { 00:17:47.771 "name": "BaseBdev2", 00:17:47.771 "uuid": "89c8c0be-d213-4240-9000-8b1c68235e14", 00:17:47.771 "is_configured": true, 00:17:47.771 "data_offset": 0, 00:17:47.771 "data_size": 65536 00:17:47.771 }, 00:17:47.771 { 00:17:47.771 "name": "BaseBdev3", 00:17:47.771 "uuid": "b890f75f-90e2-4f2a-8907-b2b185da2eb6", 00:17:47.771 "is_configured": true, 00:17:47.771 "data_offset": 0, 00:17:47.771 "data_size": 65536 00:17:47.771 }, 
00:17:47.771 { 00:17:47.771 "name": "BaseBdev4", 00:17:47.771 "uuid": "b2d30b95-2895-469d-a6cd-abe7ccf074d7", 00:17:47.771 "is_configured": true, 00:17:47.771 "data_offset": 0, 00:17:47.771 "data_size": 65536 00:17:47.771 } 00:17:47.771 ] 00:17:47.771 }' 00:17:47.771 08:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:47.771 08:31:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:48.336 08:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:48.336 08:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:48.336 08:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:17:48.336 08:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:48.336 08:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:17:48.594 08:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u caa684d0-e39c-4c46-b314-461bbe38bb86 00:17:48.852 [2024-07-23 08:31:01.119766] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:17:48.852 [2024-07-23 08:31:01.119810] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000037280 00:17:48.852 [2024-07-23 08:31:01.119818] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:17:48.852 [2024-07-23 08:31:01.120092] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c3a0 00:17:48.852 
[2024-07-23 08:31:01.120266] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000037280 00:17:48.852 [2024-07-23 08:31:01.120276] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000037280 00:17:48.852 [2024-07-23 08:31:01.120512] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:48.852 NewBaseBdev 00:17:48.852 08:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:17:48.852 08:31:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:17:48.852 08:31:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:48.853 08:31:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:48.853 08:31:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:48.853 08:31:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:48.853 08:31:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:48.853 08:31:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:17:49.111 [ 00:17:49.111 { 00:17:49.111 "name": "NewBaseBdev", 00:17:49.111 "aliases": [ 00:17:49.111 "caa684d0-e39c-4c46-b314-461bbe38bb86" 00:17:49.111 ], 00:17:49.111 "product_name": "Malloc disk", 00:17:49.111 "block_size": 512, 00:17:49.111 "num_blocks": 65536, 00:17:49.111 "uuid": "caa684d0-e39c-4c46-b314-461bbe38bb86", 00:17:49.111 "assigned_rate_limits": { 00:17:49.111 "rw_ios_per_sec": 0, 00:17:49.111 "rw_mbytes_per_sec": 0, 00:17:49.111 "r_mbytes_per_sec": 0, 
00:17:49.111 "w_mbytes_per_sec": 0 00:17:49.111 }, 00:17:49.111 "claimed": true, 00:17:49.111 "claim_type": "exclusive_write", 00:17:49.111 "zoned": false, 00:17:49.111 "supported_io_types": { 00:17:49.111 "read": true, 00:17:49.111 "write": true, 00:17:49.111 "unmap": true, 00:17:49.111 "flush": true, 00:17:49.111 "reset": true, 00:17:49.111 "nvme_admin": false, 00:17:49.111 "nvme_io": false, 00:17:49.111 "nvme_io_md": false, 00:17:49.111 "write_zeroes": true, 00:17:49.111 "zcopy": true, 00:17:49.111 "get_zone_info": false, 00:17:49.111 "zone_management": false, 00:17:49.111 "zone_append": false, 00:17:49.111 "compare": false, 00:17:49.111 "compare_and_write": false, 00:17:49.111 "abort": true, 00:17:49.111 "seek_hole": false, 00:17:49.111 "seek_data": false, 00:17:49.111 "copy": true, 00:17:49.111 "nvme_iov_md": false 00:17:49.111 }, 00:17:49.111 "memory_domains": [ 00:17:49.111 { 00:17:49.111 "dma_device_id": "system", 00:17:49.111 "dma_device_type": 1 00:17:49.111 }, 00:17:49.111 { 00:17:49.111 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:49.111 "dma_device_type": 2 00:17:49.111 } 00:17:49.111 ], 00:17:49.111 "driver_specific": {} 00:17:49.111 } 00:17:49.111 ] 00:17:49.111 08:31:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:49.111 08:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:17:49.111 08:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:49.111 08:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:49.111 08:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:49.111 08:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:49.111 08:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=4 00:17:49.111 08:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:49.111 08:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:49.111 08:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:49.111 08:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:49.111 08:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:49.111 08:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:49.369 08:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:49.369 "name": "Existed_Raid", 00:17:49.369 "uuid": "d765aeb2-7780-43c7-9804-9f49faf69466", 00:17:49.369 "strip_size_kb": 64, 00:17:49.369 "state": "online", 00:17:49.369 "raid_level": "raid0", 00:17:49.369 "superblock": false, 00:17:49.369 "num_base_bdevs": 4, 00:17:49.369 "num_base_bdevs_discovered": 4, 00:17:49.369 "num_base_bdevs_operational": 4, 00:17:49.369 "base_bdevs_list": [ 00:17:49.369 { 00:17:49.369 "name": "NewBaseBdev", 00:17:49.369 "uuid": "caa684d0-e39c-4c46-b314-461bbe38bb86", 00:17:49.369 "is_configured": true, 00:17:49.369 "data_offset": 0, 00:17:49.369 "data_size": 65536 00:17:49.369 }, 00:17:49.369 { 00:17:49.369 "name": "BaseBdev2", 00:17:49.369 "uuid": "89c8c0be-d213-4240-9000-8b1c68235e14", 00:17:49.369 "is_configured": true, 00:17:49.369 "data_offset": 0, 00:17:49.369 "data_size": 65536 00:17:49.369 }, 00:17:49.369 { 00:17:49.369 "name": "BaseBdev3", 00:17:49.369 "uuid": "b890f75f-90e2-4f2a-8907-b2b185da2eb6", 00:17:49.369 "is_configured": true, 00:17:49.369 "data_offset": 0, 00:17:49.369 "data_size": 65536 00:17:49.369 }, 00:17:49.369 { 00:17:49.369 
"name": "BaseBdev4", 00:17:49.369 "uuid": "b2d30b95-2895-469d-a6cd-abe7ccf074d7", 00:17:49.369 "is_configured": true, 00:17:49.369 "data_offset": 0, 00:17:49.369 "data_size": 65536 00:17:49.369 } 00:17:49.369 ] 00:17:49.369 }' 00:17:49.369 08:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:49.369 08:31:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:49.954 08:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:17:49.954 08:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:49.954 08:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:49.954 08:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:49.954 08:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:49.954 08:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:49.954 08:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:49.954 08:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:49.954 [2024-07-23 08:31:02.291198] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:49.954 08:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:49.954 "name": "Existed_Raid", 00:17:49.954 "aliases": [ 00:17:49.954 "d765aeb2-7780-43c7-9804-9f49faf69466" 00:17:49.954 ], 00:17:49.954 "product_name": "Raid Volume", 00:17:49.954 "block_size": 512, 00:17:49.954 "num_blocks": 262144, 00:17:49.954 "uuid": "d765aeb2-7780-43c7-9804-9f49faf69466", 00:17:49.954 "assigned_rate_limits": { 00:17:49.954 
"rw_ios_per_sec": 0, 00:17:49.954 "rw_mbytes_per_sec": 0, 00:17:49.954 "r_mbytes_per_sec": 0, 00:17:49.954 "w_mbytes_per_sec": 0 00:17:49.954 }, 00:17:49.954 "claimed": false, 00:17:49.954 "zoned": false, 00:17:49.954 "supported_io_types": { 00:17:49.954 "read": true, 00:17:49.954 "write": true, 00:17:49.954 "unmap": true, 00:17:49.954 "flush": true, 00:17:49.954 "reset": true, 00:17:49.954 "nvme_admin": false, 00:17:49.954 "nvme_io": false, 00:17:49.954 "nvme_io_md": false, 00:17:49.954 "write_zeroes": true, 00:17:49.954 "zcopy": false, 00:17:49.954 "get_zone_info": false, 00:17:49.954 "zone_management": false, 00:17:49.954 "zone_append": false, 00:17:49.954 "compare": false, 00:17:49.954 "compare_and_write": false, 00:17:49.954 "abort": false, 00:17:49.954 "seek_hole": false, 00:17:49.954 "seek_data": false, 00:17:49.954 "copy": false, 00:17:49.954 "nvme_iov_md": false 00:17:49.954 }, 00:17:49.954 "memory_domains": [ 00:17:49.954 { 00:17:49.954 "dma_device_id": "system", 00:17:49.954 "dma_device_type": 1 00:17:49.954 }, 00:17:49.954 { 00:17:49.954 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:49.954 "dma_device_type": 2 00:17:49.954 }, 00:17:49.954 { 00:17:49.954 "dma_device_id": "system", 00:17:49.954 "dma_device_type": 1 00:17:49.954 }, 00:17:49.954 { 00:17:49.954 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:49.954 "dma_device_type": 2 00:17:49.954 }, 00:17:49.954 { 00:17:49.954 "dma_device_id": "system", 00:17:49.954 "dma_device_type": 1 00:17:49.954 }, 00:17:49.954 { 00:17:49.954 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:49.954 "dma_device_type": 2 00:17:49.954 }, 00:17:49.954 { 00:17:49.954 "dma_device_id": "system", 00:17:49.954 "dma_device_type": 1 00:17:49.954 }, 00:17:49.954 { 00:17:49.954 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:49.954 "dma_device_type": 2 00:17:49.954 } 00:17:49.954 ], 00:17:49.954 "driver_specific": { 00:17:49.954 "raid": { 00:17:49.954 "uuid": "d765aeb2-7780-43c7-9804-9f49faf69466", 00:17:49.954 "strip_size_kb": 64, 
00:17:49.954 "state": "online", 00:17:49.954 "raid_level": "raid0", 00:17:49.954 "superblock": false, 00:17:49.954 "num_base_bdevs": 4, 00:17:49.954 "num_base_bdevs_discovered": 4, 00:17:49.954 "num_base_bdevs_operational": 4, 00:17:49.954 "base_bdevs_list": [ 00:17:49.954 { 00:17:49.954 "name": "NewBaseBdev", 00:17:49.954 "uuid": "caa684d0-e39c-4c46-b314-461bbe38bb86", 00:17:49.954 "is_configured": true, 00:17:49.954 "data_offset": 0, 00:17:49.954 "data_size": 65536 00:17:49.954 }, 00:17:49.954 { 00:17:49.954 "name": "BaseBdev2", 00:17:49.954 "uuid": "89c8c0be-d213-4240-9000-8b1c68235e14", 00:17:49.954 "is_configured": true, 00:17:49.954 "data_offset": 0, 00:17:49.954 "data_size": 65536 00:17:49.954 }, 00:17:49.954 { 00:17:49.954 "name": "BaseBdev3", 00:17:49.954 "uuid": "b890f75f-90e2-4f2a-8907-b2b185da2eb6", 00:17:49.955 "is_configured": true, 00:17:49.955 "data_offset": 0, 00:17:49.955 "data_size": 65536 00:17:49.955 }, 00:17:49.955 { 00:17:49.955 "name": "BaseBdev4", 00:17:49.955 "uuid": "b2d30b95-2895-469d-a6cd-abe7ccf074d7", 00:17:49.955 "is_configured": true, 00:17:49.955 "data_offset": 0, 00:17:49.955 "data_size": 65536 00:17:49.955 } 00:17:49.955 ] 00:17:49.955 } 00:17:49.955 } 00:17:49.955 }' 00:17:49.955 08:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:49.955 08:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:17:49.955 BaseBdev2 00:17:49.955 BaseBdev3 00:17:49.955 BaseBdev4' 00:17:49.955 08:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:49.955 08:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:17:49.955 08:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq 
'.[]' 00:17:50.223 08:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:50.223 "name": "NewBaseBdev", 00:17:50.223 "aliases": [ 00:17:50.223 "caa684d0-e39c-4c46-b314-461bbe38bb86" 00:17:50.223 ], 00:17:50.223 "product_name": "Malloc disk", 00:17:50.223 "block_size": 512, 00:17:50.223 "num_blocks": 65536, 00:17:50.223 "uuid": "caa684d0-e39c-4c46-b314-461bbe38bb86", 00:17:50.223 "assigned_rate_limits": { 00:17:50.223 "rw_ios_per_sec": 0, 00:17:50.223 "rw_mbytes_per_sec": 0, 00:17:50.223 "r_mbytes_per_sec": 0, 00:17:50.223 "w_mbytes_per_sec": 0 00:17:50.223 }, 00:17:50.223 "claimed": true, 00:17:50.223 "claim_type": "exclusive_write", 00:17:50.223 "zoned": false, 00:17:50.223 "supported_io_types": { 00:17:50.223 "read": true, 00:17:50.223 "write": true, 00:17:50.223 "unmap": true, 00:17:50.223 "flush": true, 00:17:50.223 "reset": true, 00:17:50.223 "nvme_admin": false, 00:17:50.223 "nvme_io": false, 00:17:50.223 "nvme_io_md": false, 00:17:50.223 "write_zeroes": true, 00:17:50.223 "zcopy": true, 00:17:50.223 "get_zone_info": false, 00:17:50.223 "zone_management": false, 00:17:50.223 "zone_append": false, 00:17:50.223 "compare": false, 00:17:50.223 "compare_and_write": false, 00:17:50.223 "abort": true, 00:17:50.223 "seek_hole": false, 00:17:50.223 "seek_data": false, 00:17:50.223 "copy": true, 00:17:50.223 "nvme_iov_md": false 00:17:50.223 }, 00:17:50.223 "memory_domains": [ 00:17:50.223 { 00:17:50.223 "dma_device_id": "system", 00:17:50.223 "dma_device_type": 1 00:17:50.223 }, 00:17:50.223 { 00:17:50.223 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:50.223 "dma_device_type": 2 00:17:50.223 } 00:17:50.223 ], 00:17:50.223 "driver_specific": {} 00:17:50.223 }' 00:17:50.223 08:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:50.223 08:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:50.223 08:31:02 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:50.223 08:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:50.223 08:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:50.223 08:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:50.223 08:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:50.223 08:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:50.480 08:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:50.480 08:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:50.480 08:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:50.480 08:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:50.480 08:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:50.480 08:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:50.480 08:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:50.737 08:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:50.737 "name": "BaseBdev2", 00:17:50.737 "aliases": [ 00:17:50.737 "89c8c0be-d213-4240-9000-8b1c68235e14" 00:17:50.737 ], 00:17:50.737 "product_name": "Malloc disk", 00:17:50.737 "block_size": 512, 00:17:50.737 "num_blocks": 65536, 00:17:50.737 "uuid": "89c8c0be-d213-4240-9000-8b1c68235e14", 00:17:50.737 "assigned_rate_limits": { 00:17:50.737 "rw_ios_per_sec": 0, 00:17:50.737 "rw_mbytes_per_sec": 0, 00:17:50.737 "r_mbytes_per_sec": 0, 00:17:50.737 "w_mbytes_per_sec": 0 00:17:50.737 }, 
00:17:50.737 "claimed": true, 00:17:50.737 "claim_type": "exclusive_write", 00:17:50.737 "zoned": false, 00:17:50.737 "supported_io_types": { 00:17:50.737 "read": true, 00:17:50.737 "write": true, 00:17:50.737 "unmap": true, 00:17:50.737 "flush": true, 00:17:50.737 "reset": true, 00:17:50.737 "nvme_admin": false, 00:17:50.737 "nvme_io": false, 00:17:50.737 "nvme_io_md": false, 00:17:50.737 "write_zeroes": true, 00:17:50.737 "zcopy": true, 00:17:50.737 "get_zone_info": false, 00:17:50.737 "zone_management": false, 00:17:50.737 "zone_append": false, 00:17:50.737 "compare": false, 00:17:50.737 "compare_and_write": false, 00:17:50.737 "abort": true, 00:17:50.737 "seek_hole": false, 00:17:50.737 "seek_data": false, 00:17:50.737 "copy": true, 00:17:50.737 "nvme_iov_md": false 00:17:50.737 }, 00:17:50.737 "memory_domains": [ 00:17:50.737 { 00:17:50.737 "dma_device_id": "system", 00:17:50.737 "dma_device_type": 1 00:17:50.737 }, 00:17:50.737 { 00:17:50.737 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:50.737 "dma_device_type": 2 00:17:50.737 } 00:17:50.737 ], 00:17:50.737 "driver_specific": {} 00:17:50.737 }' 00:17:50.737 08:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:50.737 08:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:50.737 08:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:50.737 08:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:50.737 08:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:50.737 08:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:50.737 08:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:50.737 08:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:50.737 08:31:03 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:50.737 08:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:50.995 08:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:50.995 08:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:50.995 08:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:50.995 08:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:50.995 08:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:50.995 08:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:50.995 "name": "BaseBdev3", 00:17:50.995 "aliases": [ 00:17:50.995 "b890f75f-90e2-4f2a-8907-b2b185da2eb6" 00:17:50.995 ], 00:17:50.995 "product_name": "Malloc disk", 00:17:50.995 "block_size": 512, 00:17:50.995 "num_blocks": 65536, 00:17:50.995 "uuid": "b890f75f-90e2-4f2a-8907-b2b185da2eb6", 00:17:50.995 "assigned_rate_limits": { 00:17:50.995 "rw_ios_per_sec": 0, 00:17:50.995 "rw_mbytes_per_sec": 0, 00:17:50.995 "r_mbytes_per_sec": 0, 00:17:50.995 "w_mbytes_per_sec": 0 00:17:50.995 }, 00:17:50.995 "claimed": true, 00:17:50.995 "claim_type": "exclusive_write", 00:17:50.995 "zoned": false, 00:17:50.995 "supported_io_types": { 00:17:50.995 "read": true, 00:17:50.995 "write": true, 00:17:50.995 "unmap": true, 00:17:50.995 "flush": true, 00:17:50.995 "reset": true, 00:17:50.995 "nvme_admin": false, 00:17:50.995 "nvme_io": false, 00:17:50.995 "nvme_io_md": false, 00:17:50.995 "write_zeroes": true, 00:17:50.995 "zcopy": true, 00:17:50.995 "get_zone_info": false, 00:17:50.995 "zone_management": false, 00:17:50.995 "zone_append": false, 00:17:50.995 "compare": false, 00:17:50.995 "compare_and_write": false, 
00:17:50.995 "abort": true, 00:17:50.995 "seek_hole": false, 00:17:50.995 "seek_data": false, 00:17:50.996 "copy": true, 00:17:50.996 "nvme_iov_md": false 00:17:50.996 }, 00:17:50.996 "memory_domains": [ 00:17:50.996 { 00:17:50.996 "dma_device_id": "system", 00:17:50.996 "dma_device_type": 1 00:17:50.996 }, 00:17:50.996 { 00:17:50.996 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:50.996 "dma_device_type": 2 00:17:50.996 } 00:17:50.996 ], 00:17:50.996 "driver_specific": {} 00:17:50.996 }' 00:17:50.996 08:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:50.996 08:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:51.254 08:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:51.254 08:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:51.254 08:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:51.254 08:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:51.254 08:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:51.254 08:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:51.254 08:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:51.254 08:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:51.254 08:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:51.254 08:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:51.254 08:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:51.254 08:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:51.254 08:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:51.512 08:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:51.512 "name": "BaseBdev4", 00:17:51.512 "aliases": [ 00:17:51.512 "b2d30b95-2895-469d-a6cd-abe7ccf074d7" 00:17:51.512 ], 00:17:51.512 "product_name": "Malloc disk", 00:17:51.512 "block_size": 512, 00:17:51.512 "num_blocks": 65536, 00:17:51.512 "uuid": "b2d30b95-2895-469d-a6cd-abe7ccf074d7", 00:17:51.512 "assigned_rate_limits": { 00:17:51.512 "rw_ios_per_sec": 0, 00:17:51.512 "rw_mbytes_per_sec": 0, 00:17:51.512 "r_mbytes_per_sec": 0, 00:17:51.512 "w_mbytes_per_sec": 0 00:17:51.512 }, 00:17:51.512 "claimed": true, 00:17:51.512 "claim_type": "exclusive_write", 00:17:51.512 "zoned": false, 00:17:51.512 "supported_io_types": { 00:17:51.512 "read": true, 00:17:51.512 "write": true, 00:17:51.512 "unmap": true, 00:17:51.512 "flush": true, 00:17:51.512 "reset": true, 00:17:51.512 "nvme_admin": false, 00:17:51.512 "nvme_io": false, 00:17:51.512 "nvme_io_md": false, 00:17:51.512 "write_zeroes": true, 00:17:51.512 "zcopy": true, 00:17:51.512 "get_zone_info": false, 00:17:51.512 "zone_management": false, 00:17:51.512 "zone_append": false, 00:17:51.512 "compare": false, 00:17:51.512 "compare_and_write": false, 00:17:51.512 "abort": true, 00:17:51.512 "seek_hole": false, 00:17:51.512 "seek_data": false, 00:17:51.512 "copy": true, 00:17:51.512 "nvme_iov_md": false 00:17:51.512 }, 00:17:51.512 "memory_domains": [ 00:17:51.512 { 00:17:51.512 "dma_device_id": "system", 00:17:51.512 "dma_device_type": 1 00:17:51.512 }, 00:17:51.512 { 00:17:51.512 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:51.512 "dma_device_type": 2 00:17:51.512 } 00:17:51.512 ], 00:17:51.512 "driver_specific": {} 00:17:51.512 }' 00:17:51.512 08:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:51.512 08:31:03 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:51.512 08:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:51.512 08:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:51.769 08:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:51.769 08:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:51.769 08:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:51.769 08:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:51.769 08:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:51.769 08:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:51.769 08:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:51.769 08:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:51.769 08:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:52.027 [2024-07-23 08:31:04.388440] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:52.027 [2024-07-23 08:31:04.388471] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:52.027 [2024-07-23 08:31:04.388547] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:52.027 [2024-07-23 08:31:04.388615] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:52.027 [2024-07-23 08:31:04.388630] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000037280 name Existed_Raid, state offline 
00:17:52.027 08:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1470536 00:17:52.027 08:31:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1470536 ']' 00:17:52.027 08:31:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1470536 00:17:52.027 08:31:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:17:52.027 08:31:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:52.027 08:31:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1470536 00:17:52.027 08:31:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:52.027 08:31:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:52.027 08:31:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1470536' 00:17:52.027 killing process with pid 1470536 00:17:52.027 08:31:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1470536 00:17:52.027 [2024-07-23 08:31:04.446148] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:52.027 08:31:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1470536 00:17:52.286 [2024-07-23 08:31:04.769955] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:53.660 08:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:17:53.660 00:17:53.660 real 0m26.438s 00:17:53.660 user 0m47.282s 00:17:53.660 sys 0m3.931s 00:17:53.660 08:31:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:53.660 08:31:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:53.660 ************************************ 00:17:53.660 END TEST 
raid_state_function_test 00:17:53.660 ************************************ 00:17:53.660 08:31:06 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:53.660 08:31:06 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 4 true 00:17:53.660 08:31:06 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:53.660 08:31:06 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:53.660 08:31:06 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:53.660 ************************************ 00:17:53.660 START TEST raid_state_function_test_sb 00:17:53.660 ************************************ 00:17:53.660 08:31:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 true 00:17:53.660 08:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:17:53.660 08:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:17:53.660 08:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:17:53.660 08:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:17:53.660 08:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:17:53.660 08:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:53.660 08:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:17:53.660 08:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:53.660 08:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:53.660 08:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:17:53.660 08:31:06 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:53.660 08:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:53.660 08:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:17:53.660 08:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:53.660 08:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:53.660 08:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:17:53.660 08:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:53.660 08:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:53.660 08:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:53.660 08:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:17:53.661 08:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:17:53.661 08:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:17:53.661 08:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:17:53.661 08:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:17:53.661 08:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:17:53.661 08:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:17:53.661 08:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:17:53.661 08:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:17:53.661 08:31:06 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:17:53.661 08:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1476040 00:17:53.661 08:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:53.661 08:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1476040' 00:17:53.661 Process raid pid: 1476040 00:17:53.661 08:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1476040 /var/tmp/spdk-raid.sock 00:17:53.661 08:31:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1476040 ']' 00:17:53.661 08:31:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:53.661 08:31:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:53.661 08:31:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:53.661 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:53.661 08:31:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:53.661 08:31:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:53.661 [2024-07-23 08:31:06.160705] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:17:53.661 [2024-07-23 08:31:06.160791] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:53.919 [2024-07-23 08:31:06.284364] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:54.175 [2024-07-23 08:31:06.499216] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:54.433 [2024-07-23 08:31:06.793831] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:54.433 [2024-07-23 08:31:06.793863] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:54.433 08:31:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:54.433 08:31:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:17:54.433 08:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:54.692 [2024-07-23 08:31:07.098832] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:54.692 [2024-07-23 08:31:07.098874] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:54.692 [2024-07-23 08:31:07.098884] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:54.692 [2024-07-23 08:31:07.098895] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:54.692 [2024-07-23 08:31:07.098902] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:54.692 [2024-07-23 08:31:07.098910] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:17:54.692 [2024-07-23 08:31:07.098917] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:54.692 [2024-07-23 08:31:07.098926] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:54.692 08:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:54.692 08:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:54.692 08:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:54.692 08:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:54.692 08:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:54.692 08:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:54.692 08:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:54.692 08:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:54.692 08:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:54.692 08:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:54.692 08:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:54.692 08:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:54.950 08:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:54.950 "name": "Existed_Raid", 00:17:54.950 "uuid": "1eab4839-1523-4504-9ce4-00437ab8c7bf", 
00:17:54.950 "strip_size_kb": 64, 00:17:54.950 "state": "configuring", 00:17:54.950 "raid_level": "raid0", 00:17:54.950 "superblock": true, 00:17:54.950 "num_base_bdevs": 4, 00:17:54.950 "num_base_bdevs_discovered": 0, 00:17:54.950 "num_base_bdevs_operational": 4, 00:17:54.950 "base_bdevs_list": [ 00:17:54.950 { 00:17:54.950 "name": "BaseBdev1", 00:17:54.950 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:54.950 "is_configured": false, 00:17:54.950 "data_offset": 0, 00:17:54.950 "data_size": 0 00:17:54.950 }, 00:17:54.950 { 00:17:54.950 "name": "BaseBdev2", 00:17:54.950 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:54.950 "is_configured": false, 00:17:54.950 "data_offset": 0, 00:17:54.950 "data_size": 0 00:17:54.950 }, 00:17:54.950 { 00:17:54.950 "name": "BaseBdev3", 00:17:54.950 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:54.950 "is_configured": false, 00:17:54.950 "data_offset": 0, 00:17:54.950 "data_size": 0 00:17:54.950 }, 00:17:54.950 { 00:17:54.950 "name": "BaseBdev4", 00:17:54.950 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:54.950 "is_configured": false, 00:17:54.950 "data_offset": 0, 00:17:54.950 "data_size": 0 00:17:54.950 } 00:17:54.950 ] 00:17:54.950 }' 00:17:54.950 08:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:54.950 08:31:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:55.515 08:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:55.515 [2024-07-23 08:31:07.944950] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:55.515 [2024-07-23 08:31:07.944984] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034580 name Existed_Raid, state configuring 00:17:55.515 08:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:55.773 [2024-07-23 08:31:08.109416] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:55.773 [2024-07-23 08:31:08.109457] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:55.773 [2024-07-23 08:31:08.109466] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:55.773 [2024-07-23 08:31:08.109475] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:55.773 [2024-07-23 08:31:08.109482] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:55.773 [2024-07-23 08:31:08.109491] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:55.773 [2024-07-23 08:31:08.109497] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:55.773 [2024-07-23 08:31:08.109506] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:55.773 08:31:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:56.031 [2024-07-23 08:31:08.308750] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:56.031 BaseBdev1 00:17:56.031 08:31:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:17:56.031 08:31:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:56.031 08:31:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:56.031 08:31:08 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:56.031 08:31:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:56.031 08:31:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:56.031 08:31:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:56.031 08:31:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:56.289 [ 00:17:56.289 { 00:17:56.289 "name": "BaseBdev1", 00:17:56.289 "aliases": [ 00:17:56.289 "54f0c4dd-6b19-4bb1-92cf-a5c161a39d83" 00:17:56.289 ], 00:17:56.289 "product_name": "Malloc disk", 00:17:56.289 "block_size": 512, 00:17:56.289 "num_blocks": 65536, 00:17:56.289 "uuid": "54f0c4dd-6b19-4bb1-92cf-a5c161a39d83", 00:17:56.289 "assigned_rate_limits": { 00:17:56.289 "rw_ios_per_sec": 0, 00:17:56.289 "rw_mbytes_per_sec": 0, 00:17:56.289 "r_mbytes_per_sec": 0, 00:17:56.289 "w_mbytes_per_sec": 0 00:17:56.289 }, 00:17:56.289 "claimed": true, 00:17:56.290 "claim_type": "exclusive_write", 00:17:56.290 "zoned": false, 00:17:56.290 "supported_io_types": { 00:17:56.290 "read": true, 00:17:56.290 "write": true, 00:17:56.290 "unmap": true, 00:17:56.290 "flush": true, 00:17:56.290 "reset": true, 00:17:56.290 "nvme_admin": false, 00:17:56.290 "nvme_io": false, 00:17:56.290 "nvme_io_md": false, 00:17:56.290 "write_zeroes": true, 00:17:56.290 "zcopy": true, 00:17:56.290 "get_zone_info": false, 00:17:56.290 "zone_management": false, 00:17:56.290 "zone_append": false, 00:17:56.290 "compare": false, 00:17:56.290 "compare_and_write": false, 00:17:56.290 "abort": true, 00:17:56.290 "seek_hole": false, 00:17:56.290 "seek_data": false, 
00:17:56.290 "copy": true, 00:17:56.290 "nvme_iov_md": false 00:17:56.290 }, 00:17:56.290 "memory_domains": [ 00:17:56.290 { 00:17:56.290 "dma_device_id": "system", 00:17:56.290 "dma_device_type": 1 00:17:56.290 }, 00:17:56.290 { 00:17:56.290 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:56.290 "dma_device_type": 2 00:17:56.290 } 00:17:56.290 ], 00:17:56.290 "driver_specific": {} 00:17:56.290 } 00:17:56.290 ] 00:17:56.290 08:31:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:56.290 08:31:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:56.290 08:31:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:56.290 08:31:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:56.290 08:31:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:56.290 08:31:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:56.290 08:31:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:56.290 08:31:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:56.290 08:31:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:56.290 08:31:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:56.290 08:31:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:56.290 08:31:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:56.290 08:31:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:56.548 08:31:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:56.548 "name": "Existed_Raid", 00:17:56.548 "uuid": "173ab18d-29ad-4190-8008-c2be8b3ecb99", 00:17:56.548 "strip_size_kb": 64, 00:17:56.548 "state": "configuring", 00:17:56.548 "raid_level": "raid0", 00:17:56.548 "superblock": true, 00:17:56.548 "num_base_bdevs": 4, 00:17:56.548 "num_base_bdevs_discovered": 1, 00:17:56.548 "num_base_bdevs_operational": 4, 00:17:56.548 "base_bdevs_list": [ 00:17:56.548 { 00:17:56.548 "name": "BaseBdev1", 00:17:56.548 "uuid": "54f0c4dd-6b19-4bb1-92cf-a5c161a39d83", 00:17:56.548 "is_configured": true, 00:17:56.548 "data_offset": 2048, 00:17:56.548 "data_size": 63488 00:17:56.548 }, 00:17:56.548 { 00:17:56.548 "name": "BaseBdev2", 00:17:56.548 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:56.548 "is_configured": false, 00:17:56.548 "data_offset": 0, 00:17:56.548 "data_size": 0 00:17:56.548 }, 00:17:56.548 { 00:17:56.548 "name": "BaseBdev3", 00:17:56.548 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:56.548 "is_configured": false, 00:17:56.548 "data_offset": 0, 00:17:56.548 "data_size": 0 00:17:56.548 }, 00:17:56.548 { 00:17:56.548 "name": "BaseBdev4", 00:17:56.548 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:56.548 "is_configured": false, 00:17:56.548 "data_offset": 0, 00:17:56.548 "data_size": 0 00:17:56.548 } 00:17:56.548 ] 00:17:56.548 }' 00:17:56.548 08:31:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:56.548 08:31:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:56.806 08:31:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:57.063 [2024-07-23 08:31:09.455826] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid 
bdev: Existed_Raid 00:17:57.063 [2024-07-23 08:31:09.455875] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034880 name Existed_Raid, state configuring 00:17:57.063 08:31:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:57.321 [2024-07-23 08:31:09.624290] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:57.321 [2024-07-23 08:31:09.625864] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:57.321 [2024-07-23 08:31:09.625898] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:57.321 [2024-07-23 08:31:09.625907] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:57.321 [2024-07-23 08:31:09.625916] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:57.321 [2024-07-23 08:31:09.625923] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:57.321 [2024-07-23 08:31:09.625933] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:57.321 08:31:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:17:57.321 08:31:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:57.321 08:31:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:57.321 08:31:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:57.321 08:31:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:57.321 08:31:09 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:57.321 08:31:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:57.321 08:31:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:57.321 08:31:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:57.321 08:31:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:57.321 08:31:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:57.321 08:31:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:57.321 08:31:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:57.321 08:31:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:57.321 08:31:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:57.321 "name": "Existed_Raid", 00:17:57.321 "uuid": "cfcda846-5fc5-43eb-9831-eeb1c5b7175c", 00:17:57.321 "strip_size_kb": 64, 00:17:57.321 "state": "configuring", 00:17:57.321 "raid_level": "raid0", 00:17:57.321 "superblock": true, 00:17:57.321 "num_base_bdevs": 4, 00:17:57.321 "num_base_bdevs_discovered": 1, 00:17:57.321 "num_base_bdevs_operational": 4, 00:17:57.321 "base_bdevs_list": [ 00:17:57.321 { 00:17:57.321 "name": "BaseBdev1", 00:17:57.321 "uuid": "54f0c4dd-6b19-4bb1-92cf-a5c161a39d83", 00:17:57.322 "is_configured": true, 00:17:57.322 "data_offset": 2048, 00:17:57.322 "data_size": 63488 00:17:57.322 }, 00:17:57.322 { 00:17:57.322 "name": "BaseBdev2", 00:17:57.322 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:57.322 "is_configured": false, 
00:17:57.322 "data_offset": 0, 00:17:57.322 "data_size": 0 00:17:57.322 }, 00:17:57.322 { 00:17:57.322 "name": "BaseBdev3", 00:17:57.322 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:57.322 "is_configured": false, 00:17:57.322 "data_offset": 0, 00:17:57.322 "data_size": 0 00:17:57.322 }, 00:17:57.322 { 00:17:57.322 "name": "BaseBdev4", 00:17:57.322 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:57.322 "is_configured": false, 00:17:57.322 "data_offset": 0, 00:17:57.322 "data_size": 0 00:17:57.322 } 00:17:57.322 ] 00:17:57.322 }' 00:17:57.322 08:31:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:57.322 08:31:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:57.886 08:31:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:58.145 [2024-07-23 08:31:10.473833] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:58.145 BaseBdev2 00:17:58.145 08:31:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:17:58.145 08:31:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:58.145 08:31:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:58.145 08:31:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:58.145 08:31:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:58.145 08:31:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:58.145 08:31:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_wait_for_examine 00:17:58.145 08:31:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:58.403 [ 00:17:58.403 { 00:17:58.403 "name": "BaseBdev2", 00:17:58.403 "aliases": [ 00:17:58.403 "75119e0c-3cad-4144-8d68-5541e214222c" 00:17:58.403 ], 00:17:58.403 "product_name": "Malloc disk", 00:17:58.403 "block_size": 512, 00:17:58.403 "num_blocks": 65536, 00:17:58.403 "uuid": "75119e0c-3cad-4144-8d68-5541e214222c", 00:17:58.403 "assigned_rate_limits": { 00:17:58.403 "rw_ios_per_sec": 0, 00:17:58.403 "rw_mbytes_per_sec": 0, 00:17:58.403 "r_mbytes_per_sec": 0, 00:17:58.403 "w_mbytes_per_sec": 0 00:17:58.403 }, 00:17:58.403 "claimed": true, 00:17:58.403 "claim_type": "exclusive_write", 00:17:58.403 "zoned": false, 00:17:58.403 "supported_io_types": { 00:17:58.403 "read": true, 00:17:58.403 "write": true, 00:17:58.403 "unmap": true, 00:17:58.403 "flush": true, 00:17:58.403 "reset": true, 00:17:58.403 "nvme_admin": false, 00:17:58.403 "nvme_io": false, 00:17:58.403 "nvme_io_md": false, 00:17:58.403 "write_zeroes": true, 00:17:58.403 "zcopy": true, 00:17:58.403 "get_zone_info": false, 00:17:58.403 "zone_management": false, 00:17:58.403 "zone_append": false, 00:17:58.403 "compare": false, 00:17:58.403 "compare_and_write": false, 00:17:58.403 "abort": true, 00:17:58.403 "seek_hole": false, 00:17:58.403 "seek_data": false, 00:17:58.403 "copy": true, 00:17:58.403 "nvme_iov_md": false 00:17:58.403 }, 00:17:58.403 "memory_domains": [ 00:17:58.403 { 00:17:58.403 "dma_device_id": "system", 00:17:58.403 "dma_device_type": 1 00:17:58.403 }, 00:17:58.403 { 00:17:58.403 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:58.403 "dma_device_type": 2 00:17:58.403 } 00:17:58.403 ], 00:17:58.403 "driver_specific": {} 00:17:58.403 } 00:17:58.403 ] 00:17:58.403 08:31:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 
-- # return 0 00:17:58.403 08:31:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:58.403 08:31:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:58.403 08:31:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:58.403 08:31:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:58.403 08:31:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:58.403 08:31:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:58.403 08:31:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:58.403 08:31:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:58.403 08:31:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:58.403 08:31:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:58.403 08:31:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:58.403 08:31:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:58.403 08:31:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:58.403 08:31:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:58.661 08:31:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:58.661 "name": "Existed_Raid", 00:17:58.661 "uuid": "cfcda846-5fc5-43eb-9831-eeb1c5b7175c", 00:17:58.661 "strip_size_kb": 64, 
00:17:58.661 "state": "configuring", 00:17:58.661 "raid_level": "raid0", 00:17:58.661 "superblock": true, 00:17:58.661 "num_base_bdevs": 4, 00:17:58.661 "num_base_bdevs_discovered": 2, 00:17:58.661 "num_base_bdevs_operational": 4, 00:17:58.661 "base_bdevs_list": [ 00:17:58.661 { 00:17:58.661 "name": "BaseBdev1", 00:17:58.661 "uuid": "54f0c4dd-6b19-4bb1-92cf-a5c161a39d83", 00:17:58.661 "is_configured": true, 00:17:58.661 "data_offset": 2048, 00:17:58.661 "data_size": 63488 00:17:58.661 }, 00:17:58.661 { 00:17:58.661 "name": "BaseBdev2", 00:17:58.661 "uuid": "75119e0c-3cad-4144-8d68-5541e214222c", 00:17:58.661 "is_configured": true, 00:17:58.661 "data_offset": 2048, 00:17:58.661 "data_size": 63488 00:17:58.661 }, 00:17:58.661 { 00:17:58.661 "name": "BaseBdev3", 00:17:58.661 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:58.661 "is_configured": false, 00:17:58.661 "data_offset": 0, 00:17:58.661 "data_size": 0 00:17:58.661 }, 00:17:58.661 { 00:17:58.661 "name": "BaseBdev4", 00:17:58.661 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:58.661 "is_configured": false, 00:17:58.661 "data_offset": 0, 00:17:58.661 "data_size": 0 00:17:58.661 } 00:17:58.661 ] 00:17:58.661 }' 00:17:58.661 08:31:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:58.661 08:31:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:59.226 08:31:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:59.226 [2024-07-23 08:31:11.685140] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:59.226 BaseBdev3 00:17:59.226 08:31:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:59.226 08:31:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local 
bdev_name=BaseBdev3 00:17:59.226 08:31:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:59.226 08:31:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:59.226 08:31:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:59.226 08:31:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:59.226 08:31:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:59.487 08:31:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:59.744 [ 00:17:59.744 { 00:17:59.744 "name": "BaseBdev3", 00:17:59.744 "aliases": [ 00:17:59.744 "d973006c-ff59-47a1-8f9a-401e7eb806b3" 00:17:59.744 ], 00:17:59.744 "product_name": "Malloc disk", 00:17:59.744 "block_size": 512, 00:17:59.744 "num_blocks": 65536, 00:17:59.744 "uuid": "d973006c-ff59-47a1-8f9a-401e7eb806b3", 00:17:59.744 "assigned_rate_limits": { 00:17:59.744 "rw_ios_per_sec": 0, 00:17:59.744 "rw_mbytes_per_sec": 0, 00:17:59.744 "r_mbytes_per_sec": 0, 00:17:59.744 "w_mbytes_per_sec": 0 00:17:59.744 }, 00:17:59.744 "claimed": true, 00:17:59.744 "claim_type": "exclusive_write", 00:17:59.744 "zoned": false, 00:17:59.744 "supported_io_types": { 00:17:59.744 "read": true, 00:17:59.744 "write": true, 00:17:59.744 "unmap": true, 00:17:59.744 "flush": true, 00:17:59.744 "reset": true, 00:17:59.744 "nvme_admin": false, 00:17:59.744 "nvme_io": false, 00:17:59.745 "nvme_io_md": false, 00:17:59.745 "write_zeroes": true, 00:17:59.745 "zcopy": true, 00:17:59.745 "get_zone_info": false, 00:17:59.745 "zone_management": false, 00:17:59.745 "zone_append": false, 00:17:59.745 
"compare": false, 00:17:59.745 "compare_and_write": false, 00:17:59.745 "abort": true, 00:17:59.745 "seek_hole": false, 00:17:59.745 "seek_data": false, 00:17:59.745 "copy": true, 00:17:59.745 "nvme_iov_md": false 00:17:59.745 }, 00:17:59.745 "memory_domains": [ 00:17:59.745 { 00:17:59.745 "dma_device_id": "system", 00:17:59.745 "dma_device_type": 1 00:17:59.745 }, 00:17:59.745 { 00:17:59.745 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:59.745 "dma_device_type": 2 00:17:59.745 } 00:17:59.745 ], 00:17:59.745 "driver_specific": {} 00:17:59.745 } 00:17:59.745 ] 00:17:59.745 08:31:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:59.745 08:31:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:59.745 08:31:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:59.745 08:31:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:59.745 08:31:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:59.745 08:31:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:59.745 08:31:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:59.745 08:31:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:59.745 08:31:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:59.745 08:31:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:59.745 08:31:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:59.745 08:31:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:59.745 08:31:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:59.745 08:31:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:59.745 08:31:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:59.745 08:31:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:59.745 "name": "Existed_Raid", 00:17:59.745 "uuid": "cfcda846-5fc5-43eb-9831-eeb1c5b7175c", 00:17:59.745 "strip_size_kb": 64, 00:17:59.745 "state": "configuring", 00:17:59.745 "raid_level": "raid0", 00:17:59.745 "superblock": true, 00:17:59.745 "num_base_bdevs": 4, 00:17:59.745 "num_base_bdevs_discovered": 3, 00:17:59.745 "num_base_bdevs_operational": 4, 00:17:59.745 "base_bdevs_list": [ 00:17:59.745 { 00:17:59.745 "name": "BaseBdev1", 00:17:59.745 "uuid": "54f0c4dd-6b19-4bb1-92cf-a5c161a39d83", 00:17:59.745 "is_configured": true, 00:17:59.745 "data_offset": 2048, 00:17:59.745 "data_size": 63488 00:17:59.745 }, 00:17:59.745 { 00:17:59.745 "name": "BaseBdev2", 00:17:59.745 "uuid": "75119e0c-3cad-4144-8d68-5541e214222c", 00:17:59.745 "is_configured": true, 00:17:59.745 "data_offset": 2048, 00:17:59.745 "data_size": 63488 00:17:59.745 }, 00:17:59.745 { 00:17:59.745 "name": "BaseBdev3", 00:17:59.745 "uuid": "d973006c-ff59-47a1-8f9a-401e7eb806b3", 00:17:59.745 "is_configured": true, 00:17:59.745 "data_offset": 2048, 00:17:59.745 "data_size": 63488 00:17:59.745 }, 00:17:59.745 { 00:17:59.745 "name": "BaseBdev4", 00:17:59.745 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:59.745 "is_configured": false, 00:17:59.745 "data_offset": 0, 00:17:59.745 "data_size": 0 00:17:59.745 } 00:17:59.745 ] 00:17:59.745 }' 00:17:59.745 08:31:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:59.745 08:31:12 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:00.310 08:31:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:00.568 [2024-07-23 08:31:12.862236] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:00.568 [2024-07-23 08:31:12.862458] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000035180 00:18:00.568 [2024-07-23 08:31:12.862473] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:00.568 [2024-07-23 08:31:12.862732] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bf90 00:18:00.568 [2024-07-23 08:31:12.862909] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000035180 00:18:00.568 [2024-07-23 08:31:12.862921] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000035180 00:18:00.568 [2024-07-23 08:31:12.863082] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:00.568 BaseBdev4 00:18:00.568 08:31:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:18:00.568 08:31:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:18:00.568 08:31:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:00.568 08:31:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:00.568 08:31:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:00.568 08:31:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:00.568 08:31:12 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:00.568 08:31:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:00.825 [ 00:18:00.825 { 00:18:00.825 "name": "BaseBdev4", 00:18:00.825 "aliases": [ 00:18:00.825 "0daeccfa-12dc-4a27-82da-eeeb1622808d" 00:18:00.825 ], 00:18:00.825 "product_name": "Malloc disk", 00:18:00.825 "block_size": 512, 00:18:00.825 "num_blocks": 65536, 00:18:00.825 "uuid": "0daeccfa-12dc-4a27-82da-eeeb1622808d", 00:18:00.825 "assigned_rate_limits": { 00:18:00.825 "rw_ios_per_sec": 0, 00:18:00.825 "rw_mbytes_per_sec": 0, 00:18:00.825 "r_mbytes_per_sec": 0, 00:18:00.825 "w_mbytes_per_sec": 0 00:18:00.825 }, 00:18:00.825 "claimed": true, 00:18:00.825 "claim_type": "exclusive_write", 00:18:00.825 "zoned": false, 00:18:00.825 "supported_io_types": { 00:18:00.825 "read": true, 00:18:00.825 "write": true, 00:18:00.825 "unmap": true, 00:18:00.825 "flush": true, 00:18:00.825 "reset": true, 00:18:00.825 "nvme_admin": false, 00:18:00.825 "nvme_io": false, 00:18:00.825 "nvme_io_md": false, 00:18:00.825 "write_zeroes": true, 00:18:00.825 "zcopy": true, 00:18:00.825 "get_zone_info": false, 00:18:00.825 "zone_management": false, 00:18:00.825 "zone_append": false, 00:18:00.825 "compare": false, 00:18:00.825 "compare_and_write": false, 00:18:00.825 "abort": true, 00:18:00.825 "seek_hole": false, 00:18:00.825 "seek_data": false, 00:18:00.825 "copy": true, 00:18:00.825 "nvme_iov_md": false 00:18:00.825 }, 00:18:00.825 "memory_domains": [ 00:18:00.825 { 00:18:00.825 "dma_device_id": "system", 00:18:00.825 "dma_device_type": 1 00:18:00.825 }, 00:18:00.825 { 00:18:00.825 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:00.825 "dma_device_type": 2 00:18:00.825 } 00:18:00.825 ], 00:18:00.825 "driver_specific": 
{} 00:18:00.825 } 00:18:00.825 ] 00:18:00.825 08:31:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:00.825 08:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:00.826 08:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:00.826 08:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:18:00.826 08:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:00.826 08:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:00.826 08:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:00.826 08:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:00.826 08:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:00.826 08:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:00.826 08:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:00.826 08:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:00.826 08:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:00.826 08:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:00.826 08:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:01.081 08:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:01.081 
"name": "Existed_Raid", 00:18:01.081 "uuid": "cfcda846-5fc5-43eb-9831-eeb1c5b7175c", 00:18:01.081 "strip_size_kb": 64, 00:18:01.082 "state": "online", 00:18:01.082 "raid_level": "raid0", 00:18:01.082 "superblock": true, 00:18:01.082 "num_base_bdevs": 4, 00:18:01.082 "num_base_bdevs_discovered": 4, 00:18:01.082 "num_base_bdevs_operational": 4, 00:18:01.082 "base_bdevs_list": [ 00:18:01.082 { 00:18:01.082 "name": "BaseBdev1", 00:18:01.082 "uuid": "54f0c4dd-6b19-4bb1-92cf-a5c161a39d83", 00:18:01.082 "is_configured": true, 00:18:01.082 "data_offset": 2048, 00:18:01.082 "data_size": 63488 00:18:01.082 }, 00:18:01.082 { 00:18:01.082 "name": "BaseBdev2", 00:18:01.082 "uuid": "75119e0c-3cad-4144-8d68-5541e214222c", 00:18:01.082 "is_configured": true, 00:18:01.082 "data_offset": 2048, 00:18:01.082 "data_size": 63488 00:18:01.082 }, 00:18:01.082 { 00:18:01.082 "name": "BaseBdev3", 00:18:01.082 "uuid": "d973006c-ff59-47a1-8f9a-401e7eb806b3", 00:18:01.082 "is_configured": true, 00:18:01.082 "data_offset": 2048, 00:18:01.082 "data_size": 63488 00:18:01.082 }, 00:18:01.082 { 00:18:01.082 "name": "BaseBdev4", 00:18:01.082 "uuid": "0daeccfa-12dc-4a27-82da-eeeb1622808d", 00:18:01.082 "is_configured": true, 00:18:01.082 "data_offset": 2048, 00:18:01.082 "data_size": 63488 00:18:01.082 } 00:18:01.082 ] 00:18:01.082 }' 00:18:01.082 08:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:01.082 08:31:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:01.646 08:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:18:01.646 08:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:01.646 08:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:01.646 08:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local 
base_bdev_info 00:18:01.646 08:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:01.646 08:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:18:01.646 08:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:01.646 08:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:01.646 [2024-07-23 08:31:14.057739] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:01.646 08:31:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:01.646 "name": "Existed_Raid", 00:18:01.646 "aliases": [ 00:18:01.646 "cfcda846-5fc5-43eb-9831-eeb1c5b7175c" 00:18:01.646 ], 00:18:01.646 "product_name": "Raid Volume", 00:18:01.646 "block_size": 512, 00:18:01.646 "num_blocks": 253952, 00:18:01.646 "uuid": "cfcda846-5fc5-43eb-9831-eeb1c5b7175c", 00:18:01.646 "assigned_rate_limits": { 00:18:01.646 "rw_ios_per_sec": 0, 00:18:01.646 "rw_mbytes_per_sec": 0, 00:18:01.646 "r_mbytes_per_sec": 0, 00:18:01.646 "w_mbytes_per_sec": 0 00:18:01.646 }, 00:18:01.646 "claimed": false, 00:18:01.646 "zoned": false, 00:18:01.646 "supported_io_types": { 00:18:01.646 "read": true, 00:18:01.646 "write": true, 00:18:01.646 "unmap": true, 00:18:01.646 "flush": true, 00:18:01.646 "reset": true, 00:18:01.646 "nvme_admin": false, 00:18:01.646 "nvme_io": false, 00:18:01.646 "nvme_io_md": false, 00:18:01.646 "write_zeroes": true, 00:18:01.646 "zcopy": false, 00:18:01.646 "get_zone_info": false, 00:18:01.646 "zone_management": false, 00:18:01.646 "zone_append": false, 00:18:01.646 "compare": false, 00:18:01.646 "compare_and_write": false, 00:18:01.646 "abort": false, 00:18:01.646 "seek_hole": false, 00:18:01.646 "seek_data": false, 00:18:01.646 "copy": false, 00:18:01.646 "nvme_iov_md": 
false 00:18:01.646 }, 00:18:01.646 "memory_domains": [ 00:18:01.646 { 00:18:01.646 "dma_device_id": "system", 00:18:01.646 "dma_device_type": 1 00:18:01.646 }, 00:18:01.646 { 00:18:01.646 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:01.646 "dma_device_type": 2 00:18:01.646 }, 00:18:01.646 { 00:18:01.646 "dma_device_id": "system", 00:18:01.646 "dma_device_type": 1 00:18:01.646 }, 00:18:01.646 { 00:18:01.646 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:01.646 "dma_device_type": 2 00:18:01.646 }, 00:18:01.646 { 00:18:01.646 "dma_device_id": "system", 00:18:01.646 "dma_device_type": 1 00:18:01.646 }, 00:18:01.646 { 00:18:01.646 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:01.646 "dma_device_type": 2 00:18:01.646 }, 00:18:01.646 { 00:18:01.646 "dma_device_id": "system", 00:18:01.646 "dma_device_type": 1 00:18:01.646 }, 00:18:01.646 { 00:18:01.646 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:01.646 "dma_device_type": 2 00:18:01.646 } 00:18:01.646 ], 00:18:01.646 "driver_specific": { 00:18:01.646 "raid": { 00:18:01.646 "uuid": "cfcda846-5fc5-43eb-9831-eeb1c5b7175c", 00:18:01.646 "strip_size_kb": 64, 00:18:01.646 "state": "online", 00:18:01.646 "raid_level": "raid0", 00:18:01.646 "superblock": true, 00:18:01.646 "num_base_bdevs": 4, 00:18:01.646 "num_base_bdevs_discovered": 4, 00:18:01.646 "num_base_bdevs_operational": 4, 00:18:01.646 "base_bdevs_list": [ 00:18:01.646 { 00:18:01.646 "name": "BaseBdev1", 00:18:01.646 "uuid": "54f0c4dd-6b19-4bb1-92cf-a5c161a39d83", 00:18:01.646 "is_configured": true, 00:18:01.646 "data_offset": 2048, 00:18:01.646 "data_size": 63488 00:18:01.646 }, 00:18:01.646 { 00:18:01.646 "name": "BaseBdev2", 00:18:01.646 "uuid": "75119e0c-3cad-4144-8d68-5541e214222c", 00:18:01.646 "is_configured": true, 00:18:01.646 "data_offset": 2048, 00:18:01.646 "data_size": 63488 00:18:01.646 }, 00:18:01.646 { 00:18:01.646 "name": "BaseBdev3", 00:18:01.646 "uuid": "d973006c-ff59-47a1-8f9a-401e7eb806b3", 00:18:01.646 "is_configured": true, 00:18:01.646 
"data_offset": 2048, 00:18:01.646 "data_size": 63488 00:18:01.646 }, 00:18:01.646 { 00:18:01.646 "name": "BaseBdev4", 00:18:01.646 "uuid": "0daeccfa-12dc-4a27-82da-eeeb1622808d", 00:18:01.646 "is_configured": true, 00:18:01.646 "data_offset": 2048, 00:18:01.647 "data_size": 63488 00:18:01.647 } 00:18:01.647 ] 00:18:01.647 } 00:18:01.647 } 00:18:01.647 }' 00:18:01.647 08:31:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:01.647 08:31:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:18:01.647 BaseBdev2 00:18:01.647 BaseBdev3 00:18:01.647 BaseBdev4' 00:18:01.647 08:31:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:01.647 08:31:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:18:01.647 08:31:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:01.904 08:31:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:01.904 "name": "BaseBdev1", 00:18:01.904 "aliases": [ 00:18:01.904 "54f0c4dd-6b19-4bb1-92cf-a5c161a39d83" 00:18:01.904 ], 00:18:01.904 "product_name": "Malloc disk", 00:18:01.904 "block_size": 512, 00:18:01.904 "num_blocks": 65536, 00:18:01.904 "uuid": "54f0c4dd-6b19-4bb1-92cf-a5c161a39d83", 00:18:01.904 "assigned_rate_limits": { 00:18:01.904 "rw_ios_per_sec": 0, 00:18:01.904 "rw_mbytes_per_sec": 0, 00:18:01.904 "r_mbytes_per_sec": 0, 00:18:01.904 "w_mbytes_per_sec": 0 00:18:01.904 }, 00:18:01.904 "claimed": true, 00:18:01.904 "claim_type": "exclusive_write", 00:18:01.904 "zoned": false, 00:18:01.904 "supported_io_types": { 00:18:01.904 "read": true, 00:18:01.904 "write": true, 00:18:01.904 "unmap": true, 00:18:01.904 "flush": 
true, 00:18:01.904 "reset": true, 00:18:01.904 "nvme_admin": false, 00:18:01.904 "nvme_io": false, 00:18:01.904 "nvme_io_md": false, 00:18:01.904 "write_zeroes": true, 00:18:01.904 "zcopy": true, 00:18:01.904 "get_zone_info": false, 00:18:01.904 "zone_management": false, 00:18:01.904 "zone_append": false, 00:18:01.904 "compare": false, 00:18:01.904 "compare_and_write": false, 00:18:01.904 "abort": true, 00:18:01.905 "seek_hole": false, 00:18:01.905 "seek_data": false, 00:18:01.905 "copy": true, 00:18:01.905 "nvme_iov_md": false 00:18:01.905 }, 00:18:01.905 "memory_domains": [ 00:18:01.905 { 00:18:01.905 "dma_device_id": "system", 00:18:01.905 "dma_device_type": 1 00:18:01.905 }, 00:18:01.905 { 00:18:01.905 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:01.905 "dma_device_type": 2 00:18:01.905 } 00:18:01.905 ], 00:18:01.905 "driver_specific": {} 00:18:01.905 }' 00:18:01.905 08:31:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:01.905 08:31:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:01.905 08:31:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:01.905 08:31:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:01.905 08:31:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:02.162 08:31:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:02.162 08:31:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:02.162 08:31:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:02.162 08:31:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:02.162 08:31:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:02.162 08:31:14 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:02.162 08:31:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:02.162 08:31:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:02.162 08:31:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:02.162 08:31:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:02.419 08:31:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:02.419 "name": "BaseBdev2", 00:18:02.419 "aliases": [ 00:18:02.419 "75119e0c-3cad-4144-8d68-5541e214222c" 00:18:02.419 ], 00:18:02.419 "product_name": "Malloc disk", 00:18:02.419 "block_size": 512, 00:18:02.419 "num_blocks": 65536, 00:18:02.419 "uuid": "75119e0c-3cad-4144-8d68-5541e214222c", 00:18:02.419 "assigned_rate_limits": { 00:18:02.419 "rw_ios_per_sec": 0, 00:18:02.419 "rw_mbytes_per_sec": 0, 00:18:02.419 "r_mbytes_per_sec": 0, 00:18:02.419 "w_mbytes_per_sec": 0 00:18:02.419 }, 00:18:02.419 "claimed": true, 00:18:02.419 "claim_type": "exclusive_write", 00:18:02.419 "zoned": false, 00:18:02.419 "supported_io_types": { 00:18:02.419 "read": true, 00:18:02.419 "write": true, 00:18:02.419 "unmap": true, 00:18:02.419 "flush": true, 00:18:02.419 "reset": true, 00:18:02.419 "nvme_admin": false, 00:18:02.419 "nvme_io": false, 00:18:02.419 "nvme_io_md": false, 00:18:02.419 "write_zeroes": true, 00:18:02.419 "zcopy": true, 00:18:02.419 "get_zone_info": false, 00:18:02.419 "zone_management": false, 00:18:02.419 "zone_append": false, 00:18:02.419 "compare": false, 00:18:02.419 "compare_and_write": false, 00:18:02.419 "abort": true, 00:18:02.419 "seek_hole": false, 00:18:02.419 "seek_data": false, 00:18:02.419 "copy": true, 00:18:02.419 "nvme_iov_md": false 00:18:02.419 }, 00:18:02.419 
"memory_domains": [ 00:18:02.419 { 00:18:02.419 "dma_device_id": "system", 00:18:02.419 "dma_device_type": 1 00:18:02.419 }, 00:18:02.419 { 00:18:02.419 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:02.419 "dma_device_type": 2 00:18:02.419 } 00:18:02.419 ], 00:18:02.419 "driver_specific": {} 00:18:02.419 }' 00:18:02.419 08:31:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:02.419 08:31:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:02.419 08:31:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:02.419 08:31:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:02.419 08:31:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:02.419 08:31:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:02.419 08:31:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:02.677 08:31:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:02.677 08:31:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:02.677 08:31:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:02.677 08:31:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:02.677 08:31:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:02.677 08:31:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:02.677 08:31:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:02.677 08:31:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # 
jq '.[]' 00:18:02.946 08:31:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:02.946 "name": "BaseBdev3", 00:18:02.946 "aliases": [ 00:18:02.946 "d973006c-ff59-47a1-8f9a-401e7eb806b3" 00:18:02.946 ], 00:18:02.946 "product_name": "Malloc disk", 00:18:02.946 "block_size": 512, 00:18:02.946 "num_blocks": 65536, 00:18:02.946 "uuid": "d973006c-ff59-47a1-8f9a-401e7eb806b3", 00:18:02.946 "assigned_rate_limits": { 00:18:02.946 "rw_ios_per_sec": 0, 00:18:02.946 "rw_mbytes_per_sec": 0, 00:18:02.946 "r_mbytes_per_sec": 0, 00:18:02.946 "w_mbytes_per_sec": 0 00:18:02.946 }, 00:18:02.946 "claimed": true, 00:18:02.946 "claim_type": "exclusive_write", 00:18:02.946 "zoned": false, 00:18:02.946 "supported_io_types": { 00:18:02.946 "read": true, 00:18:02.946 "write": true, 00:18:02.946 "unmap": true, 00:18:02.946 "flush": true, 00:18:02.946 "reset": true, 00:18:02.946 "nvme_admin": false, 00:18:02.946 "nvme_io": false, 00:18:02.946 "nvme_io_md": false, 00:18:02.946 "write_zeroes": true, 00:18:02.946 "zcopy": true, 00:18:02.946 "get_zone_info": false, 00:18:02.946 "zone_management": false, 00:18:02.946 "zone_append": false, 00:18:02.946 "compare": false, 00:18:02.946 "compare_and_write": false, 00:18:02.946 "abort": true, 00:18:02.946 "seek_hole": false, 00:18:02.946 "seek_data": false, 00:18:02.946 "copy": true, 00:18:02.946 "nvme_iov_md": false 00:18:02.946 }, 00:18:02.946 "memory_domains": [ 00:18:02.946 { 00:18:02.946 "dma_device_id": "system", 00:18:02.946 "dma_device_type": 1 00:18:02.946 }, 00:18:02.946 { 00:18:02.946 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:02.946 "dma_device_type": 2 00:18:02.946 } 00:18:02.946 ], 00:18:02.946 "driver_specific": {} 00:18:02.946 }' 00:18:02.946 08:31:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:02.946 08:31:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:02.946 08:31:15 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:02.946 08:31:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:02.946 08:31:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:02.946 08:31:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:02.946 08:31:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:02.946 08:31:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:03.260 08:31:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:03.260 08:31:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:03.260 08:31:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:03.260 08:31:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:03.260 08:31:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:03.260 08:31:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:03.260 08:31:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:03.260 08:31:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:03.260 "name": "BaseBdev4", 00:18:03.260 "aliases": [ 00:18:03.260 "0daeccfa-12dc-4a27-82da-eeeb1622808d" 00:18:03.260 ], 00:18:03.260 "product_name": "Malloc disk", 00:18:03.260 "block_size": 512, 00:18:03.260 "num_blocks": 65536, 00:18:03.260 "uuid": "0daeccfa-12dc-4a27-82da-eeeb1622808d", 00:18:03.260 "assigned_rate_limits": { 00:18:03.260 "rw_ios_per_sec": 0, 00:18:03.260 "rw_mbytes_per_sec": 0, 00:18:03.260 
"r_mbytes_per_sec": 0, 00:18:03.260 "w_mbytes_per_sec": 0 00:18:03.260 }, 00:18:03.260 "claimed": true, 00:18:03.260 "claim_type": "exclusive_write", 00:18:03.260 "zoned": false, 00:18:03.260 "supported_io_types": { 00:18:03.260 "read": true, 00:18:03.260 "write": true, 00:18:03.260 "unmap": true, 00:18:03.260 "flush": true, 00:18:03.260 "reset": true, 00:18:03.260 "nvme_admin": false, 00:18:03.260 "nvme_io": false, 00:18:03.260 "nvme_io_md": false, 00:18:03.260 "write_zeroes": true, 00:18:03.260 "zcopy": true, 00:18:03.260 "get_zone_info": false, 00:18:03.260 "zone_management": false, 00:18:03.260 "zone_append": false, 00:18:03.260 "compare": false, 00:18:03.260 "compare_and_write": false, 00:18:03.260 "abort": true, 00:18:03.260 "seek_hole": false, 00:18:03.260 "seek_data": false, 00:18:03.260 "copy": true, 00:18:03.260 "nvme_iov_md": false 00:18:03.260 }, 00:18:03.260 "memory_domains": [ 00:18:03.260 { 00:18:03.260 "dma_device_id": "system", 00:18:03.260 "dma_device_type": 1 00:18:03.260 }, 00:18:03.260 { 00:18:03.260 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:03.260 "dma_device_type": 2 00:18:03.260 } 00:18:03.260 ], 00:18:03.260 "driver_specific": {} 00:18:03.260 }' 00:18:03.260 08:31:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:03.517 08:31:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:03.517 08:31:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:03.517 08:31:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:03.517 08:31:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:03.517 08:31:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:03.517 08:31:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:03.517 08:31:15 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:03.517 08:31:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:03.517 08:31:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:03.517 08:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:03.775 08:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:03.775 08:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:03.775 [2024-07-23 08:31:16.239389] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:03.775 [2024-07-23 08:31:16.239421] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:03.775 [2024-07-23 08:31:16.239472] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:03.775 08:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:18:03.775 08:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:18:03.775 08:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:03.775 08:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:18:03.775 08:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:18:03.775 08:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:18:03.775 08:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:03.775 08:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:18:04.033 08:31:16 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:04.033 08:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:04.033 08:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:04.033 08:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:04.033 08:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:04.033 08:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:04.033 08:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:04.033 08:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:04.033 08:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:04.033 08:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:04.033 "name": "Existed_Raid", 00:18:04.033 "uuid": "cfcda846-5fc5-43eb-9831-eeb1c5b7175c", 00:18:04.033 "strip_size_kb": 64, 00:18:04.033 "state": "offline", 00:18:04.033 "raid_level": "raid0", 00:18:04.033 "superblock": true, 00:18:04.033 "num_base_bdevs": 4, 00:18:04.033 "num_base_bdevs_discovered": 3, 00:18:04.033 "num_base_bdevs_operational": 3, 00:18:04.033 "base_bdevs_list": [ 00:18:04.033 { 00:18:04.033 "name": null, 00:18:04.033 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:04.033 "is_configured": false, 00:18:04.033 "data_offset": 2048, 00:18:04.033 "data_size": 63488 00:18:04.033 }, 00:18:04.033 { 00:18:04.033 "name": "BaseBdev2", 00:18:04.033 "uuid": "75119e0c-3cad-4144-8d68-5541e214222c", 00:18:04.033 "is_configured": true, 00:18:04.033 
"data_offset": 2048, 00:18:04.033 "data_size": 63488 00:18:04.033 }, 00:18:04.033 { 00:18:04.033 "name": "BaseBdev3", 00:18:04.033 "uuid": "d973006c-ff59-47a1-8f9a-401e7eb806b3", 00:18:04.033 "is_configured": true, 00:18:04.033 "data_offset": 2048, 00:18:04.033 "data_size": 63488 00:18:04.033 }, 00:18:04.033 { 00:18:04.033 "name": "BaseBdev4", 00:18:04.033 "uuid": "0daeccfa-12dc-4a27-82da-eeeb1622808d", 00:18:04.033 "is_configured": true, 00:18:04.033 "data_offset": 2048, 00:18:04.033 "data_size": 63488 00:18:04.033 } 00:18:04.033 ] 00:18:04.033 }' 00:18:04.033 08:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:04.033 08:31:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:04.599 08:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:18:04.599 08:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:04.599 08:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:04.599 08:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:04.856 08:31:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:04.856 08:31:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:04.856 08:31:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:18:04.856 [2024-07-23 08:31:17.288484] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:05.131 08:31:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:05.131 08:31:17 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:05.131 08:31:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:05.131 08:31:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:05.131 08:31:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:05.131 08:31:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:05.132 08:31:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:18:05.390 [2024-07-23 08:31:17.730597] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:05.390 08:31:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:05.390 08:31:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:05.390 08:31:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:05.390 08:31:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:05.649 08:31:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:05.649 08:31:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:05.649 08:31:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:18:05.907 [2024-07-23 08:31:18.169778] 
bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:18:05.907 [2024-07-23 08:31:18.169834] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000035180 name Existed_Raid, state offline 00:18:05.907 08:31:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:05.907 08:31:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:05.907 08:31:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:05.907 08:31:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:18:06.166 08:31:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:18:06.166 08:31:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:18:06.166 08:31:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:18:06.166 08:31:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:18:06.166 08:31:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:06.166 08:31:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:06.166 BaseBdev2 00:18:06.166 08:31:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:18:06.166 08:31:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:06.166 08:31:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:06.166 08:31:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 
00:18:06.166 08:31:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:06.166 08:31:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:06.166 08:31:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:06.424 08:31:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:06.683 [ 00:18:06.683 { 00:18:06.683 "name": "BaseBdev2", 00:18:06.683 "aliases": [ 00:18:06.683 "68d7d8b5-3916-4ea2-9afd-1377fbadd496" 00:18:06.683 ], 00:18:06.683 "product_name": "Malloc disk", 00:18:06.683 "block_size": 512, 00:18:06.683 "num_blocks": 65536, 00:18:06.683 "uuid": "68d7d8b5-3916-4ea2-9afd-1377fbadd496", 00:18:06.683 "assigned_rate_limits": { 00:18:06.683 "rw_ios_per_sec": 0, 00:18:06.683 "rw_mbytes_per_sec": 0, 00:18:06.683 "r_mbytes_per_sec": 0, 00:18:06.683 "w_mbytes_per_sec": 0 00:18:06.683 }, 00:18:06.683 "claimed": false, 00:18:06.683 "zoned": false, 00:18:06.683 "supported_io_types": { 00:18:06.683 "read": true, 00:18:06.683 "write": true, 00:18:06.683 "unmap": true, 00:18:06.683 "flush": true, 00:18:06.683 "reset": true, 00:18:06.683 "nvme_admin": false, 00:18:06.683 "nvme_io": false, 00:18:06.683 "nvme_io_md": false, 00:18:06.683 "write_zeroes": true, 00:18:06.683 "zcopy": true, 00:18:06.683 "get_zone_info": false, 00:18:06.683 "zone_management": false, 00:18:06.683 "zone_append": false, 00:18:06.683 "compare": false, 00:18:06.683 "compare_and_write": false, 00:18:06.683 "abort": true, 00:18:06.683 "seek_hole": false, 00:18:06.683 "seek_data": false, 00:18:06.683 "copy": true, 00:18:06.683 "nvme_iov_md": false 00:18:06.683 }, 00:18:06.683 "memory_domains": [ 00:18:06.683 { 00:18:06.683 
"dma_device_id": "system", 00:18:06.683 "dma_device_type": 1 00:18:06.683 }, 00:18:06.683 { 00:18:06.683 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:06.683 "dma_device_type": 2 00:18:06.683 } 00:18:06.683 ], 00:18:06.683 "driver_specific": {} 00:18:06.683 } 00:18:06.683 ] 00:18:06.683 08:31:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:06.683 08:31:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:06.683 08:31:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:06.683 08:31:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:06.683 BaseBdev3 00:18:06.942 08:31:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:18:06.942 08:31:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:06.942 08:31:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:06.942 08:31:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:06.942 08:31:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:06.942 08:31:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:06.942 08:31:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:06.942 08:31:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:07.200 [ 00:18:07.200 { 
00:18:07.200 "name": "BaseBdev3", 00:18:07.200 "aliases": [ 00:18:07.200 "872732c4-c8d6-431b-8b7d-8fd470473d7b" 00:18:07.200 ], 00:18:07.200 "product_name": "Malloc disk", 00:18:07.200 "block_size": 512, 00:18:07.200 "num_blocks": 65536, 00:18:07.200 "uuid": "872732c4-c8d6-431b-8b7d-8fd470473d7b", 00:18:07.200 "assigned_rate_limits": { 00:18:07.200 "rw_ios_per_sec": 0, 00:18:07.200 "rw_mbytes_per_sec": 0, 00:18:07.200 "r_mbytes_per_sec": 0, 00:18:07.200 "w_mbytes_per_sec": 0 00:18:07.200 }, 00:18:07.200 "claimed": false, 00:18:07.200 "zoned": false, 00:18:07.200 "supported_io_types": { 00:18:07.200 "read": true, 00:18:07.200 "write": true, 00:18:07.200 "unmap": true, 00:18:07.200 "flush": true, 00:18:07.200 "reset": true, 00:18:07.200 "nvme_admin": false, 00:18:07.200 "nvme_io": false, 00:18:07.200 "nvme_io_md": false, 00:18:07.200 "write_zeroes": true, 00:18:07.200 "zcopy": true, 00:18:07.200 "get_zone_info": false, 00:18:07.200 "zone_management": false, 00:18:07.200 "zone_append": false, 00:18:07.200 "compare": false, 00:18:07.200 "compare_and_write": false, 00:18:07.200 "abort": true, 00:18:07.200 "seek_hole": false, 00:18:07.200 "seek_data": false, 00:18:07.200 "copy": true, 00:18:07.200 "nvme_iov_md": false 00:18:07.200 }, 00:18:07.200 "memory_domains": [ 00:18:07.200 { 00:18:07.200 "dma_device_id": "system", 00:18:07.200 "dma_device_type": 1 00:18:07.200 }, 00:18:07.200 { 00:18:07.200 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:07.200 "dma_device_type": 2 00:18:07.200 } 00:18:07.200 ], 00:18:07.200 "driver_specific": {} 00:18:07.200 } 00:18:07.200 ] 00:18:07.200 08:31:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:07.200 08:31:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:07.200 08:31:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:07.200 08:31:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:07.458 BaseBdev4 00:18:07.458 08:31:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:18:07.458 08:31:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:18:07.458 08:31:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:07.458 08:31:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:07.458 08:31:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:07.458 08:31:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:07.458 08:31:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:07.458 08:31:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:07.715 [ 00:18:07.715 { 00:18:07.715 "name": "BaseBdev4", 00:18:07.715 "aliases": [ 00:18:07.715 "4b8c7632-83f8-4565-b46e-c1497f767650" 00:18:07.715 ], 00:18:07.715 "product_name": "Malloc disk", 00:18:07.715 "block_size": 512, 00:18:07.715 "num_blocks": 65536, 00:18:07.715 "uuid": "4b8c7632-83f8-4565-b46e-c1497f767650", 00:18:07.715 "assigned_rate_limits": { 00:18:07.715 "rw_ios_per_sec": 0, 00:18:07.715 "rw_mbytes_per_sec": 0, 00:18:07.715 "r_mbytes_per_sec": 0, 00:18:07.715 "w_mbytes_per_sec": 0 00:18:07.715 }, 00:18:07.715 "claimed": false, 00:18:07.715 "zoned": false, 00:18:07.715 "supported_io_types": { 00:18:07.715 "read": true, 00:18:07.715 "write": true, 00:18:07.715 "unmap": true, 00:18:07.715 "flush": 
true, 00:18:07.715 "reset": true, 00:18:07.715 "nvme_admin": false, 00:18:07.715 "nvme_io": false, 00:18:07.716 "nvme_io_md": false, 00:18:07.716 "write_zeroes": true, 00:18:07.716 "zcopy": true, 00:18:07.716 "get_zone_info": false, 00:18:07.716 "zone_management": false, 00:18:07.716 "zone_append": false, 00:18:07.716 "compare": false, 00:18:07.716 "compare_and_write": false, 00:18:07.716 "abort": true, 00:18:07.716 "seek_hole": false, 00:18:07.716 "seek_data": false, 00:18:07.716 "copy": true, 00:18:07.716 "nvme_iov_md": false 00:18:07.716 }, 00:18:07.716 "memory_domains": [ 00:18:07.716 { 00:18:07.716 "dma_device_id": "system", 00:18:07.716 "dma_device_type": 1 00:18:07.716 }, 00:18:07.716 { 00:18:07.716 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:07.716 "dma_device_type": 2 00:18:07.716 } 00:18:07.716 ], 00:18:07.716 "driver_specific": {} 00:18:07.716 } 00:18:07.716 ] 00:18:07.716 08:31:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:07.716 08:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:07.716 08:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:07.716 08:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:07.716 [2024-07-23 08:31:20.228662] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:07.716 [2024-07-23 08:31:20.228703] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:07.716 [2024-07-23 08:31:20.228729] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:07.716 [2024-07-23 08:31:20.230393] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 
00:18:07.716 [2024-07-23 08:31:20.230443] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:07.974 08:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:07.974 08:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:07.974 08:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:07.974 08:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:07.974 08:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:07.974 08:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:07.974 08:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:07.974 08:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:07.974 08:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:07.974 08:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:07.974 08:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:07.974 08:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:07.974 08:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:07.974 "name": "Existed_Raid", 00:18:07.974 "uuid": "88866f6d-feb0-46f2-8a19-213a3ff0bc18", 00:18:07.974 "strip_size_kb": 64, 00:18:07.974 "state": "configuring", 00:18:07.974 "raid_level": "raid0", 00:18:07.974 "superblock": true, 
00:18:07.974 "num_base_bdevs": 4, 00:18:07.974 "num_base_bdevs_discovered": 3, 00:18:07.974 "num_base_bdevs_operational": 4, 00:18:07.974 "base_bdevs_list": [ 00:18:07.974 { 00:18:07.974 "name": "BaseBdev1", 00:18:07.974 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:07.974 "is_configured": false, 00:18:07.974 "data_offset": 0, 00:18:07.974 "data_size": 0 00:18:07.974 }, 00:18:07.974 { 00:18:07.974 "name": "BaseBdev2", 00:18:07.974 "uuid": "68d7d8b5-3916-4ea2-9afd-1377fbadd496", 00:18:07.974 "is_configured": true, 00:18:07.974 "data_offset": 2048, 00:18:07.974 "data_size": 63488 00:18:07.974 }, 00:18:07.974 { 00:18:07.974 "name": "BaseBdev3", 00:18:07.974 "uuid": "872732c4-c8d6-431b-8b7d-8fd470473d7b", 00:18:07.974 "is_configured": true, 00:18:07.974 "data_offset": 2048, 00:18:07.974 "data_size": 63488 00:18:07.974 }, 00:18:07.974 { 00:18:07.974 "name": "BaseBdev4", 00:18:07.974 "uuid": "4b8c7632-83f8-4565-b46e-c1497f767650", 00:18:07.974 "is_configured": true, 00:18:07.974 "data_offset": 2048, 00:18:07.974 "data_size": 63488 00:18:07.974 } 00:18:07.974 ] 00:18:07.974 }' 00:18:07.974 08:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:07.974 08:31:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:08.540 08:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:18:08.540 [2024-07-23 08:31:21.046770] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:08.798 08:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:08.798 08:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:08.798 08:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:18:08.798 08:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:08.798 08:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:08.798 08:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:08.798 08:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:08.798 08:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:08.798 08:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:08.798 08:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:08.798 08:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:08.798 08:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:08.798 08:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:08.798 "name": "Existed_Raid", 00:18:08.798 "uuid": "88866f6d-feb0-46f2-8a19-213a3ff0bc18", 00:18:08.798 "strip_size_kb": 64, 00:18:08.798 "state": "configuring", 00:18:08.798 "raid_level": "raid0", 00:18:08.798 "superblock": true, 00:18:08.798 "num_base_bdevs": 4, 00:18:08.798 "num_base_bdevs_discovered": 2, 00:18:08.798 "num_base_bdevs_operational": 4, 00:18:08.798 "base_bdevs_list": [ 00:18:08.798 { 00:18:08.798 "name": "BaseBdev1", 00:18:08.798 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:08.798 "is_configured": false, 00:18:08.798 "data_offset": 0, 00:18:08.798 "data_size": 0 00:18:08.798 }, 00:18:08.798 { 00:18:08.798 "name": null, 00:18:08.799 "uuid": "68d7d8b5-3916-4ea2-9afd-1377fbadd496", 
00:18:08.799 "is_configured": false, 00:18:08.799 "data_offset": 2048, 00:18:08.799 "data_size": 63488 00:18:08.799 }, 00:18:08.799 { 00:18:08.799 "name": "BaseBdev3", 00:18:08.799 "uuid": "872732c4-c8d6-431b-8b7d-8fd470473d7b", 00:18:08.799 "is_configured": true, 00:18:08.799 "data_offset": 2048, 00:18:08.799 "data_size": 63488 00:18:08.799 }, 00:18:08.799 { 00:18:08.799 "name": "BaseBdev4", 00:18:08.799 "uuid": "4b8c7632-83f8-4565-b46e-c1497f767650", 00:18:08.799 "is_configured": true, 00:18:08.799 "data_offset": 2048, 00:18:08.799 "data_size": 63488 00:18:08.799 } 00:18:08.799 ] 00:18:08.799 }' 00:18:08.799 08:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:08.799 08:31:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:09.365 08:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:09.365 08:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:09.365 08:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:18:09.366 08:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:09.624 [2024-07-23 08:31:22.045371] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:09.624 BaseBdev1 00:18:09.624 08:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:18:09.624 08:31:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:09.624 08:31:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 
00:18:09.624 08:31:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:09.624 08:31:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:09.624 08:31:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:09.624 08:31:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:09.882 08:31:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:09.882 [ 00:18:09.882 { 00:18:09.882 "name": "BaseBdev1", 00:18:09.882 "aliases": [ 00:18:09.882 "6d5873b8-f34a-461c-a418-c690afae3a3d" 00:18:09.882 ], 00:18:09.882 "product_name": "Malloc disk", 00:18:09.882 "block_size": 512, 00:18:09.882 "num_blocks": 65536, 00:18:09.882 "uuid": "6d5873b8-f34a-461c-a418-c690afae3a3d", 00:18:09.882 "assigned_rate_limits": { 00:18:09.882 "rw_ios_per_sec": 0, 00:18:09.882 "rw_mbytes_per_sec": 0, 00:18:09.882 "r_mbytes_per_sec": 0, 00:18:09.882 "w_mbytes_per_sec": 0 00:18:09.882 }, 00:18:09.882 "claimed": true, 00:18:09.882 "claim_type": "exclusive_write", 00:18:09.882 "zoned": false, 00:18:09.882 "supported_io_types": { 00:18:09.882 "read": true, 00:18:09.882 "write": true, 00:18:09.882 "unmap": true, 00:18:09.882 "flush": true, 00:18:09.882 "reset": true, 00:18:09.882 "nvme_admin": false, 00:18:09.882 "nvme_io": false, 00:18:09.882 "nvme_io_md": false, 00:18:09.882 "write_zeroes": true, 00:18:09.882 "zcopy": true, 00:18:09.882 "get_zone_info": false, 00:18:09.882 "zone_management": false, 00:18:09.882 "zone_append": false, 00:18:09.882 "compare": false, 00:18:09.882 "compare_and_write": false, 00:18:09.882 "abort": true, 00:18:09.882 "seek_hole": false, 00:18:09.882 
"seek_data": false, 00:18:09.882 "copy": true, 00:18:09.882 "nvme_iov_md": false 00:18:09.882 }, 00:18:09.882 "memory_domains": [ 00:18:09.882 { 00:18:09.882 "dma_device_id": "system", 00:18:09.882 "dma_device_type": 1 00:18:09.882 }, 00:18:09.882 { 00:18:09.882 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:09.882 "dma_device_type": 2 00:18:09.882 } 00:18:09.882 ], 00:18:09.882 "driver_specific": {} 00:18:09.882 } 00:18:09.882 ] 00:18:09.882 08:31:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:09.882 08:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:09.882 08:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:09.882 08:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:09.882 08:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:09.882 08:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:09.882 08:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:09.882 08:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:09.882 08:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:09.882 08:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:09.882 08:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:10.140 08:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:10.140 08:31:22 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:10.140 08:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:10.140 "name": "Existed_Raid", 00:18:10.140 "uuid": "88866f6d-feb0-46f2-8a19-213a3ff0bc18", 00:18:10.140 "strip_size_kb": 64, 00:18:10.140 "state": "configuring", 00:18:10.140 "raid_level": "raid0", 00:18:10.140 "superblock": true, 00:18:10.140 "num_base_bdevs": 4, 00:18:10.140 "num_base_bdevs_discovered": 3, 00:18:10.140 "num_base_bdevs_operational": 4, 00:18:10.140 "base_bdevs_list": [ 00:18:10.140 { 00:18:10.140 "name": "BaseBdev1", 00:18:10.140 "uuid": "6d5873b8-f34a-461c-a418-c690afae3a3d", 00:18:10.140 "is_configured": true, 00:18:10.140 "data_offset": 2048, 00:18:10.140 "data_size": 63488 00:18:10.140 }, 00:18:10.140 { 00:18:10.140 "name": null, 00:18:10.140 "uuid": "68d7d8b5-3916-4ea2-9afd-1377fbadd496", 00:18:10.140 "is_configured": false, 00:18:10.140 "data_offset": 2048, 00:18:10.140 "data_size": 63488 00:18:10.140 }, 00:18:10.140 { 00:18:10.140 "name": "BaseBdev3", 00:18:10.140 "uuid": "872732c4-c8d6-431b-8b7d-8fd470473d7b", 00:18:10.140 "is_configured": true, 00:18:10.140 "data_offset": 2048, 00:18:10.140 "data_size": 63488 00:18:10.140 }, 00:18:10.140 { 00:18:10.140 "name": "BaseBdev4", 00:18:10.140 "uuid": "4b8c7632-83f8-4565-b46e-c1497f767650", 00:18:10.140 "is_configured": true, 00:18:10.140 "data_offset": 2048, 00:18:10.140 "data_size": 63488 00:18:10.140 } 00:18:10.140 ] 00:18:10.140 }' 00:18:10.140 08:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:10.140 08:31:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:10.705 08:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:10.705 08:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:10.963 08:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:18:10.963 08:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:18:10.963 [2024-07-23 08:31:23.380951] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:10.963 08:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:10.963 08:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:10.963 08:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:10.963 08:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:10.963 08:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:10.963 08:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:10.963 08:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:10.963 08:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:10.963 08:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:10.963 08:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:10.963 08:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:10.963 08:31:23 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:11.222 08:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:11.222 "name": "Existed_Raid", 00:18:11.222 "uuid": "88866f6d-feb0-46f2-8a19-213a3ff0bc18", 00:18:11.222 "strip_size_kb": 64, 00:18:11.222 "state": "configuring", 00:18:11.222 "raid_level": "raid0", 00:18:11.222 "superblock": true, 00:18:11.222 "num_base_bdevs": 4, 00:18:11.222 "num_base_bdevs_discovered": 2, 00:18:11.222 "num_base_bdevs_operational": 4, 00:18:11.222 "base_bdevs_list": [ 00:18:11.222 { 00:18:11.222 "name": "BaseBdev1", 00:18:11.222 "uuid": "6d5873b8-f34a-461c-a418-c690afae3a3d", 00:18:11.222 "is_configured": true, 00:18:11.222 "data_offset": 2048, 00:18:11.222 "data_size": 63488 00:18:11.222 }, 00:18:11.222 { 00:18:11.222 "name": null, 00:18:11.222 "uuid": "68d7d8b5-3916-4ea2-9afd-1377fbadd496", 00:18:11.222 "is_configured": false, 00:18:11.222 "data_offset": 2048, 00:18:11.222 "data_size": 63488 00:18:11.222 }, 00:18:11.222 { 00:18:11.222 "name": null, 00:18:11.222 "uuid": "872732c4-c8d6-431b-8b7d-8fd470473d7b", 00:18:11.222 "is_configured": false, 00:18:11.222 "data_offset": 2048, 00:18:11.222 "data_size": 63488 00:18:11.222 }, 00:18:11.222 { 00:18:11.222 "name": "BaseBdev4", 00:18:11.222 "uuid": "4b8c7632-83f8-4565-b46e-c1497f767650", 00:18:11.222 "is_configured": true, 00:18:11.222 "data_offset": 2048, 00:18:11.222 "data_size": 63488 00:18:11.222 } 00:18:11.222 ] 00:18:11.222 }' 00:18:11.222 08:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:11.222 08:31:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:11.788 08:31:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:11.788 08:31:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:11.788 08:31:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:18:11.788 08:31:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:18:12.046 [2024-07-23 08:31:24.371598] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:12.046 08:31:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:12.046 08:31:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:12.046 08:31:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:12.046 08:31:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:12.046 08:31:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:12.046 08:31:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:12.046 08:31:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:12.046 08:31:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:12.046 08:31:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:12.046 08:31:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:12.046 08:31:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:12.046 08:31:24 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:12.046 08:31:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:12.046 "name": "Existed_Raid", 00:18:12.046 "uuid": "88866f6d-feb0-46f2-8a19-213a3ff0bc18", 00:18:12.046 "strip_size_kb": 64, 00:18:12.046 "state": "configuring", 00:18:12.046 "raid_level": "raid0", 00:18:12.046 "superblock": true, 00:18:12.046 "num_base_bdevs": 4, 00:18:12.046 "num_base_bdevs_discovered": 3, 00:18:12.046 "num_base_bdevs_operational": 4, 00:18:12.046 "base_bdevs_list": [ 00:18:12.046 { 00:18:12.047 "name": "BaseBdev1", 00:18:12.047 "uuid": "6d5873b8-f34a-461c-a418-c690afae3a3d", 00:18:12.047 "is_configured": true, 00:18:12.047 "data_offset": 2048, 00:18:12.047 "data_size": 63488 00:18:12.047 }, 00:18:12.047 { 00:18:12.047 "name": null, 00:18:12.047 "uuid": "68d7d8b5-3916-4ea2-9afd-1377fbadd496", 00:18:12.047 "is_configured": false, 00:18:12.047 "data_offset": 2048, 00:18:12.047 "data_size": 63488 00:18:12.047 }, 00:18:12.047 { 00:18:12.047 "name": "BaseBdev3", 00:18:12.047 "uuid": "872732c4-c8d6-431b-8b7d-8fd470473d7b", 00:18:12.047 "is_configured": true, 00:18:12.047 "data_offset": 2048, 00:18:12.047 "data_size": 63488 00:18:12.047 }, 00:18:12.047 { 00:18:12.047 "name": "BaseBdev4", 00:18:12.047 "uuid": "4b8c7632-83f8-4565-b46e-c1497f767650", 00:18:12.047 "is_configured": true, 00:18:12.047 "data_offset": 2048, 00:18:12.047 "data_size": 63488 00:18:12.047 } 00:18:12.047 ] 00:18:12.047 }' 00:18:12.047 08:31:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:12.047 08:31:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:12.613 08:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:12.613 08:31:25 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:12.871 08:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:18:12.871 08:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:12.871 [2024-07-23 08:31:25.366245] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:13.129 08:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:13.129 08:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:13.129 08:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:13.129 08:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:13.129 08:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:13.129 08:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:13.129 08:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:13.129 08:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:13.129 08:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:13.129 08:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:13.129 08:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:13.129 08:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:13.387 08:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:13.387 "name": "Existed_Raid", 00:18:13.387 "uuid": "88866f6d-feb0-46f2-8a19-213a3ff0bc18", 00:18:13.387 "strip_size_kb": 64, 00:18:13.387 "state": "configuring", 00:18:13.387 "raid_level": "raid0", 00:18:13.387 "superblock": true, 00:18:13.387 "num_base_bdevs": 4, 00:18:13.387 "num_base_bdevs_discovered": 2, 00:18:13.387 "num_base_bdevs_operational": 4, 00:18:13.387 "base_bdevs_list": [ 00:18:13.387 { 00:18:13.387 "name": null, 00:18:13.387 "uuid": "6d5873b8-f34a-461c-a418-c690afae3a3d", 00:18:13.387 "is_configured": false, 00:18:13.387 "data_offset": 2048, 00:18:13.387 "data_size": 63488 00:18:13.387 }, 00:18:13.387 { 00:18:13.387 "name": null, 00:18:13.387 "uuid": "68d7d8b5-3916-4ea2-9afd-1377fbadd496", 00:18:13.387 "is_configured": false, 00:18:13.387 "data_offset": 2048, 00:18:13.387 "data_size": 63488 00:18:13.387 }, 00:18:13.387 { 00:18:13.387 "name": "BaseBdev3", 00:18:13.387 "uuid": "872732c4-c8d6-431b-8b7d-8fd470473d7b", 00:18:13.387 "is_configured": true, 00:18:13.387 "data_offset": 2048, 00:18:13.387 "data_size": 63488 00:18:13.387 }, 00:18:13.387 { 00:18:13.387 "name": "BaseBdev4", 00:18:13.387 "uuid": "4b8c7632-83f8-4565-b46e-c1497f767650", 00:18:13.387 "is_configured": true, 00:18:13.387 "data_offset": 2048, 00:18:13.387 "data_size": 63488 00:18:13.387 } 00:18:13.387 ] 00:18:13.387 }' 00:18:13.387 08:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:13.387 08:31:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:13.954 08:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:13.954 08:31:26 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:13.954 08:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:18:13.954 08:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:18:14.212 [2024-07-23 08:31:26.481850] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:14.212 08:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:14.212 08:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:14.212 08:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:14.212 08:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:14.212 08:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:14.212 08:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:14.212 08:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:14.212 08:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:14.212 08:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:14.212 08:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:14.212 08:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:14.212 08:31:26 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:14.212 08:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:14.212 "name": "Existed_Raid", 00:18:14.212 "uuid": "88866f6d-feb0-46f2-8a19-213a3ff0bc18", 00:18:14.212 "strip_size_kb": 64, 00:18:14.212 "state": "configuring", 00:18:14.212 "raid_level": "raid0", 00:18:14.212 "superblock": true, 00:18:14.212 "num_base_bdevs": 4, 00:18:14.212 "num_base_bdevs_discovered": 3, 00:18:14.212 "num_base_bdevs_operational": 4, 00:18:14.212 "base_bdevs_list": [ 00:18:14.212 { 00:18:14.212 "name": null, 00:18:14.212 "uuid": "6d5873b8-f34a-461c-a418-c690afae3a3d", 00:18:14.212 "is_configured": false, 00:18:14.212 "data_offset": 2048, 00:18:14.212 "data_size": 63488 00:18:14.212 }, 00:18:14.212 { 00:18:14.212 "name": "BaseBdev2", 00:18:14.213 "uuid": "68d7d8b5-3916-4ea2-9afd-1377fbadd496", 00:18:14.213 "is_configured": true, 00:18:14.213 "data_offset": 2048, 00:18:14.213 "data_size": 63488 00:18:14.213 }, 00:18:14.213 { 00:18:14.213 "name": "BaseBdev3", 00:18:14.213 "uuid": "872732c4-c8d6-431b-8b7d-8fd470473d7b", 00:18:14.213 "is_configured": true, 00:18:14.213 "data_offset": 2048, 00:18:14.213 "data_size": 63488 00:18:14.213 }, 00:18:14.213 { 00:18:14.213 "name": "BaseBdev4", 00:18:14.213 "uuid": "4b8c7632-83f8-4565-b46e-c1497f767650", 00:18:14.213 "is_configured": true, 00:18:14.213 "data_offset": 2048, 00:18:14.213 "data_size": 63488 00:18:14.213 } 00:18:14.213 ] 00:18:14.213 }' 00:18:14.213 08:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:14.213 08:31:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:14.779 08:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:14.779 08:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:15.036 08:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:18:15.036 08:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:15.036 08:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:18:15.036 08:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 6d5873b8-f34a-461c-a418-c690afae3a3d 00:18:15.295 [2024-07-23 08:31:27.694761] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:18:15.295 [2024-07-23 08:31:27.694980] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000037280 00:18:15.295 [2024-07-23 08:31:27.694995] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:15.295 [2024-07-23 08:31:27.695221] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c3a0 00:18:15.295 [2024-07-23 08:31:27.695393] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000037280 00:18:15.295 [2024-07-23 08:31:27.695404] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000037280 00:18:15.295 [2024-07-23 08:31:27.695545] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:15.295 NewBaseBdev 00:18:15.295 08:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:18:15.295 08:31:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 
00:18:15.295 08:31:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:15.295 08:31:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:15.295 08:31:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:15.295 08:31:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:15.295 08:31:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:15.553 08:31:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:18:15.553 [ 00:18:15.553 { 00:18:15.553 "name": "NewBaseBdev", 00:18:15.553 "aliases": [ 00:18:15.553 "6d5873b8-f34a-461c-a418-c690afae3a3d" 00:18:15.553 ], 00:18:15.553 "product_name": "Malloc disk", 00:18:15.553 "block_size": 512, 00:18:15.553 "num_blocks": 65536, 00:18:15.553 "uuid": "6d5873b8-f34a-461c-a418-c690afae3a3d", 00:18:15.553 "assigned_rate_limits": { 00:18:15.553 "rw_ios_per_sec": 0, 00:18:15.553 "rw_mbytes_per_sec": 0, 00:18:15.553 "r_mbytes_per_sec": 0, 00:18:15.553 "w_mbytes_per_sec": 0 00:18:15.553 }, 00:18:15.553 "claimed": true, 00:18:15.553 "claim_type": "exclusive_write", 00:18:15.553 "zoned": false, 00:18:15.553 "supported_io_types": { 00:18:15.553 "read": true, 00:18:15.554 "write": true, 00:18:15.554 "unmap": true, 00:18:15.554 "flush": true, 00:18:15.554 "reset": true, 00:18:15.554 "nvme_admin": false, 00:18:15.554 "nvme_io": false, 00:18:15.554 "nvme_io_md": false, 00:18:15.554 "write_zeroes": true, 00:18:15.554 "zcopy": true, 00:18:15.554 "get_zone_info": false, 00:18:15.554 "zone_management": false, 00:18:15.554 "zone_append": false, 00:18:15.554 "compare": false, 
00:18:15.554 "compare_and_write": false, 00:18:15.554 "abort": true, 00:18:15.554 "seek_hole": false, 00:18:15.554 "seek_data": false, 00:18:15.554 "copy": true, 00:18:15.554 "nvme_iov_md": false 00:18:15.554 }, 00:18:15.554 "memory_domains": [ 00:18:15.554 { 00:18:15.554 "dma_device_id": "system", 00:18:15.554 "dma_device_type": 1 00:18:15.554 }, 00:18:15.554 { 00:18:15.554 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:15.554 "dma_device_type": 2 00:18:15.554 } 00:18:15.554 ], 00:18:15.554 "driver_specific": {} 00:18:15.554 } 00:18:15.554 ] 00:18:15.554 08:31:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:15.554 08:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:18:15.554 08:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:15.554 08:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:15.554 08:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:15.554 08:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:15.554 08:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:15.554 08:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:15.554 08:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:15.554 08:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:15.554 08:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:15.554 08:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:15.554 08:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:15.812 08:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:15.812 "name": "Existed_Raid", 00:18:15.812 "uuid": "88866f6d-feb0-46f2-8a19-213a3ff0bc18", 00:18:15.812 "strip_size_kb": 64, 00:18:15.812 "state": "online", 00:18:15.812 "raid_level": "raid0", 00:18:15.812 "superblock": true, 00:18:15.812 "num_base_bdevs": 4, 00:18:15.812 "num_base_bdevs_discovered": 4, 00:18:15.812 "num_base_bdevs_operational": 4, 00:18:15.812 "base_bdevs_list": [ 00:18:15.812 { 00:18:15.812 "name": "NewBaseBdev", 00:18:15.812 "uuid": "6d5873b8-f34a-461c-a418-c690afae3a3d", 00:18:15.812 "is_configured": true, 00:18:15.812 "data_offset": 2048, 00:18:15.812 "data_size": 63488 00:18:15.812 }, 00:18:15.812 { 00:18:15.812 "name": "BaseBdev2", 00:18:15.812 "uuid": "68d7d8b5-3916-4ea2-9afd-1377fbadd496", 00:18:15.812 "is_configured": true, 00:18:15.812 "data_offset": 2048, 00:18:15.812 "data_size": 63488 00:18:15.812 }, 00:18:15.812 { 00:18:15.812 "name": "BaseBdev3", 00:18:15.812 "uuid": "872732c4-c8d6-431b-8b7d-8fd470473d7b", 00:18:15.812 "is_configured": true, 00:18:15.812 "data_offset": 2048, 00:18:15.812 "data_size": 63488 00:18:15.812 }, 00:18:15.812 { 00:18:15.812 "name": "BaseBdev4", 00:18:15.812 "uuid": "4b8c7632-83f8-4565-b46e-c1497f767650", 00:18:15.812 "is_configured": true, 00:18:15.812 "data_offset": 2048, 00:18:15.812 "data_size": 63488 00:18:15.812 } 00:18:15.812 ] 00:18:15.812 }' 00:18:15.812 08:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:15.812 08:31:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:16.409 08:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:18:16.409 08:31:28 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:16.409 08:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:16.409 08:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:16.409 08:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:16.409 08:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:18:16.409 08:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:16.409 08:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:16.409 [2024-07-23 08:31:28.858379] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:16.409 08:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:16.409 "name": "Existed_Raid", 00:18:16.409 "aliases": [ 00:18:16.409 "88866f6d-feb0-46f2-8a19-213a3ff0bc18" 00:18:16.409 ], 00:18:16.409 "product_name": "Raid Volume", 00:18:16.409 "block_size": 512, 00:18:16.409 "num_blocks": 253952, 00:18:16.409 "uuid": "88866f6d-feb0-46f2-8a19-213a3ff0bc18", 00:18:16.409 "assigned_rate_limits": { 00:18:16.409 "rw_ios_per_sec": 0, 00:18:16.409 "rw_mbytes_per_sec": 0, 00:18:16.409 "r_mbytes_per_sec": 0, 00:18:16.409 "w_mbytes_per_sec": 0 00:18:16.409 }, 00:18:16.409 "claimed": false, 00:18:16.409 "zoned": false, 00:18:16.409 "supported_io_types": { 00:18:16.409 "read": true, 00:18:16.409 "write": true, 00:18:16.409 "unmap": true, 00:18:16.409 "flush": true, 00:18:16.409 "reset": true, 00:18:16.409 "nvme_admin": false, 00:18:16.409 "nvme_io": false, 00:18:16.409 "nvme_io_md": false, 00:18:16.409 "write_zeroes": true, 00:18:16.409 "zcopy": false, 00:18:16.409 
"get_zone_info": false, 00:18:16.409 "zone_management": false, 00:18:16.409 "zone_append": false, 00:18:16.409 "compare": false, 00:18:16.409 "compare_and_write": false, 00:18:16.409 "abort": false, 00:18:16.409 "seek_hole": false, 00:18:16.409 "seek_data": false, 00:18:16.409 "copy": false, 00:18:16.409 "nvme_iov_md": false 00:18:16.409 }, 00:18:16.409 "memory_domains": [ 00:18:16.409 { 00:18:16.409 "dma_device_id": "system", 00:18:16.409 "dma_device_type": 1 00:18:16.409 }, 00:18:16.409 { 00:18:16.409 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:16.409 "dma_device_type": 2 00:18:16.409 }, 00:18:16.409 { 00:18:16.409 "dma_device_id": "system", 00:18:16.409 "dma_device_type": 1 00:18:16.409 }, 00:18:16.409 { 00:18:16.409 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:16.409 "dma_device_type": 2 00:18:16.409 }, 00:18:16.409 { 00:18:16.409 "dma_device_id": "system", 00:18:16.409 "dma_device_type": 1 00:18:16.409 }, 00:18:16.409 { 00:18:16.409 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:16.409 "dma_device_type": 2 00:18:16.409 }, 00:18:16.409 { 00:18:16.409 "dma_device_id": "system", 00:18:16.409 "dma_device_type": 1 00:18:16.409 }, 00:18:16.409 { 00:18:16.409 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:16.409 "dma_device_type": 2 00:18:16.409 } 00:18:16.409 ], 00:18:16.409 "driver_specific": { 00:18:16.409 "raid": { 00:18:16.409 "uuid": "88866f6d-feb0-46f2-8a19-213a3ff0bc18", 00:18:16.410 "strip_size_kb": 64, 00:18:16.410 "state": "online", 00:18:16.410 "raid_level": "raid0", 00:18:16.410 "superblock": true, 00:18:16.410 "num_base_bdevs": 4, 00:18:16.410 "num_base_bdevs_discovered": 4, 00:18:16.410 "num_base_bdevs_operational": 4, 00:18:16.410 "base_bdevs_list": [ 00:18:16.410 { 00:18:16.410 "name": "NewBaseBdev", 00:18:16.410 "uuid": "6d5873b8-f34a-461c-a418-c690afae3a3d", 00:18:16.410 "is_configured": true, 00:18:16.410 "data_offset": 2048, 00:18:16.410 "data_size": 63488 00:18:16.410 }, 00:18:16.410 { 00:18:16.410 "name": "BaseBdev2", 00:18:16.410 
"uuid": "68d7d8b5-3916-4ea2-9afd-1377fbadd496", 00:18:16.410 "is_configured": true, 00:18:16.410 "data_offset": 2048, 00:18:16.410 "data_size": 63488 00:18:16.410 }, 00:18:16.410 { 00:18:16.410 "name": "BaseBdev3", 00:18:16.410 "uuid": "872732c4-c8d6-431b-8b7d-8fd470473d7b", 00:18:16.410 "is_configured": true, 00:18:16.410 "data_offset": 2048, 00:18:16.410 "data_size": 63488 00:18:16.410 }, 00:18:16.410 { 00:18:16.410 "name": "BaseBdev4", 00:18:16.410 "uuid": "4b8c7632-83f8-4565-b46e-c1497f767650", 00:18:16.410 "is_configured": true, 00:18:16.410 "data_offset": 2048, 00:18:16.410 "data_size": 63488 00:18:16.410 } 00:18:16.410 ] 00:18:16.410 } 00:18:16.410 } 00:18:16.410 }' 00:18:16.410 08:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:16.410 08:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:18:16.410 BaseBdev2 00:18:16.410 BaseBdev3 00:18:16.410 BaseBdev4' 00:18:16.410 08:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:16.410 08:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:18:16.410 08:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:16.668 08:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:16.668 "name": "NewBaseBdev", 00:18:16.668 "aliases": [ 00:18:16.668 "6d5873b8-f34a-461c-a418-c690afae3a3d" 00:18:16.668 ], 00:18:16.668 "product_name": "Malloc disk", 00:18:16.668 "block_size": 512, 00:18:16.668 "num_blocks": 65536, 00:18:16.668 "uuid": "6d5873b8-f34a-461c-a418-c690afae3a3d", 00:18:16.668 "assigned_rate_limits": { 00:18:16.668 "rw_ios_per_sec": 0, 00:18:16.668 "rw_mbytes_per_sec": 0, 
00:18:16.668 "r_mbytes_per_sec": 0, 00:18:16.668 "w_mbytes_per_sec": 0 00:18:16.668 }, 00:18:16.668 "claimed": true, 00:18:16.668 "claim_type": "exclusive_write", 00:18:16.668 "zoned": false, 00:18:16.668 "supported_io_types": { 00:18:16.668 "read": true, 00:18:16.668 "write": true, 00:18:16.668 "unmap": true, 00:18:16.668 "flush": true, 00:18:16.668 "reset": true, 00:18:16.668 "nvme_admin": false, 00:18:16.668 "nvme_io": false, 00:18:16.668 "nvme_io_md": false, 00:18:16.668 "write_zeroes": true, 00:18:16.668 "zcopy": true, 00:18:16.668 "get_zone_info": false, 00:18:16.668 "zone_management": false, 00:18:16.668 "zone_append": false, 00:18:16.668 "compare": false, 00:18:16.668 "compare_and_write": false, 00:18:16.668 "abort": true, 00:18:16.668 "seek_hole": false, 00:18:16.668 "seek_data": false, 00:18:16.668 "copy": true, 00:18:16.668 "nvme_iov_md": false 00:18:16.668 }, 00:18:16.668 "memory_domains": [ 00:18:16.668 { 00:18:16.668 "dma_device_id": "system", 00:18:16.668 "dma_device_type": 1 00:18:16.668 }, 00:18:16.668 { 00:18:16.668 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:16.668 "dma_device_type": 2 00:18:16.668 } 00:18:16.668 ], 00:18:16.668 "driver_specific": {} 00:18:16.668 }' 00:18:16.668 08:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:16.668 08:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:16.668 08:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:16.668 08:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:16.926 08:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:16.926 08:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:16.926 08:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:16.926 08:31:29 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:16.926 08:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:16.926 08:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:16.926 08:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:16.926 08:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:16.926 08:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:16.926 08:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:16.926 08:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:17.183 08:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:17.183 "name": "BaseBdev2", 00:18:17.183 "aliases": [ 00:18:17.183 "68d7d8b5-3916-4ea2-9afd-1377fbadd496" 00:18:17.183 ], 00:18:17.183 "product_name": "Malloc disk", 00:18:17.183 "block_size": 512, 00:18:17.184 "num_blocks": 65536, 00:18:17.184 "uuid": "68d7d8b5-3916-4ea2-9afd-1377fbadd496", 00:18:17.184 "assigned_rate_limits": { 00:18:17.184 "rw_ios_per_sec": 0, 00:18:17.184 "rw_mbytes_per_sec": 0, 00:18:17.184 "r_mbytes_per_sec": 0, 00:18:17.184 "w_mbytes_per_sec": 0 00:18:17.184 }, 00:18:17.184 "claimed": true, 00:18:17.184 "claim_type": "exclusive_write", 00:18:17.184 "zoned": false, 00:18:17.184 "supported_io_types": { 00:18:17.184 "read": true, 00:18:17.184 "write": true, 00:18:17.184 "unmap": true, 00:18:17.184 "flush": true, 00:18:17.184 "reset": true, 00:18:17.184 "nvme_admin": false, 00:18:17.184 "nvme_io": false, 00:18:17.184 "nvme_io_md": false, 00:18:17.184 "write_zeroes": true, 00:18:17.184 "zcopy": true, 00:18:17.184 
"get_zone_info": false, 00:18:17.184 "zone_management": false, 00:18:17.184 "zone_append": false, 00:18:17.184 "compare": false, 00:18:17.184 "compare_and_write": false, 00:18:17.184 "abort": true, 00:18:17.184 "seek_hole": false, 00:18:17.184 "seek_data": false, 00:18:17.184 "copy": true, 00:18:17.184 "nvme_iov_md": false 00:18:17.184 }, 00:18:17.184 "memory_domains": [ 00:18:17.184 { 00:18:17.184 "dma_device_id": "system", 00:18:17.184 "dma_device_type": 1 00:18:17.184 }, 00:18:17.184 { 00:18:17.184 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:17.184 "dma_device_type": 2 00:18:17.184 } 00:18:17.184 ], 00:18:17.184 "driver_specific": {} 00:18:17.184 }' 00:18:17.184 08:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:17.184 08:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:17.184 08:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:17.184 08:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:17.184 08:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:17.442 08:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:17.442 08:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:17.442 08:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:17.442 08:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:17.442 08:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:17.442 08:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:17.442 08:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:17.442 08:31:29 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:17.442 08:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:17.442 08:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:17.701 08:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:17.701 "name": "BaseBdev3", 00:18:17.701 "aliases": [ 00:18:17.701 "872732c4-c8d6-431b-8b7d-8fd470473d7b" 00:18:17.701 ], 00:18:17.701 "product_name": "Malloc disk", 00:18:17.701 "block_size": 512, 00:18:17.701 "num_blocks": 65536, 00:18:17.701 "uuid": "872732c4-c8d6-431b-8b7d-8fd470473d7b", 00:18:17.701 "assigned_rate_limits": { 00:18:17.701 "rw_ios_per_sec": 0, 00:18:17.701 "rw_mbytes_per_sec": 0, 00:18:17.701 "r_mbytes_per_sec": 0, 00:18:17.701 "w_mbytes_per_sec": 0 00:18:17.701 }, 00:18:17.701 "claimed": true, 00:18:17.701 "claim_type": "exclusive_write", 00:18:17.701 "zoned": false, 00:18:17.701 "supported_io_types": { 00:18:17.701 "read": true, 00:18:17.701 "write": true, 00:18:17.701 "unmap": true, 00:18:17.701 "flush": true, 00:18:17.701 "reset": true, 00:18:17.701 "nvme_admin": false, 00:18:17.701 "nvme_io": false, 00:18:17.701 "nvme_io_md": false, 00:18:17.701 "write_zeroes": true, 00:18:17.701 "zcopy": true, 00:18:17.701 "get_zone_info": false, 00:18:17.701 "zone_management": false, 00:18:17.701 "zone_append": false, 00:18:17.701 "compare": false, 00:18:17.701 "compare_and_write": false, 00:18:17.701 "abort": true, 00:18:17.701 "seek_hole": false, 00:18:17.701 "seek_data": false, 00:18:17.701 "copy": true, 00:18:17.701 "nvme_iov_md": false 00:18:17.701 }, 00:18:17.701 "memory_domains": [ 00:18:17.701 { 00:18:17.701 "dma_device_id": "system", 00:18:17.701 "dma_device_type": 1 00:18:17.701 }, 00:18:17.701 { 00:18:17.701 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:17.701 
"dma_device_type": 2 00:18:17.701 } 00:18:17.701 ], 00:18:17.701 "driver_specific": {} 00:18:17.701 }' 00:18:17.701 08:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:17.701 08:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:17.701 08:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:17.701 08:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:17.701 08:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:17.701 08:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:17.959 08:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:17.959 08:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:17.959 08:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:17.959 08:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:17.959 08:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:17.959 08:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:17.959 08:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:17.959 08:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:17.959 08:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:18.217 08:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:18.217 "name": "BaseBdev4", 00:18:18.217 "aliases": [ 00:18:18.217 
"4b8c7632-83f8-4565-b46e-c1497f767650" 00:18:18.217 ], 00:18:18.217 "product_name": "Malloc disk", 00:18:18.217 "block_size": 512, 00:18:18.217 "num_blocks": 65536, 00:18:18.217 "uuid": "4b8c7632-83f8-4565-b46e-c1497f767650", 00:18:18.217 "assigned_rate_limits": { 00:18:18.217 "rw_ios_per_sec": 0, 00:18:18.217 "rw_mbytes_per_sec": 0, 00:18:18.217 "r_mbytes_per_sec": 0, 00:18:18.217 "w_mbytes_per_sec": 0 00:18:18.217 }, 00:18:18.217 "claimed": true, 00:18:18.217 "claim_type": "exclusive_write", 00:18:18.217 "zoned": false, 00:18:18.217 "supported_io_types": { 00:18:18.217 "read": true, 00:18:18.217 "write": true, 00:18:18.217 "unmap": true, 00:18:18.217 "flush": true, 00:18:18.217 "reset": true, 00:18:18.217 "nvme_admin": false, 00:18:18.217 "nvme_io": false, 00:18:18.217 "nvme_io_md": false, 00:18:18.217 "write_zeroes": true, 00:18:18.217 "zcopy": true, 00:18:18.217 "get_zone_info": false, 00:18:18.217 "zone_management": false, 00:18:18.217 "zone_append": false, 00:18:18.217 "compare": false, 00:18:18.217 "compare_and_write": false, 00:18:18.217 "abort": true, 00:18:18.217 "seek_hole": false, 00:18:18.217 "seek_data": false, 00:18:18.217 "copy": true, 00:18:18.217 "nvme_iov_md": false 00:18:18.217 }, 00:18:18.217 "memory_domains": [ 00:18:18.217 { 00:18:18.217 "dma_device_id": "system", 00:18:18.217 "dma_device_type": 1 00:18:18.217 }, 00:18:18.217 { 00:18:18.217 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:18.217 "dma_device_type": 2 00:18:18.217 } 00:18:18.217 ], 00:18:18.217 "driver_specific": {} 00:18:18.217 }' 00:18:18.217 08:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:18.217 08:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:18.217 08:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:18.217 08:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:18.217 08:31:30 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:18.217 08:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:18.217 08:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:18.474 08:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:18.475 08:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:18.475 08:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:18.475 08:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:18.475 08:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:18.475 08:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:18.733 [2024-07-23 08:31:31.007780] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:18.733 [2024-07-23 08:31:31.007810] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:18.733 [2024-07-23 08:31:31.007885] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:18.733 [2024-07-23 08:31:31.007950] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:18.733 [2024-07-23 08:31:31.007960] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000037280 name Existed_Raid, state offline 00:18:18.733 08:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1476040 00:18:18.733 08:31:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1476040 ']' 00:18:18.733 08:31:31 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1476040 00:18:18.733 08:31:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:18:18.733 08:31:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:18.733 08:31:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1476040 00:18:18.733 08:31:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:18.733 08:31:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:18.733 08:31:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1476040' 00:18:18.733 killing process with pid 1476040 00:18:18.733 08:31:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1476040 00:18:18.733 [2024-07-23 08:31:31.062743] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:18.733 08:31:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1476040 00:18:18.992 [2024-07-23 08:31:31.431052] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:20.367 08:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:18:20.367 00:18:20.367 real 0m26.619s 00:18:20.367 user 0m47.496s 00:18:20.367 sys 0m3.992s 00:18:20.367 08:31:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:20.367 08:31:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:20.367 ************************************ 00:18:20.367 END TEST raid_state_function_test_sb 00:18:20.367 ************************************ 00:18:20.367 08:31:32 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:20.367 08:31:32 bdev_raid -- 
bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 4 00:18:20.367 08:31:32 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:18:20.367 08:31:32 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:20.367 08:31:32 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:20.367 ************************************ 00:18:20.367 START TEST raid_superblock_test 00:18:20.367 ************************************ 00:18:20.367 08:31:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 4 00:18:20.367 08:31:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:18:20.367 08:31:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:18:20.367 08:31:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:18:20.367 08:31:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:18:20.367 08:31:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:18:20.367 08:31:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:18:20.367 08:31:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:18:20.367 08:31:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:18:20.367 08:31:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:18:20.367 08:31:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:18:20.367 08:31:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:18:20.367 08:31:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:18:20.367 08:31:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:18:20.367 08:31:32 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:18:20.367 08:31:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:18:20.367 08:31:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:18:20.367 08:31:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1481726 00:18:20.367 08:31:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1481726 /var/tmp/spdk-raid.sock 00:18:20.367 08:31:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:18:20.367 08:31:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1481726 ']' 00:18:20.367 08:31:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:20.367 08:31:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:20.367 08:31:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:20.367 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:20.367 08:31:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:20.367 08:31:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:20.367 [2024-07-23 08:31:32.832280] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:18:20.367 [2024-07-23 08:31:32.832381] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1481726 ] 00:18:20.626 [2024-07-23 08:31:32.954893] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:20.887 [2024-07-23 08:31:33.160246] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:20.887 [2024-07-23 08:31:33.403659] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:20.887 [2024-07-23 08:31:33.403691] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:21.147 08:31:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:21.147 08:31:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:18:21.147 08:31:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:18:21.147 08:31:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:21.147 08:31:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:18:21.147 08:31:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:18:21.147 08:31:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:18:21.147 08:31:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:21.147 08:31:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:21.147 08:31:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:21.147 08:31:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:18:21.405 malloc1 00:18:21.405 08:31:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:21.663 [2024-07-23 08:31:33.987013] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:21.663 [2024-07-23 08:31:33.987063] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:21.663 [2024-07-23 08:31:33.987086] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034880 00:18:21.663 [2024-07-23 08:31:33.987099] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:21.663 [2024-07-23 08:31:33.989102] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:21.663 [2024-07-23 08:31:33.989130] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:21.663 pt1 00:18:21.663 08:31:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:21.663 08:31:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:21.663 08:31:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:18:21.663 08:31:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:18:21.663 08:31:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:18:21.663 08:31:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:21.663 08:31:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:21.663 08:31:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:21.663 08:31:34 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:18:21.921 malloc2 00:18:21.921 08:31:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:21.921 [2024-07-23 08:31:34.388798] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:21.921 [2024-07-23 08:31:34.388846] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:21.921 [2024-07-23 08:31:34.388879] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035480 00:18:21.921 [2024-07-23 08:31:34.388889] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:21.921 [2024-07-23 08:31:34.390818] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:21.921 [2024-07-23 08:31:34.390843] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:21.921 pt2 00:18:21.921 08:31:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:21.921 08:31:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:21.921 08:31:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:18:21.921 08:31:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:18:21.921 08:31:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:18:21.921 08:31:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:21.921 08:31:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:21.921 
08:31:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:21.921 08:31:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:18:22.180 malloc3 00:18:22.180 08:31:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:22.438 [2024-07-23 08:31:34.748272] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:22.438 [2024-07-23 08:31:34.748325] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:22.438 [2024-07-23 08:31:34.748346] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036080 00:18:22.438 [2024-07-23 08:31:34.748355] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:22.438 [2024-07-23 08:31:34.750509] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:22.438 [2024-07-23 08:31:34.750537] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:22.438 pt3 00:18:22.438 08:31:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:22.438 08:31:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:22.438 08:31:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:18:22.438 08:31:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:18:22.438 08:31:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:18:22.438 08:31:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 
00:18:22.438 08:31:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:22.438 08:31:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:22.438 08:31:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:18:22.697 malloc4 00:18:22.697 08:31:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:18:22.697 [2024-07-23 08:31:35.139979] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:18:22.697 [2024-07-23 08:31:35.140033] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:22.697 [2024-07-23 08:31:35.140052] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036c80 00:18:22.697 [2024-07-23 08:31:35.140061] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:22.697 [2024-07-23 08:31:35.142034] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:22.697 [2024-07-23 08:31:35.142062] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:18:22.697 pt4 00:18:22.697 08:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:22.697 08:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:22.697 08:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:18:22.955 [2024-07-23 08:31:35.296445] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev pt1 is claimed 00:18:22.955 [2024-07-23 08:31:35.298030] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:22.955 [2024-07-23 08:31:35.298092] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:22.955 [2024-07-23 08:31:35.298131] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:18:22.955 [2024-07-23 08:31:35.298320] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000037280 00:18:22.955 [2024-07-23 08:31:35.298331] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:22.955 [2024-07-23 08:31:35.298581] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bf90 00:18:22.955 [2024-07-23 08:31:35.298778] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000037280 00:18:22.955 [2024-07-23 08:31:35.298790] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000037280 00:18:22.955 [2024-07-23 08:31:35.298930] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:22.955 08:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:18:22.955 08:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:22.955 08:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:22.955 08:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:22.955 08:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:22.955 08:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:22.955 08:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:22.955 08:31:35 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:22.955 08:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:22.955 08:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:22.955 08:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:22.955 08:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:23.214 08:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:23.214 "name": "raid_bdev1", 00:18:23.214 "uuid": "566e07bf-6339-497e-bf2a-b8893f81d29b", 00:18:23.214 "strip_size_kb": 64, 00:18:23.214 "state": "online", 00:18:23.214 "raid_level": "raid0", 00:18:23.214 "superblock": true, 00:18:23.214 "num_base_bdevs": 4, 00:18:23.214 "num_base_bdevs_discovered": 4, 00:18:23.214 "num_base_bdevs_operational": 4, 00:18:23.214 "base_bdevs_list": [ 00:18:23.214 { 00:18:23.214 "name": "pt1", 00:18:23.214 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:23.214 "is_configured": true, 00:18:23.214 "data_offset": 2048, 00:18:23.214 "data_size": 63488 00:18:23.214 }, 00:18:23.214 { 00:18:23.214 "name": "pt2", 00:18:23.214 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:23.214 "is_configured": true, 00:18:23.214 "data_offset": 2048, 00:18:23.214 "data_size": 63488 00:18:23.214 }, 00:18:23.214 { 00:18:23.214 "name": "pt3", 00:18:23.214 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:23.214 "is_configured": true, 00:18:23.214 "data_offset": 2048, 00:18:23.214 "data_size": 63488 00:18:23.214 }, 00:18:23.214 { 00:18:23.214 "name": "pt4", 00:18:23.214 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:23.214 "is_configured": true, 00:18:23.214 "data_offset": 2048, 00:18:23.214 "data_size": 63488 00:18:23.214 } 00:18:23.214 ] 00:18:23.214 }' 
00:18:23.214 08:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:23.214 08:31:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:23.473 08:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:18:23.473 08:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:18:23.473 08:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:23.473 08:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:23.473 08:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:23.473 08:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:23.473 08:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:23.473 08:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:23.731 [2024-07-23 08:31:36.138913] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:23.731 08:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:23.731 "name": "raid_bdev1", 00:18:23.731 "aliases": [ 00:18:23.731 "566e07bf-6339-497e-bf2a-b8893f81d29b" 00:18:23.731 ], 00:18:23.731 "product_name": "Raid Volume", 00:18:23.731 "block_size": 512, 00:18:23.731 "num_blocks": 253952, 00:18:23.731 "uuid": "566e07bf-6339-497e-bf2a-b8893f81d29b", 00:18:23.731 "assigned_rate_limits": { 00:18:23.731 "rw_ios_per_sec": 0, 00:18:23.731 "rw_mbytes_per_sec": 0, 00:18:23.731 "r_mbytes_per_sec": 0, 00:18:23.731 "w_mbytes_per_sec": 0 00:18:23.731 }, 00:18:23.731 "claimed": false, 00:18:23.731 "zoned": false, 00:18:23.731 "supported_io_types": { 00:18:23.731 "read": true, 00:18:23.731 "write": 
true, 00:18:23.731 "unmap": true, 00:18:23.731 "flush": true, 00:18:23.731 "reset": true, 00:18:23.731 "nvme_admin": false, 00:18:23.731 "nvme_io": false, 00:18:23.731 "nvme_io_md": false, 00:18:23.731 "write_zeroes": true, 00:18:23.731 "zcopy": false, 00:18:23.731 "get_zone_info": false, 00:18:23.731 "zone_management": false, 00:18:23.731 "zone_append": false, 00:18:23.731 "compare": false, 00:18:23.731 "compare_and_write": false, 00:18:23.731 "abort": false, 00:18:23.731 "seek_hole": false, 00:18:23.731 "seek_data": false, 00:18:23.731 "copy": false, 00:18:23.731 "nvme_iov_md": false 00:18:23.731 }, 00:18:23.731 "memory_domains": [ 00:18:23.731 { 00:18:23.731 "dma_device_id": "system", 00:18:23.731 "dma_device_type": 1 00:18:23.731 }, 00:18:23.731 { 00:18:23.731 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:23.731 "dma_device_type": 2 00:18:23.731 }, 00:18:23.731 { 00:18:23.731 "dma_device_id": "system", 00:18:23.731 "dma_device_type": 1 00:18:23.731 }, 00:18:23.731 { 00:18:23.731 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:23.731 "dma_device_type": 2 00:18:23.731 }, 00:18:23.731 { 00:18:23.731 "dma_device_id": "system", 00:18:23.731 "dma_device_type": 1 00:18:23.731 }, 00:18:23.731 { 00:18:23.731 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:23.731 "dma_device_type": 2 00:18:23.731 }, 00:18:23.731 { 00:18:23.731 "dma_device_id": "system", 00:18:23.731 "dma_device_type": 1 00:18:23.731 }, 00:18:23.731 { 00:18:23.731 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:23.731 "dma_device_type": 2 00:18:23.731 } 00:18:23.731 ], 00:18:23.731 "driver_specific": { 00:18:23.731 "raid": { 00:18:23.731 "uuid": "566e07bf-6339-497e-bf2a-b8893f81d29b", 00:18:23.731 "strip_size_kb": 64, 00:18:23.731 "state": "online", 00:18:23.731 "raid_level": "raid0", 00:18:23.731 "superblock": true, 00:18:23.731 "num_base_bdevs": 4, 00:18:23.731 "num_base_bdevs_discovered": 4, 00:18:23.731 "num_base_bdevs_operational": 4, 00:18:23.731 "base_bdevs_list": [ 00:18:23.731 { 00:18:23.731 
"name": "pt1", 00:18:23.731 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:23.731 "is_configured": true, 00:18:23.731 "data_offset": 2048, 00:18:23.731 "data_size": 63488 00:18:23.731 }, 00:18:23.731 { 00:18:23.731 "name": "pt2", 00:18:23.731 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:23.731 "is_configured": true, 00:18:23.731 "data_offset": 2048, 00:18:23.731 "data_size": 63488 00:18:23.731 }, 00:18:23.731 { 00:18:23.731 "name": "pt3", 00:18:23.731 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:23.731 "is_configured": true, 00:18:23.731 "data_offset": 2048, 00:18:23.731 "data_size": 63488 00:18:23.731 }, 00:18:23.731 { 00:18:23.731 "name": "pt4", 00:18:23.731 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:23.731 "is_configured": true, 00:18:23.731 "data_offset": 2048, 00:18:23.731 "data_size": 63488 00:18:23.731 } 00:18:23.731 ] 00:18:23.731 } 00:18:23.731 } 00:18:23.731 }' 00:18:23.732 08:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:23.732 08:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:18:23.732 pt2 00:18:23.732 pt3 00:18:23.732 pt4' 00:18:23.732 08:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:23.732 08:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:18:23.732 08:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:23.990 08:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:23.990 "name": "pt1", 00:18:23.990 "aliases": [ 00:18:23.990 "00000000-0000-0000-0000-000000000001" 00:18:23.990 ], 00:18:23.990 "product_name": "passthru", 00:18:23.990 "block_size": 512, 00:18:23.990 "num_blocks": 65536, 00:18:23.990 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:18:23.990 "assigned_rate_limits": { 00:18:23.990 "rw_ios_per_sec": 0, 00:18:23.990 "rw_mbytes_per_sec": 0, 00:18:23.990 "r_mbytes_per_sec": 0, 00:18:23.990 "w_mbytes_per_sec": 0 00:18:23.990 }, 00:18:23.990 "claimed": true, 00:18:23.990 "claim_type": "exclusive_write", 00:18:23.990 "zoned": false, 00:18:23.990 "supported_io_types": { 00:18:23.990 "read": true, 00:18:23.990 "write": true, 00:18:23.990 "unmap": true, 00:18:23.990 "flush": true, 00:18:23.990 "reset": true, 00:18:23.990 "nvme_admin": false, 00:18:23.990 "nvme_io": false, 00:18:23.990 "nvme_io_md": false, 00:18:23.990 "write_zeroes": true, 00:18:23.990 "zcopy": true, 00:18:23.990 "get_zone_info": false, 00:18:23.990 "zone_management": false, 00:18:23.990 "zone_append": false, 00:18:23.990 "compare": false, 00:18:23.990 "compare_and_write": false, 00:18:23.990 "abort": true, 00:18:23.990 "seek_hole": false, 00:18:23.990 "seek_data": false, 00:18:23.990 "copy": true, 00:18:23.990 "nvme_iov_md": false 00:18:23.990 }, 00:18:23.990 "memory_domains": [ 00:18:23.990 { 00:18:23.990 "dma_device_id": "system", 00:18:23.990 "dma_device_type": 1 00:18:23.991 }, 00:18:23.991 { 00:18:23.991 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:23.991 "dma_device_type": 2 00:18:23.991 } 00:18:23.991 ], 00:18:23.991 "driver_specific": { 00:18:23.991 "passthru": { 00:18:23.991 "name": "pt1", 00:18:23.991 "base_bdev_name": "malloc1" 00:18:23.991 } 00:18:23.991 } 00:18:23.991 }' 00:18:23.991 08:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:23.991 08:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:23.991 08:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:23.991 08:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:23.991 08:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:24.249 08:31:36 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:24.249 08:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:24.249 08:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:24.249 08:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:24.249 08:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:24.249 08:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:24.249 08:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:24.249 08:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:24.249 08:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:18:24.249 08:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:24.507 08:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:24.507 "name": "pt2", 00:18:24.507 "aliases": [ 00:18:24.507 "00000000-0000-0000-0000-000000000002" 00:18:24.507 ], 00:18:24.507 "product_name": "passthru", 00:18:24.507 "block_size": 512, 00:18:24.507 "num_blocks": 65536, 00:18:24.507 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:24.507 "assigned_rate_limits": { 00:18:24.507 "rw_ios_per_sec": 0, 00:18:24.507 "rw_mbytes_per_sec": 0, 00:18:24.507 "r_mbytes_per_sec": 0, 00:18:24.507 "w_mbytes_per_sec": 0 00:18:24.507 }, 00:18:24.507 "claimed": true, 00:18:24.507 "claim_type": "exclusive_write", 00:18:24.507 "zoned": false, 00:18:24.507 "supported_io_types": { 00:18:24.507 "read": true, 00:18:24.507 "write": true, 00:18:24.507 "unmap": true, 00:18:24.507 "flush": true, 00:18:24.507 "reset": true, 00:18:24.507 "nvme_admin": false, 00:18:24.507 
"nvme_io": false, 00:18:24.507 "nvme_io_md": false, 00:18:24.507 "write_zeroes": true, 00:18:24.507 "zcopy": true, 00:18:24.507 "get_zone_info": false, 00:18:24.507 "zone_management": false, 00:18:24.507 "zone_append": false, 00:18:24.507 "compare": false, 00:18:24.507 "compare_and_write": false, 00:18:24.507 "abort": true, 00:18:24.507 "seek_hole": false, 00:18:24.507 "seek_data": false, 00:18:24.507 "copy": true, 00:18:24.507 "nvme_iov_md": false 00:18:24.507 }, 00:18:24.507 "memory_domains": [ 00:18:24.507 { 00:18:24.507 "dma_device_id": "system", 00:18:24.507 "dma_device_type": 1 00:18:24.507 }, 00:18:24.508 { 00:18:24.508 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:24.508 "dma_device_type": 2 00:18:24.508 } 00:18:24.508 ], 00:18:24.508 "driver_specific": { 00:18:24.508 "passthru": { 00:18:24.508 "name": "pt2", 00:18:24.508 "base_bdev_name": "malloc2" 00:18:24.508 } 00:18:24.508 } 00:18:24.508 }' 00:18:24.508 08:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:24.508 08:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:24.508 08:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:24.508 08:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:24.508 08:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:24.508 08:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:24.508 08:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:24.766 08:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:24.766 08:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:24.766 08:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:24.766 08:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:18:24.766 08:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:24.766 08:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:24.766 08:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:18:24.766 08:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:25.024 08:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:25.024 "name": "pt3", 00:18:25.024 "aliases": [ 00:18:25.024 "00000000-0000-0000-0000-000000000003" 00:18:25.024 ], 00:18:25.024 "product_name": "passthru", 00:18:25.024 "block_size": 512, 00:18:25.024 "num_blocks": 65536, 00:18:25.024 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:25.024 "assigned_rate_limits": { 00:18:25.024 "rw_ios_per_sec": 0, 00:18:25.024 "rw_mbytes_per_sec": 0, 00:18:25.024 "r_mbytes_per_sec": 0, 00:18:25.024 "w_mbytes_per_sec": 0 00:18:25.024 }, 00:18:25.024 "claimed": true, 00:18:25.024 "claim_type": "exclusive_write", 00:18:25.024 "zoned": false, 00:18:25.024 "supported_io_types": { 00:18:25.024 "read": true, 00:18:25.024 "write": true, 00:18:25.024 "unmap": true, 00:18:25.024 "flush": true, 00:18:25.024 "reset": true, 00:18:25.024 "nvme_admin": false, 00:18:25.024 "nvme_io": false, 00:18:25.024 "nvme_io_md": false, 00:18:25.024 "write_zeroes": true, 00:18:25.024 "zcopy": true, 00:18:25.024 "get_zone_info": false, 00:18:25.024 "zone_management": false, 00:18:25.024 "zone_append": false, 00:18:25.024 "compare": false, 00:18:25.024 "compare_and_write": false, 00:18:25.024 "abort": true, 00:18:25.024 "seek_hole": false, 00:18:25.024 "seek_data": false, 00:18:25.024 "copy": true, 00:18:25.024 "nvme_iov_md": false 00:18:25.024 }, 00:18:25.024 "memory_domains": [ 00:18:25.024 { 00:18:25.024 "dma_device_id": "system", 00:18:25.024 
"dma_device_type": 1 00:18:25.024 }, 00:18:25.024 { 00:18:25.024 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:25.024 "dma_device_type": 2 00:18:25.024 } 00:18:25.024 ], 00:18:25.024 "driver_specific": { 00:18:25.024 "passthru": { 00:18:25.024 "name": "pt3", 00:18:25.024 "base_bdev_name": "malloc3" 00:18:25.024 } 00:18:25.024 } 00:18:25.024 }' 00:18:25.024 08:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:25.024 08:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:25.024 08:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:25.024 08:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:25.024 08:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:25.024 08:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:25.024 08:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:25.024 08:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:25.283 08:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:25.283 08:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:25.283 08:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:25.283 08:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:25.283 08:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:25.283 08:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:18:25.283 08:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:25.283 08:31:37 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:25.283 "name": "pt4", 00:18:25.283 "aliases": [ 00:18:25.283 "00000000-0000-0000-0000-000000000004" 00:18:25.283 ], 00:18:25.283 "product_name": "passthru", 00:18:25.283 "block_size": 512, 00:18:25.283 "num_blocks": 65536, 00:18:25.283 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:25.283 "assigned_rate_limits": { 00:18:25.283 "rw_ios_per_sec": 0, 00:18:25.283 "rw_mbytes_per_sec": 0, 00:18:25.283 "r_mbytes_per_sec": 0, 00:18:25.283 "w_mbytes_per_sec": 0 00:18:25.283 }, 00:18:25.283 "claimed": true, 00:18:25.283 "claim_type": "exclusive_write", 00:18:25.283 "zoned": false, 00:18:25.283 "supported_io_types": { 00:18:25.283 "read": true, 00:18:25.283 "write": true, 00:18:25.283 "unmap": true, 00:18:25.283 "flush": true, 00:18:25.283 "reset": true, 00:18:25.283 "nvme_admin": false, 00:18:25.283 "nvme_io": false, 00:18:25.283 "nvme_io_md": false, 00:18:25.283 "write_zeroes": true, 00:18:25.283 "zcopy": true, 00:18:25.283 "get_zone_info": false, 00:18:25.283 "zone_management": false, 00:18:25.283 "zone_append": false, 00:18:25.283 "compare": false, 00:18:25.283 "compare_and_write": false, 00:18:25.283 "abort": true, 00:18:25.283 "seek_hole": false, 00:18:25.283 "seek_data": false, 00:18:25.283 "copy": true, 00:18:25.283 "nvme_iov_md": false 00:18:25.283 }, 00:18:25.283 "memory_domains": [ 00:18:25.283 { 00:18:25.283 "dma_device_id": "system", 00:18:25.283 "dma_device_type": 1 00:18:25.283 }, 00:18:25.283 { 00:18:25.283 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:25.283 "dma_device_type": 2 00:18:25.283 } 00:18:25.283 ], 00:18:25.283 "driver_specific": { 00:18:25.283 "passthru": { 00:18:25.283 "name": "pt4", 00:18:25.283 "base_bdev_name": "malloc4" 00:18:25.283 } 00:18:25.283 } 00:18:25.283 }' 00:18:25.283 08:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:25.542 08:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:25.542 08:31:37 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:25.542 08:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:25.542 08:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:25.542 08:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:25.542 08:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:25.542 08:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:25.542 08:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:25.542 08:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:25.800 08:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:25.800 08:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:25.800 08:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:25.800 08:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:18:25.800 [2024-07-23 08:31:38.276561] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:25.800 08:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=566e07bf-6339-497e-bf2a-b8893f81d29b 00:18:25.800 08:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 566e07bf-6339-497e-bf2a-b8893f81d29b ']' 00:18:25.800 08:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:26.058 [2024-07-23 08:31:38.444686] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:26.058 
[2024-07-23 08:31:38.444713] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:26.058 [2024-07-23 08:31:38.444782] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:26.058 [2024-07-23 08:31:38.444849] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:26.058 [2024-07-23 08:31:38.444861] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000037280 name raid_bdev1, state offline 00:18:26.058 08:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:26.058 08:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:18:26.316 08:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:18:26.316 08:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:18:26.316 08:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:26.316 08:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:18:26.316 08:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:26.316 08:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:26.575 08:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:26.575 08:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:18:26.833 08:31:39 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:26.833 08:31:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:18:26.833 08:31:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:18:26.833 08:31:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:18:27.092 08:31:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:18:27.092 08:31:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:27.092 08:31:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:18:27.092 08:31:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:27.092 08:31:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:27.092 08:31:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:27.092 08:31:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:27.092 08:31:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:27.092 08:31:39 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:27.092 08:31:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:27.092 08:31:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:27.092 08:31:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:18:27.092 08:31:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:27.092 [2024-07-23 08:31:39.595735] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:18:27.092 [2024-07-23 08:31:39.597329] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:18:27.092 [2024-07-23 08:31:39.597374] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:18:27.092 [2024-07-23 08:31:39.597406] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:18:27.092 [2024-07-23 08:31:39.597454] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:18:27.092 [2024-07-23 08:31:39.597497] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:18:27.092 [2024-07-23 08:31:39.597533] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:18:27.092 [2024-07-23 08:31:39.597552] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:18:27.092 
[2024-07-23 08:31:39.597565] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:27.092 [2024-07-23 08:31:39.597578] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000037880 name raid_bdev1, state configuring 00:18:27.092 request: 00:18:27.092 { 00:18:27.092 "name": "raid_bdev1", 00:18:27.092 "raid_level": "raid0", 00:18:27.092 "base_bdevs": [ 00:18:27.092 "malloc1", 00:18:27.092 "malloc2", 00:18:27.092 "malloc3", 00:18:27.092 "malloc4" 00:18:27.092 ], 00:18:27.092 "strip_size_kb": 64, 00:18:27.092 "superblock": false, 00:18:27.092 "method": "bdev_raid_create", 00:18:27.092 "req_id": 1 00:18:27.092 } 00:18:27.092 Got JSON-RPC error response 00:18:27.092 response: 00:18:27.092 { 00:18:27.092 "code": -17, 00:18:27.092 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:18:27.092 } 00:18:27.350 08:31:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:18:27.350 08:31:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:18:27.350 08:31:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:18:27.350 08:31:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:18:27.350 08:31:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:27.350 08:31:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:18:27.350 08:31:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:18:27.350 08:31:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:18:27.350 08:31:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 
00:18:27.609 [2024-07-23 08:31:39.924498] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:27.609 [2024-07-23 08:31:39.924571] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:27.609 [2024-07-23 08:31:39.924588] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000037e80 00:18:27.609 [2024-07-23 08:31:39.924599] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:27.609 [2024-07-23 08:31:39.926509] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:27.609 [2024-07-23 08:31:39.926536] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:27.609 [2024-07-23 08:31:39.926635] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:18:27.609 [2024-07-23 08:31:39.926686] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:27.609 pt1 00:18:27.609 08:31:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:18:27.609 08:31:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:27.609 08:31:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:27.609 08:31:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:27.609 08:31:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:27.609 08:31:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:27.609 08:31:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:27.609 08:31:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:27.609 08:31:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:18:27.609 08:31:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:27.609 08:31:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:27.609 08:31:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:27.609 08:31:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:27.609 "name": "raid_bdev1", 00:18:27.609 "uuid": "566e07bf-6339-497e-bf2a-b8893f81d29b", 00:18:27.609 "strip_size_kb": 64, 00:18:27.609 "state": "configuring", 00:18:27.609 "raid_level": "raid0", 00:18:27.609 "superblock": true, 00:18:27.609 "num_base_bdevs": 4, 00:18:27.609 "num_base_bdevs_discovered": 1, 00:18:27.609 "num_base_bdevs_operational": 4, 00:18:27.609 "base_bdevs_list": [ 00:18:27.609 { 00:18:27.609 "name": "pt1", 00:18:27.609 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:27.609 "is_configured": true, 00:18:27.609 "data_offset": 2048, 00:18:27.609 "data_size": 63488 00:18:27.609 }, 00:18:27.609 { 00:18:27.609 "name": null, 00:18:27.609 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:27.609 "is_configured": false, 00:18:27.609 "data_offset": 2048, 00:18:27.609 "data_size": 63488 00:18:27.609 }, 00:18:27.609 { 00:18:27.609 "name": null, 00:18:27.609 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:27.609 "is_configured": false, 00:18:27.609 "data_offset": 2048, 00:18:27.609 "data_size": 63488 00:18:27.609 }, 00:18:27.609 { 00:18:27.609 "name": null, 00:18:27.609 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:27.609 "is_configured": false, 00:18:27.609 "data_offset": 2048, 00:18:27.609 "data_size": 63488 00:18:27.609 } 00:18:27.609 ] 00:18:27.609 }' 00:18:27.609 08:31:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:27.609 08:31:40 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:28.176 08:31:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:18:28.176 08:31:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:28.434 [2024-07-23 08:31:40.762767] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:28.434 [2024-07-23 08:31:40.762825] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:28.434 [2024-07-23 08:31:40.762844] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000038780 00:18:28.434 [2024-07-23 08:31:40.762854] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:28.434 [2024-07-23 08:31:40.763286] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:28.434 [2024-07-23 08:31:40.763304] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:28.434 [2024-07-23 08:31:40.763379] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:28.434 [2024-07-23 08:31:40.763405] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:28.434 pt2 00:18:28.434 08:31:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:28.434 [2024-07-23 08:31:40.923204] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:18:28.434 08:31:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:18:28.434 08:31:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:28.434 08:31:40 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:28.434 08:31:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:28.434 08:31:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:28.434 08:31:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:28.434 08:31:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:28.434 08:31:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:28.434 08:31:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:28.434 08:31:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:28.434 08:31:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:28.434 08:31:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:28.692 08:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:28.692 "name": "raid_bdev1", 00:18:28.692 "uuid": "566e07bf-6339-497e-bf2a-b8893f81d29b", 00:18:28.692 "strip_size_kb": 64, 00:18:28.692 "state": "configuring", 00:18:28.692 "raid_level": "raid0", 00:18:28.692 "superblock": true, 00:18:28.692 "num_base_bdevs": 4, 00:18:28.692 "num_base_bdevs_discovered": 1, 00:18:28.692 "num_base_bdevs_operational": 4, 00:18:28.692 "base_bdevs_list": [ 00:18:28.692 { 00:18:28.692 "name": "pt1", 00:18:28.692 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:28.692 "is_configured": true, 00:18:28.692 "data_offset": 2048, 00:18:28.692 "data_size": 63488 00:18:28.692 }, 00:18:28.692 { 00:18:28.692 "name": null, 00:18:28.692 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:28.692 
"is_configured": false, 00:18:28.692 "data_offset": 2048, 00:18:28.692 "data_size": 63488 00:18:28.692 }, 00:18:28.692 { 00:18:28.692 "name": null, 00:18:28.692 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:28.692 "is_configured": false, 00:18:28.692 "data_offset": 2048, 00:18:28.692 "data_size": 63488 00:18:28.692 }, 00:18:28.692 { 00:18:28.692 "name": null, 00:18:28.692 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:28.692 "is_configured": false, 00:18:28.692 "data_offset": 2048, 00:18:28.692 "data_size": 63488 00:18:28.692 } 00:18:28.692 ] 00:18:28.692 }' 00:18:28.692 08:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:28.692 08:31:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:29.295 08:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:18:29.295 08:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:29.295 08:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:29.295 [2024-07-23 08:31:41.729252] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:29.295 [2024-07-23 08:31:41.729306] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:29.295 [2024-07-23 08:31:41.729339] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000038a80 00:18:29.295 [2024-07-23 08:31:41.729348] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:29.295 [2024-07-23 08:31:41.729793] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:29.295 [2024-07-23 08:31:41.729812] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:29.295 [2024-07-23 08:31:41.729888] 
bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:29.295 [2024-07-23 08:31:41.729911] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:29.295 pt2 00:18:29.295 08:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:29.295 08:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:29.295 08:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:29.556 [2024-07-23 08:31:41.897727] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:29.556 [2024-07-23 08:31:41.897776] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:29.556 [2024-07-23 08:31:41.897798] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000038d80 00:18:29.556 [2024-07-23 08:31:41.897808] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:29.556 [2024-07-23 08:31:41.898245] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:29.556 [2024-07-23 08:31:41.898262] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:29.556 [2024-07-23 08:31:41.898334] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:18:29.556 [2024-07-23 08:31:41.898358] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:29.556 pt3 00:18:29.556 08:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:29.556 08:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:29.556 08:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:18:29.556 [2024-07-23 08:31:42.066183] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:18:29.556 [2024-07-23 08:31:42.066236] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:29.556 [2024-07-23 08:31:42.066255] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000039080 00:18:29.556 [2024-07-23 08:31:42.066265] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:29.556 [2024-07-23 08:31:42.066708] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:29.556 [2024-07-23 08:31:42.066729] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:18:29.557 [2024-07-23 08:31:42.066803] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:18:29.557 [2024-07-23 08:31:42.066834] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:18:29.557 [2024-07-23 08:31:42.066992] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000038480 00:18:29.557 [2024-07-23 08:31:42.067001] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:29.557 [2024-07-23 08:31:42.067269] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c060 00:18:29.557 [2024-07-23 08:31:42.067453] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000038480 00:18:29.557 [2024-07-23 08:31:42.067465] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000038480 00:18:29.557 [2024-07-23 08:31:42.067620] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:29.557 pt4 00:18:29.815 08:31:42 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:29.816 08:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:29.816 08:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:18:29.816 08:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:29.816 08:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:29.816 08:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:29.816 08:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:29.816 08:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:29.816 08:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:29.816 08:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:29.816 08:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:29.816 08:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:29.816 08:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:29.816 08:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:29.816 08:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:29.816 "name": "raid_bdev1", 00:18:29.816 "uuid": "566e07bf-6339-497e-bf2a-b8893f81d29b", 00:18:29.816 "strip_size_kb": 64, 00:18:29.816 "state": "online", 00:18:29.816 "raid_level": "raid0", 00:18:29.816 "superblock": true, 00:18:29.816 "num_base_bdevs": 4, 00:18:29.816 "num_base_bdevs_discovered": 4, 00:18:29.816 
"num_base_bdevs_operational": 4, 00:18:29.816 "base_bdevs_list": [ 00:18:29.816 { 00:18:29.816 "name": "pt1", 00:18:29.816 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:29.816 "is_configured": true, 00:18:29.816 "data_offset": 2048, 00:18:29.816 "data_size": 63488 00:18:29.816 }, 00:18:29.816 { 00:18:29.816 "name": "pt2", 00:18:29.816 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:29.816 "is_configured": true, 00:18:29.816 "data_offset": 2048, 00:18:29.816 "data_size": 63488 00:18:29.816 }, 00:18:29.816 { 00:18:29.816 "name": "pt3", 00:18:29.816 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:29.816 "is_configured": true, 00:18:29.816 "data_offset": 2048, 00:18:29.816 "data_size": 63488 00:18:29.816 }, 00:18:29.816 { 00:18:29.816 "name": "pt4", 00:18:29.816 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:29.816 "is_configured": true, 00:18:29.816 "data_offset": 2048, 00:18:29.816 "data_size": 63488 00:18:29.816 } 00:18:29.816 ] 00:18:29.816 }' 00:18:29.816 08:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:29.816 08:31:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:30.384 08:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:18:30.384 08:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:18:30.384 08:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:30.384 08:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:30.384 08:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:30.384 08:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:30.384 08:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:30.384 08:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:30.384 [2024-07-23 08:31:42.860557] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:30.384 08:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:30.384 "name": "raid_bdev1", 00:18:30.384 "aliases": [ 00:18:30.384 "566e07bf-6339-497e-bf2a-b8893f81d29b" 00:18:30.384 ], 00:18:30.384 "product_name": "Raid Volume", 00:18:30.384 "block_size": 512, 00:18:30.384 "num_blocks": 253952, 00:18:30.384 "uuid": "566e07bf-6339-497e-bf2a-b8893f81d29b", 00:18:30.384 "assigned_rate_limits": { 00:18:30.384 "rw_ios_per_sec": 0, 00:18:30.384 "rw_mbytes_per_sec": 0, 00:18:30.384 "r_mbytes_per_sec": 0, 00:18:30.384 "w_mbytes_per_sec": 0 00:18:30.384 }, 00:18:30.384 "claimed": false, 00:18:30.384 "zoned": false, 00:18:30.384 "supported_io_types": { 00:18:30.384 "read": true, 00:18:30.384 "write": true, 00:18:30.384 "unmap": true, 00:18:30.384 "flush": true, 00:18:30.384 "reset": true, 00:18:30.384 "nvme_admin": false, 00:18:30.384 "nvme_io": false, 00:18:30.384 "nvme_io_md": false, 00:18:30.384 "write_zeroes": true, 00:18:30.384 "zcopy": false, 00:18:30.384 "get_zone_info": false, 00:18:30.384 "zone_management": false, 00:18:30.384 "zone_append": false, 00:18:30.384 "compare": false, 00:18:30.384 "compare_and_write": false, 00:18:30.384 "abort": false, 00:18:30.384 "seek_hole": false, 00:18:30.384 "seek_data": false, 00:18:30.384 "copy": false, 00:18:30.384 "nvme_iov_md": false 00:18:30.384 }, 00:18:30.384 "memory_domains": [ 00:18:30.384 { 00:18:30.384 "dma_device_id": "system", 00:18:30.384 "dma_device_type": 1 00:18:30.384 }, 00:18:30.384 { 00:18:30.384 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:30.384 "dma_device_type": 2 00:18:30.384 }, 00:18:30.384 { 00:18:30.384 "dma_device_id": "system", 00:18:30.384 "dma_device_type": 1 00:18:30.384 }, 00:18:30.384 { 00:18:30.384 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:30.384 "dma_device_type": 2 00:18:30.384 }, 00:18:30.384 { 00:18:30.384 "dma_device_id": "system", 00:18:30.384 "dma_device_type": 1 00:18:30.384 }, 00:18:30.384 { 00:18:30.384 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:30.384 "dma_device_type": 2 00:18:30.384 }, 00:18:30.384 { 00:18:30.384 "dma_device_id": "system", 00:18:30.384 "dma_device_type": 1 00:18:30.384 }, 00:18:30.384 { 00:18:30.384 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:30.384 "dma_device_type": 2 00:18:30.384 } 00:18:30.384 ], 00:18:30.384 "driver_specific": { 00:18:30.384 "raid": { 00:18:30.384 "uuid": "566e07bf-6339-497e-bf2a-b8893f81d29b", 00:18:30.384 "strip_size_kb": 64, 00:18:30.384 "state": "online", 00:18:30.384 "raid_level": "raid0", 00:18:30.384 "superblock": true, 00:18:30.384 "num_base_bdevs": 4, 00:18:30.384 "num_base_bdevs_discovered": 4, 00:18:30.384 "num_base_bdevs_operational": 4, 00:18:30.384 "base_bdevs_list": [ 00:18:30.384 { 00:18:30.384 "name": "pt1", 00:18:30.384 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:30.384 "is_configured": true, 00:18:30.384 "data_offset": 2048, 00:18:30.384 "data_size": 63488 00:18:30.384 }, 00:18:30.384 { 00:18:30.384 "name": "pt2", 00:18:30.384 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:30.384 "is_configured": true, 00:18:30.384 "data_offset": 2048, 00:18:30.384 "data_size": 63488 00:18:30.384 }, 00:18:30.384 { 00:18:30.384 "name": "pt3", 00:18:30.384 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:30.384 "is_configured": true, 00:18:30.384 "data_offset": 2048, 00:18:30.384 "data_size": 63488 00:18:30.384 }, 00:18:30.384 { 00:18:30.384 "name": "pt4", 00:18:30.384 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:30.384 "is_configured": true, 00:18:30.384 "data_offset": 2048, 00:18:30.384 "data_size": 63488 00:18:30.384 } 00:18:30.384 ] 00:18:30.384 } 00:18:30.384 } 00:18:30.384 }' 00:18:30.384 08:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # 
jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:30.644 08:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:18:30.644 pt2 00:18:30.644 pt3 00:18:30.644 pt4' 00:18:30.644 08:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:30.644 08:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:18:30.644 08:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:30.644 08:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:30.644 "name": "pt1", 00:18:30.644 "aliases": [ 00:18:30.644 "00000000-0000-0000-0000-000000000001" 00:18:30.644 ], 00:18:30.644 "product_name": "passthru", 00:18:30.644 "block_size": 512, 00:18:30.644 "num_blocks": 65536, 00:18:30.644 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:30.644 "assigned_rate_limits": { 00:18:30.644 "rw_ios_per_sec": 0, 00:18:30.644 "rw_mbytes_per_sec": 0, 00:18:30.644 "r_mbytes_per_sec": 0, 00:18:30.644 "w_mbytes_per_sec": 0 00:18:30.644 }, 00:18:30.644 "claimed": true, 00:18:30.644 "claim_type": "exclusive_write", 00:18:30.644 "zoned": false, 00:18:30.644 "supported_io_types": { 00:18:30.644 "read": true, 00:18:30.644 "write": true, 00:18:30.644 "unmap": true, 00:18:30.644 "flush": true, 00:18:30.644 "reset": true, 00:18:30.644 "nvme_admin": false, 00:18:30.644 "nvme_io": false, 00:18:30.644 "nvme_io_md": false, 00:18:30.644 "write_zeroes": true, 00:18:30.644 "zcopy": true, 00:18:30.644 "get_zone_info": false, 00:18:30.644 "zone_management": false, 00:18:30.644 "zone_append": false, 00:18:30.644 "compare": false, 00:18:30.644 "compare_and_write": false, 00:18:30.644 "abort": true, 00:18:30.644 "seek_hole": false, 00:18:30.644 "seek_data": false, 00:18:30.644 "copy": true, 00:18:30.644 
"nvme_iov_md": false 00:18:30.644 }, 00:18:30.644 "memory_domains": [ 00:18:30.644 { 00:18:30.644 "dma_device_id": "system", 00:18:30.644 "dma_device_type": 1 00:18:30.644 }, 00:18:30.644 { 00:18:30.644 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:30.644 "dma_device_type": 2 00:18:30.644 } 00:18:30.644 ], 00:18:30.644 "driver_specific": { 00:18:30.644 "passthru": { 00:18:30.644 "name": "pt1", 00:18:30.644 "base_bdev_name": "malloc1" 00:18:30.644 } 00:18:30.644 } 00:18:30.644 }' 00:18:30.644 08:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:30.644 08:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:30.903 08:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:30.903 08:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:30.903 08:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:30.903 08:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:30.903 08:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:30.903 08:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:30.903 08:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:30.903 08:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:30.903 08:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:30.903 08:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:30.903 08:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:30.903 08:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:18:30.903 
08:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:31.161 08:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:31.161 "name": "pt2", 00:18:31.161 "aliases": [ 00:18:31.161 "00000000-0000-0000-0000-000000000002" 00:18:31.161 ], 00:18:31.161 "product_name": "passthru", 00:18:31.161 "block_size": 512, 00:18:31.161 "num_blocks": 65536, 00:18:31.161 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:31.161 "assigned_rate_limits": { 00:18:31.161 "rw_ios_per_sec": 0, 00:18:31.161 "rw_mbytes_per_sec": 0, 00:18:31.161 "r_mbytes_per_sec": 0, 00:18:31.161 "w_mbytes_per_sec": 0 00:18:31.161 }, 00:18:31.161 "claimed": true, 00:18:31.161 "claim_type": "exclusive_write", 00:18:31.161 "zoned": false, 00:18:31.161 "supported_io_types": { 00:18:31.161 "read": true, 00:18:31.161 "write": true, 00:18:31.161 "unmap": true, 00:18:31.161 "flush": true, 00:18:31.161 "reset": true, 00:18:31.161 "nvme_admin": false, 00:18:31.161 "nvme_io": false, 00:18:31.161 "nvme_io_md": false, 00:18:31.161 "write_zeroes": true, 00:18:31.161 "zcopy": true, 00:18:31.161 "get_zone_info": false, 00:18:31.161 "zone_management": false, 00:18:31.161 "zone_append": false, 00:18:31.161 "compare": false, 00:18:31.161 "compare_and_write": false, 00:18:31.161 "abort": true, 00:18:31.161 "seek_hole": false, 00:18:31.161 "seek_data": false, 00:18:31.161 "copy": true, 00:18:31.161 "nvme_iov_md": false 00:18:31.161 }, 00:18:31.161 "memory_domains": [ 00:18:31.161 { 00:18:31.161 "dma_device_id": "system", 00:18:31.161 "dma_device_type": 1 00:18:31.161 }, 00:18:31.161 { 00:18:31.161 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:31.161 "dma_device_type": 2 00:18:31.161 } 00:18:31.161 ], 00:18:31.161 "driver_specific": { 00:18:31.161 "passthru": { 00:18:31.161 "name": "pt2", 00:18:31.161 "base_bdev_name": "malloc2" 00:18:31.161 } 00:18:31.161 } 00:18:31.161 }' 00:18:31.162 08:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:18:31.162 08:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:31.162 08:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:31.162 08:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:31.420 08:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:31.420 08:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:31.420 08:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:31.420 08:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:31.420 08:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:31.420 08:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:31.420 08:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:31.420 08:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:31.420 08:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:31.420 08:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:31.420 08:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:18:31.679 08:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:31.679 "name": "pt3", 00:18:31.679 "aliases": [ 00:18:31.679 "00000000-0000-0000-0000-000000000003" 00:18:31.680 ], 00:18:31.680 "product_name": "passthru", 00:18:31.680 "block_size": 512, 00:18:31.680 "num_blocks": 65536, 00:18:31.680 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:31.680 "assigned_rate_limits": { 00:18:31.680 "rw_ios_per_sec": 0, 00:18:31.680 "rw_mbytes_per_sec": 0, 
00:18:31.680 "r_mbytes_per_sec": 0, 00:18:31.680 "w_mbytes_per_sec": 0 00:18:31.680 }, 00:18:31.680 "claimed": true, 00:18:31.680 "claim_type": "exclusive_write", 00:18:31.680 "zoned": false, 00:18:31.680 "supported_io_types": { 00:18:31.680 "read": true, 00:18:31.680 "write": true, 00:18:31.680 "unmap": true, 00:18:31.680 "flush": true, 00:18:31.680 "reset": true, 00:18:31.680 "nvme_admin": false, 00:18:31.680 "nvme_io": false, 00:18:31.680 "nvme_io_md": false, 00:18:31.680 "write_zeroes": true, 00:18:31.680 "zcopy": true, 00:18:31.680 "get_zone_info": false, 00:18:31.680 "zone_management": false, 00:18:31.680 "zone_append": false, 00:18:31.680 "compare": false, 00:18:31.680 "compare_and_write": false, 00:18:31.680 "abort": true, 00:18:31.680 "seek_hole": false, 00:18:31.680 "seek_data": false, 00:18:31.680 "copy": true, 00:18:31.680 "nvme_iov_md": false 00:18:31.680 }, 00:18:31.680 "memory_domains": [ 00:18:31.680 { 00:18:31.680 "dma_device_id": "system", 00:18:31.680 "dma_device_type": 1 00:18:31.680 }, 00:18:31.680 { 00:18:31.680 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:31.680 "dma_device_type": 2 00:18:31.680 } 00:18:31.680 ], 00:18:31.680 "driver_specific": { 00:18:31.680 "passthru": { 00:18:31.680 "name": "pt3", 00:18:31.680 "base_bdev_name": "malloc3" 00:18:31.680 } 00:18:31.680 } 00:18:31.680 }' 00:18:31.680 08:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:31.680 08:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:31.680 08:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:31.680 08:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:31.680 08:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:31.938 08:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:31.938 08:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 
-- # jq .md_interleave 00:18:31.938 08:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:31.938 08:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:31.938 08:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:31.938 08:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:31.938 08:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:31.938 08:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:31.938 08:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:31.938 08:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:18:32.197 08:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:32.197 "name": "pt4", 00:18:32.197 "aliases": [ 00:18:32.197 "00000000-0000-0000-0000-000000000004" 00:18:32.197 ], 00:18:32.197 "product_name": "passthru", 00:18:32.197 "block_size": 512, 00:18:32.197 "num_blocks": 65536, 00:18:32.197 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:32.197 "assigned_rate_limits": { 00:18:32.197 "rw_ios_per_sec": 0, 00:18:32.197 "rw_mbytes_per_sec": 0, 00:18:32.197 "r_mbytes_per_sec": 0, 00:18:32.197 "w_mbytes_per_sec": 0 00:18:32.197 }, 00:18:32.197 "claimed": true, 00:18:32.197 "claim_type": "exclusive_write", 00:18:32.197 "zoned": false, 00:18:32.197 "supported_io_types": { 00:18:32.197 "read": true, 00:18:32.197 "write": true, 00:18:32.197 "unmap": true, 00:18:32.197 "flush": true, 00:18:32.197 "reset": true, 00:18:32.197 "nvme_admin": false, 00:18:32.197 "nvme_io": false, 00:18:32.197 "nvme_io_md": false, 00:18:32.197 "write_zeroes": true, 00:18:32.197 "zcopy": true, 00:18:32.197 "get_zone_info": false, 00:18:32.197 
"zone_management": false, 00:18:32.197 "zone_append": false, 00:18:32.197 "compare": false, 00:18:32.197 "compare_and_write": false, 00:18:32.197 "abort": true, 00:18:32.197 "seek_hole": false, 00:18:32.197 "seek_data": false, 00:18:32.197 "copy": true, 00:18:32.197 "nvme_iov_md": false 00:18:32.197 }, 00:18:32.197 "memory_domains": [ 00:18:32.197 { 00:18:32.197 "dma_device_id": "system", 00:18:32.197 "dma_device_type": 1 00:18:32.197 }, 00:18:32.197 { 00:18:32.197 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:32.197 "dma_device_type": 2 00:18:32.197 } 00:18:32.197 ], 00:18:32.197 "driver_specific": { 00:18:32.197 "passthru": { 00:18:32.197 "name": "pt4", 00:18:32.197 "base_bdev_name": "malloc4" 00:18:32.197 } 00:18:32.197 } 00:18:32.197 }' 00:18:32.197 08:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:32.197 08:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:32.197 08:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:32.197 08:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:32.197 08:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:32.197 08:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:32.197 08:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:32.455 08:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:32.455 08:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:32.455 08:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:32.455 08:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:32.455 08:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:32.455 08:31:44 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:18:32.455 08:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:32.714 [2024-07-23 08:31:44.994204] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:32.714 08:31:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 566e07bf-6339-497e-bf2a-b8893f81d29b '!=' 566e07bf-6339-497e-bf2a-b8893f81d29b ']' 00:18:32.714 08:31:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:18:32.714 08:31:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:32.714 08:31:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:18:32.714 08:31:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1481726 00:18:32.714 08:31:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1481726 ']' 00:18:32.714 08:31:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1481726 00:18:32.714 08:31:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:18:32.714 08:31:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:32.714 08:31:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1481726 00:18:32.714 08:31:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:32.714 08:31:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:32.714 08:31:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1481726' 00:18:32.714 killing process with pid 1481726 00:18:32.714 08:31:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1481726 
00:18:32.714 [2024-07-23 08:31:45.046947] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:32.714 08:31:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1481726 00:18:32.714 [2024-07-23 08:31:45.047027] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:32.714 [2024-07-23 08:31:45.047097] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:32.714 [2024-07-23 08:31:45.047107] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000038480 name raid_bdev1, state offline 00:18:32.973 [2024-07-23 08:31:45.370969] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:34.351 08:31:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:18:34.351 00:18:34.351 real 0m13.883s 00:18:34.351 user 0m23.878s 00:18:34.351 sys 0m2.028s 00:18:34.351 08:31:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:34.351 08:31:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:34.351 ************************************ 00:18:34.351 END TEST raid_superblock_test 00:18:34.351 ************************************ 00:18:34.351 08:31:46 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:34.351 08:31:46 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 4 read 00:18:34.351 08:31:46 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:34.351 08:31:46 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:34.351 08:31:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:34.351 ************************************ 00:18:34.351 START TEST raid_read_error_test 00:18:34.351 ************************************ 00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 4 read 
00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:18:34.351 08:31:46 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.ewt9UlUAzI 00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1484584 00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1484584 /var/tmp/spdk-raid.sock 00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1484584 ']' 00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:34.351 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:34.351 08:31:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:34.351 [2024-07-23 08:31:46.792494] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:18:34.351 [2024-07-23 08:31:46.792590] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1484584 ] 00:18:34.610 [2024-07-23 08:31:46.917183] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:34.869 [2024-07-23 08:31:47.137329] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:35.127 [2024-07-23 08:31:47.405424] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:35.127 [2024-07-23 08:31:47.405455] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:35.127 08:31:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:35.127 08:31:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:18:35.127 08:31:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:35.127 08:31:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:35.386 BaseBdev1_malloc 00:18:35.386 08:31:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:18:35.645 true 00:18:35.645 08:31:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:18:35.645 [2024-07-23 08:31:48.083575] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:18:35.645 [2024-07-23 08:31:48.083635] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:35.645 [2024-07-23 08:31:48.083654] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034b80 00:18:35.645 [2024-07-23 08:31:48.083665] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:35.645 [2024-07-23 08:31:48.085597] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:35.645 [2024-07-23 08:31:48.085634] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:35.645 BaseBdev1 00:18:35.645 08:31:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:35.645 08:31:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:35.904 BaseBdev2_malloc 00:18:35.904 08:31:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:18:36.163 true 00:18:36.163 08:31:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:18:36.163 [2024-07-23 08:31:48.615852] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:18:36.163 [2024-07-23 08:31:48.615903] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:36.163 [2024-07-23 08:31:48.615938] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035a80 00:18:36.163 [2024-07-23 08:31:48.615951] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:36.163 [2024-07-23 08:31:48.617974] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:36.163 [2024-07-23 08:31:48.618004] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:36.163 BaseBdev2 00:18:36.163 08:31:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:36.163 08:31:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:18:36.421 BaseBdev3_malloc 00:18:36.421 08:31:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:18:36.680 true 00:18:36.680 08:31:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:18:36.680 [2024-07-23 08:31:49.169957] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:18:36.680 [2024-07-23 08:31:49.170008] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:36.680 [2024-07-23 08:31:49.170029] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036980 00:18:36.680 [2024-07-23 08:31:49.170040] vbdev_passthru.c: 
696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:36.680 [2024-07-23 08:31:49.172030] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:36.680 [2024-07-23 08:31:49.172058] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:18:36.680 BaseBdev3 00:18:36.680 08:31:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:36.680 08:31:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:18:36.938 BaseBdev4_malloc 00:18:36.938 08:31:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:18:37.197 true 00:18:37.197 08:31:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:18:37.197 [2024-07-23 08:31:49.702536] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:18:37.197 [2024-07-23 08:31:49.702588] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:37.197 [2024-07-23 08:31:49.702607] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000037880 00:18:37.197 [2024-07-23 08:31:49.702626] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:37.197 [2024-07-23 08:31:49.704550] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:37.197 [2024-07-23 08:31:49.704588] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:18:37.197 BaseBdev4 00:18:37.456 08:31:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:18:37.456 [2024-07-23 08:31:49.871007] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:37.456 [2024-07-23 08:31:49.872515] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:37.456 [2024-07-23 08:31:49.872582] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:37.456 [2024-07-23 08:31:49.872664] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:37.456 [2024-07-23 08:31:49.872890] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000037e80 00:18:37.456 [2024-07-23 08:31:49.872904] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:37.456 [2024-07-23 08:31:49.873141] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bf90 00:18:37.456 [2024-07-23 08:31:49.873327] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000037e80 00:18:37.456 [2024-07-23 08:31:49.873336] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000037e80 00:18:37.456 [2024-07-23 08:31:49.873493] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:37.456 08:31:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:18:37.457 08:31:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:37.457 08:31:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:37.457 08:31:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:37.457 08:31:49 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:37.457 08:31:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:37.457 08:31:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:37.457 08:31:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:37.457 08:31:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:37.457 08:31:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:37.457 08:31:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:37.457 08:31:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:37.715 08:31:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:37.715 "name": "raid_bdev1", 00:18:37.715 "uuid": "348350ea-b722-46e9-adf8-da794deddb99", 00:18:37.715 "strip_size_kb": 64, 00:18:37.715 "state": "online", 00:18:37.715 "raid_level": "raid0", 00:18:37.715 "superblock": true, 00:18:37.715 "num_base_bdevs": 4, 00:18:37.715 "num_base_bdevs_discovered": 4, 00:18:37.715 "num_base_bdevs_operational": 4, 00:18:37.715 "base_bdevs_list": [ 00:18:37.715 { 00:18:37.715 "name": "BaseBdev1", 00:18:37.715 "uuid": "3cf41654-cc5d-5633-9d4b-c8042a07c62a", 00:18:37.715 "is_configured": true, 00:18:37.715 "data_offset": 2048, 00:18:37.715 "data_size": 63488 00:18:37.715 }, 00:18:37.715 { 00:18:37.715 "name": "BaseBdev2", 00:18:37.715 "uuid": "6e419e0b-7c0c-5dec-aca1-7cbbc3963c29", 00:18:37.715 "is_configured": true, 00:18:37.715 "data_offset": 2048, 00:18:37.715 "data_size": 63488 00:18:37.715 }, 00:18:37.715 { 00:18:37.715 "name": "BaseBdev3", 00:18:37.715 "uuid": "544b71b9-7b58-5ce5-b9fb-6976d52e2e3b", 00:18:37.715 "is_configured": true, 
00:18:37.715 "data_offset": 2048, 00:18:37.715 "data_size": 63488 00:18:37.715 }, 00:18:37.715 { 00:18:37.715 "name": "BaseBdev4", 00:18:37.715 "uuid": "ad0148a7-70a7-5ea9-9ed8-54be2439a846", 00:18:37.715 "is_configured": true, 00:18:37.715 "data_offset": 2048, 00:18:37.715 "data_size": 63488 00:18:37.715 } 00:18:37.715 ] 00:18:37.715 }' 00:18:37.715 08:31:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:37.715 08:31:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:38.281 08:31:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:18:38.281 08:31:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:18:38.281 [2024-07-23 08:31:50.646546] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c130 00:18:39.217 08:31:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:18:39.475 08:31:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:18:39.475 08:31:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:18:39.475 08:31:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:18:39.475 08:31:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:18:39.475 08:31:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:39.475 08:31:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:39.475 08:31:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:39.475 
08:31:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:39.475 08:31:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:39.475 08:31:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:39.475 08:31:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:39.475 08:31:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:39.475 08:31:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:39.475 08:31:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:39.475 08:31:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:39.475 08:31:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:39.475 "name": "raid_bdev1", 00:18:39.475 "uuid": "348350ea-b722-46e9-adf8-da794deddb99", 00:18:39.475 "strip_size_kb": 64, 00:18:39.475 "state": "online", 00:18:39.475 "raid_level": "raid0", 00:18:39.475 "superblock": true, 00:18:39.475 "num_base_bdevs": 4, 00:18:39.475 "num_base_bdevs_discovered": 4, 00:18:39.475 "num_base_bdevs_operational": 4, 00:18:39.475 "base_bdevs_list": [ 00:18:39.475 { 00:18:39.475 "name": "BaseBdev1", 00:18:39.475 "uuid": "3cf41654-cc5d-5633-9d4b-c8042a07c62a", 00:18:39.475 "is_configured": true, 00:18:39.475 "data_offset": 2048, 00:18:39.475 "data_size": 63488 00:18:39.475 }, 00:18:39.475 { 00:18:39.475 "name": "BaseBdev2", 00:18:39.475 "uuid": "6e419e0b-7c0c-5dec-aca1-7cbbc3963c29", 00:18:39.475 "is_configured": true, 00:18:39.475 "data_offset": 2048, 00:18:39.475 "data_size": 63488 00:18:39.475 }, 00:18:39.475 { 00:18:39.475 "name": "BaseBdev3", 00:18:39.475 "uuid": 
"544b71b9-7b58-5ce5-b9fb-6976d52e2e3b", 00:18:39.475 "is_configured": true, 00:18:39.475 "data_offset": 2048, 00:18:39.475 "data_size": 63488 00:18:39.475 }, 00:18:39.475 { 00:18:39.475 "name": "BaseBdev4", 00:18:39.475 "uuid": "ad0148a7-70a7-5ea9-9ed8-54be2439a846", 00:18:39.475 "is_configured": true, 00:18:39.475 "data_offset": 2048, 00:18:39.475 "data_size": 63488 00:18:39.475 } 00:18:39.475 ] 00:18:39.475 }' 00:18:39.475 08:31:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:39.475 08:31:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:40.043 08:31:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:40.302 [2024-07-23 08:31:52.568296] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:40.302 [2024-07-23 08:31:52.568334] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:40.302 [2024-07-23 08:31:52.570675] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:40.302 [2024-07-23 08:31:52.570718] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:40.302 [2024-07-23 08:31:52.570757] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:40.302 [2024-07-23 08:31:52.570771] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000037e80 name raid_bdev1, state offline 00:18:40.302 0 00:18:40.302 08:31:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1484584 00:18:40.302 08:31:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1484584 ']' 00:18:40.302 08:31:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1484584 00:18:40.302 08:31:52 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@953 -- # uname 00:18:40.302 08:31:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:40.302 08:31:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1484584 00:18:40.302 08:31:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:40.302 08:31:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:40.302 08:31:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1484584' 00:18:40.302 killing process with pid 1484584 00:18:40.302 08:31:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1484584 00:18:40.302 [2024-07-23 08:31:52.636313] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:40.302 08:31:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1484584 00:18:40.561 [2024-07-23 08:31:52.900067] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:41.937 08:31:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.ewt9UlUAzI 00:18:41.937 08:31:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:18:41.937 08:31:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:18:41.937 08:31:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:18:41.937 08:31:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:18:41.937 08:31:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:41.937 08:31:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:18:41.937 08:31:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:18:41.937 00:18:41.937 real 0m7.564s 00:18:41.937 user 0m10.879s 00:18:41.937 
sys 0m1.014s 00:18:41.937 08:31:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:41.937 08:31:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:41.937 ************************************ 00:18:41.937 END TEST raid_read_error_test 00:18:41.937 ************************************ 00:18:41.937 08:31:54 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:41.937 08:31:54 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 4 write 00:18:41.937 08:31:54 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:41.937 08:31:54 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:41.937 08:31:54 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:41.937 ************************************ 00:18:41.937 START TEST raid_write_error_test 00:18:41.937 ************************************ 00:18:41.937 08:31:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 4 write 00:18:41.937 08:31:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:18:41.937 08:31:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:18:41.937 08:31:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:18:41.937 08:31:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:18:41.937 08:31:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:41.937 08:31:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:18:41.937 08:31:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:41.937 08:31:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:41.937 08:31:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 
-- # echo BaseBdev2 00:18:41.937 08:31:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:41.937 08:31:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:41.937 08:31:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:18:41.937 08:31:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:41.937 08:31:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:41.937 08:31:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:18:41.937 08:31:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:41.937 08:31:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:41.937 08:31:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:41.937 08:31:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:18:41.937 08:31:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:18:41.937 08:31:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:18:41.937 08:31:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:18:41.937 08:31:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:18:41.937 08:31:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:18:41.937 08:31:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:18:41.937 08:31:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:18:41.937 08:31:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:18:41.937 08:31:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # 
mktemp -p /raidtest 00:18:41.937 08:31:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.n3DYqgT1HP 00:18:41.937 08:31:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1485977 00:18:41.937 08:31:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1485977 /var/tmp/spdk-raid.sock 00:18:41.938 08:31:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:18:41.938 08:31:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1485977 ']' 00:18:41.938 08:31:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:41.938 08:31:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:41.938 08:31:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:41.938 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:41.938 08:31:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:41.938 08:31:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:41.938 [2024-07-23 08:31:54.426315] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:18:41.938 [2024-07-23 08:31:54.426410] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1485977 ] 00:18:42.197 [2024-07-23 08:31:54.552854] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:42.456 [2024-07-23 08:31:54.778058] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:42.752 [2024-07-23 08:31:55.072667] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:42.753 [2024-07-23 08:31:55.072700] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:42.753 08:31:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:42.753 08:31:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:18:42.753 08:31:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:42.753 08:31:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:43.012 BaseBdev1_malloc 00:18:43.012 08:31:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:18:43.271 true 00:18:43.271 08:31:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:18:43.271 [2024-07-23 08:31:55.733082] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:18:43.271 [2024-07-23 08:31:55.733138] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:18:43.271 [2024-07-23 08:31:55.733157] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034b80 00:18:43.271 [2024-07-23 08:31:55.733168] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:43.271 [2024-07-23 08:31:55.735224] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:43.271 [2024-07-23 08:31:55.735253] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:43.271 BaseBdev1 00:18:43.271 08:31:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:43.271 08:31:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:43.529 BaseBdev2_malloc 00:18:43.529 08:31:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:18:43.788 true 00:18:43.788 08:31:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:18:43.788 [2024-07-23 08:31:56.275186] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:18:43.788 [2024-07-23 08:31:56.275241] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:43.788 [2024-07-23 08:31:56.275260] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035a80 00:18:43.788 [2024-07-23 08:31:56.275273] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:43.788 [2024-07-23 08:31:56.277327] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:43.788 [2024-07-23 
08:31:56.277356] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:43.788 BaseBdev2 00:18:43.788 08:31:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:43.788 08:31:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:18:44.046 BaseBdev3_malloc 00:18:44.047 08:31:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:18:44.305 true 00:18:44.305 08:31:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:18:44.305 [2024-07-23 08:31:56.812468] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:18:44.305 [2024-07-23 08:31:56.812520] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:44.305 [2024-07-23 08:31:56.812541] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036980 00:18:44.306 [2024-07-23 08:31:56.812552] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:44.306 [2024-07-23 08:31:56.814511] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:44.306 [2024-07-23 08:31:56.814540] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:18:44.306 BaseBdev3 00:18:44.565 08:31:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:44.565 08:31:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:18:44.565 BaseBdev4_malloc 00:18:44.565 08:31:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:18:44.823 true 00:18:44.823 08:31:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:18:45.081 [2024-07-23 08:31:57.367781] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:18:45.081 [2024-07-23 08:31:57.367834] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:45.081 [2024-07-23 08:31:57.367872] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000037880 00:18:45.081 [2024-07-23 08:31:57.367882] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:45.081 [2024-07-23 08:31:57.369880] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:45.081 [2024-07-23 08:31:57.369909] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:18:45.081 BaseBdev4 00:18:45.081 08:31:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:18:45.081 [2024-07-23 08:31:57.524225] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:45.081 [2024-07-23 08:31:57.525857] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:45.081 [2024-07-23 08:31:57.525930] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:45.081 [2024-07-23 
08:31:57.525991] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:45.081 [2024-07-23 08:31:57.526228] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000037e80 00:18:45.081 [2024-07-23 08:31:57.526242] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:45.081 [2024-07-23 08:31:57.526484] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bf90 00:18:45.081 [2024-07-23 08:31:57.526689] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000037e80 00:18:45.081 [2024-07-23 08:31:57.526700] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000037e80 00:18:45.081 [2024-07-23 08:31:57.526861] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:45.081 08:31:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:18:45.081 08:31:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:45.081 08:31:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:45.081 08:31:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:45.081 08:31:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:45.081 08:31:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:45.081 08:31:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:45.081 08:31:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:45.081 08:31:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:45.081 08:31:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:18:45.081 08:31:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:45.081 08:31:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:45.339 08:31:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:45.339 "name": "raid_bdev1", 00:18:45.339 "uuid": "acca00c5-7166-461f-b81d-f33b4270790c", 00:18:45.339 "strip_size_kb": 64, 00:18:45.339 "state": "online", 00:18:45.339 "raid_level": "raid0", 00:18:45.339 "superblock": true, 00:18:45.339 "num_base_bdevs": 4, 00:18:45.339 "num_base_bdevs_discovered": 4, 00:18:45.339 "num_base_bdevs_operational": 4, 00:18:45.339 "base_bdevs_list": [ 00:18:45.339 { 00:18:45.339 "name": "BaseBdev1", 00:18:45.339 "uuid": "96337ef6-752b-55c7-977f-41303d75e163", 00:18:45.339 "is_configured": true, 00:18:45.339 "data_offset": 2048, 00:18:45.339 "data_size": 63488 00:18:45.339 }, 00:18:45.339 { 00:18:45.339 "name": "BaseBdev2", 00:18:45.339 "uuid": "3249292f-2121-5bcc-a761-6fb1e90c4935", 00:18:45.339 "is_configured": true, 00:18:45.339 "data_offset": 2048, 00:18:45.339 "data_size": 63488 00:18:45.339 }, 00:18:45.339 { 00:18:45.339 "name": "BaseBdev3", 00:18:45.339 "uuid": "e104e0ec-5a24-5773-8efa-45fab567d36f", 00:18:45.339 "is_configured": true, 00:18:45.339 "data_offset": 2048, 00:18:45.339 "data_size": 63488 00:18:45.339 }, 00:18:45.339 { 00:18:45.339 "name": "BaseBdev4", 00:18:45.339 "uuid": "bda5069a-81a7-5eb3-8d58-5c82a3f63fea", 00:18:45.339 "is_configured": true, 00:18:45.339 "data_offset": 2048, 00:18:45.339 "data_size": 63488 00:18:45.339 } 00:18:45.339 ] 00:18:45.339 }' 00:18:45.339 08:31:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:45.339 08:31:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:45.906 08:31:58 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:18:45.906 08:31:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:18:45.906 [2024-07-23 08:31:58.291649] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c130 00:18:46.842 08:31:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:18:47.099 08:31:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:18:47.099 08:31:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:18:47.099 08:31:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:18:47.099 08:31:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:18:47.099 08:31:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:47.099 08:31:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:47.099 08:31:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:47.099 08:31:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:47.099 08:31:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:47.099 08:31:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:47.099 08:31:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:47.100 08:31:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:18:47.100 08:31:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:47.100 08:31:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:47.100 08:31:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:47.100 08:31:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:47.100 "name": "raid_bdev1", 00:18:47.100 "uuid": "acca00c5-7166-461f-b81d-f33b4270790c", 00:18:47.100 "strip_size_kb": 64, 00:18:47.100 "state": "online", 00:18:47.100 "raid_level": "raid0", 00:18:47.100 "superblock": true, 00:18:47.100 "num_base_bdevs": 4, 00:18:47.100 "num_base_bdevs_discovered": 4, 00:18:47.100 "num_base_bdevs_operational": 4, 00:18:47.100 "base_bdevs_list": [ 00:18:47.100 { 00:18:47.100 "name": "BaseBdev1", 00:18:47.100 "uuid": "96337ef6-752b-55c7-977f-41303d75e163", 00:18:47.100 "is_configured": true, 00:18:47.100 "data_offset": 2048, 00:18:47.100 "data_size": 63488 00:18:47.100 }, 00:18:47.100 { 00:18:47.100 "name": "BaseBdev2", 00:18:47.100 "uuid": "3249292f-2121-5bcc-a761-6fb1e90c4935", 00:18:47.100 "is_configured": true, 00:18:47.100 "data_offset": 2048, 00:18:47.100 "data_size": 63488 00:18:47.100 }, 00:18:47.100 { 00:18:47.100 "name": "BaseBdev3", 00:18:47.100 "uuid": "e104e0ec-5a24-5773-8efa-45fab567d36f", 00:18:47.100 "is_configured": true, 00:18:47.100 "data_offset": 2048, 00:18:47.100 "data_size": 63488 00:18:47.100 }, 00:18:47.100 { 00:18:47.100 "name": "BaseBdev4", 00:18:47.100 "uuid": "bda5069a-81a7-5eb3-8d58-5c82a3f63fea", 00:18:47.100 "is_configured": true, 00:18:47.100 "data_offset": 2048, 00:18:47.100 "data_size": 63488 00:18:47.100 } 00:18:47.100 ] 00:18:47.100 }' 00:18:47.100 08:31:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:47.100 08:31:59 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:47.665 08:32:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:47.923 [2024-07-23 08:32:00.200752] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:47.923 [2024-07-23 08:32:00.200790] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:47.923 [2024-07-23 08:32:00.203328] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:47.923 [2024-07-23 08:32:00.203376] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:47.923 [2024-07-23 08:32:00.203417] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:47.923 [2024-07-23 08:32:00.203431] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000037e80 name raid_bdev1, state offline 00:18:47.923 0 00:18:47.923 08:32:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1485977 00:18:47.923 08:32:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1485977 ']' 00:18:47.923 08:32:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1485977 00:18:47.923 08:32:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:18:47.923 08:32:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:47.923 08:32:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1485977 00:18:47.923 08:32:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:47.923 08:32:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:47.923 08:32:00 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1485977' 00:18:47.923 killing process with pid 1485977 00:18:47.923 08:32:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1485977 00:18:47.923 [2024-07-23 08:32:00.263959] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:47.923 08:32:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1485977 00:18:48.181 [2024-07-23 08:32:00.541618] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:49.552 08:32:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.n3DYqgT1HP 00:18:49.552 08:32:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:18:49.552 08:32:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:18:49.552 08:32:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.53 00:18:49.552 08:32:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:18:49.552 08:32:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:49.552 08:32:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:18:49.552 08:32:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.53 != \0\.\0\0 ]] 00:18:49.552 00:18:49.552 real 0m7.563s 00:18:49.552 user 0m10.822s 00:18:49.552 sys 0m1.022s 00:18:49.552 08:32:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:49.552 08:32:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:49.552 ************************************ 00:18:49.552 END TEST raid_write_error_test 00:18:49.552 ************************************ 00:18:49.552 08:32:01 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:49.552 08:32:01 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in 
raid0 concat raid1 00:18:49.552 08:32:01 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 4 false 00:18:49.552 08:32:01 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:49.552 08:32:01 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:49.552 08:32:01 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:49.552 ************************************ 00:18:49.552 START TEST raid_state_function_test 00:18:49.552 ************************************ 00:18:49.552 08:32:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 false 00:18:49.552 08:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:18:49.552 08:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:18:49.552 08:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:18:49.552 08:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:18:49.552 08:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:18:49.552 08:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:49.552 08:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:18:49.552 08:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:49.552 08:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:49.552 08:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:18:49.552 08:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:49.552 08:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:49.552 08:32:01 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:18:49.552 08:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:49.552 08:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:49.552 08:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:18:49.552 08:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:49.552 08:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:49.552 08:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:49.552 08:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:18:49.552 08:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:18:49.552 08:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:18:49.552 08:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:18:49.552 08:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:18:49.552 08:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:18:49.552 08:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:18:49.552 08:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:18:49.552 08:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:18:49.552 08:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:18:49.552 08:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1487549 00:18:49.552 08:32:01 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1487549' 00:18:49.552 Process raid pid: 1487549 00:18:49.552 08:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:18:49.552 08:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1487549 /var/tmp/spdk-raid.sock 00:18:49.552 08:32:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1487549 ']' 00:18:49.553 08:32:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:49.553 08:32:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:49.553 08:32:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:49.553 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:49.553 08:32:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:49.553 08:32:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:49.553 [2024-07-23 08:32:02.051548] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:18:49.553 [2024-07-23 08:32:02.051644] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:18:49.810 [2024-07-23 08:32:02.177960] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:50.067 [2024-07-23 08:32:02.386655] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:50.326 [2024-07-23 08:32:02.631022] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:50.326 [2024-07-23 08:32:02.631054] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:50.326 08:32:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:50.326 08:32:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:18:50.326 08:32:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:50.584 [2024-07-23 08:32:02.970880] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:50.584 [2024-07-23 08:32:02.970922] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:50.584 [2024-07-23 08:32:02.970936] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:50.584 [2024-07-23 08:32:02.970962] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:50.584 [2024-07-23 08:32:02.970969] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:50.584 [2024-07-23 08:32:02.970978] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:18:50.584 [2024-07-23 08:32:02.970985] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:50.584 [2024-07-23 08:32:02.970993] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:50.584 08:32:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:50.584 08:32:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:50.584 08:32:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:50.584 08:32:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:50.585 08:32:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:50.585 08:32:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:50.585 08:32:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:50.585 08:32:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:50.585 08:32:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:50.585 08:32:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:50.585 08:32:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:50.585 08:32:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:50.842 08:32:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:50.842 "name": "Existed_Raid", 00:18:50.842 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:50.842 "strip_size_kb": 64, 
00:18:50.842 "state": "configuring", 00:18:50.842 "raid_level": "concat", 00:18:50.842 "superblock": false, 00:18:50.842 "num_base_bdevs": 4, 00:18:50.842 "num_base_bdevs_discovered": 0, 00:18:50.842 "num_base_bdevs_operational": 4, 00:18:50.842 "base_bdevs_list": [ 00:18:50.842 { 00:18:50.842 "name": "BaseBdev1", 00:18:50.842 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:50.842 "is_configured": false, 00:18:50.842 "data_offset": 0, 00:18:50.842 "data_size": 0 00:18:50.842 }, 00:18:50.842 { 00:18:50.842 "name": "BaseBdev2", 00:18:50.842 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:50.842 "is_configured": false, 00:18:50.842 "data_offset": 0, 00:18:50.842 "data_size": 0 00:18:50.842 }, 00:18:50.842 { 00:18:50.842 "name": "BaseBdev3", 00:18:50.842 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:50.842 "is_configured": false, 00:18:50.842 "data_offset": 0, 00:18:50.842 "data_size": 0 00:18:50.842 }, 00:18:50.842 { 00:18:50.842 "name": "BaseBdev4", 00:18:50.842 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:50.842 "is_configured": false, 00:18:50.842 "data_offset": 0, 00:18:50.842 "data_size": 0 00:18:50.842 } 00:18:50.842 ] 00:18:50.842 }' 00:18:50.842 08:32:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:50.842 08:32:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:51.099 08:32:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:51.357 [2024-07-23 08:32:03.736775] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:51.357 [2024-07-23 08:32:03.736809] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034580 name Existed_Raid, state configuring 00:18:51.357 08:32:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:51.615 [2024-07-23 08:32:03.905254] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:51.615 [2024-07-23 08:32:03.905298] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:51.615 [2024-07-23 08:32:03.905308] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:51.615 [2024-07-23 08:32:03.905317] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:51.615 [2024-07-23 08:32:03.905323] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:51.615 [2024-07-23 08:32:03.905333] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:51.615 [2024-07-23 08:32:03.905340] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:51.615 [2024-07-23 08:32:03.905348] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:51.615 08:32:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:51.615 [2024-07-23 08:32:04.101835] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:51.615 BaseBdev1 00:18:51.615 08:32:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:18:51.615 08:32:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:51.615 08:32:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:51.615 08:32:04 bdev_raid.raid_state_function_test 
-- common/autotest_common.sh@899 -- # local i 00:18:51.615 08:32:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:51.615 08:32:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:51.615 08:32:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:51.873 08:32:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:52.131 [ 00:18:52.131 { 00:18:52.131 "name": "BaseBdev1", 00:18:52.131 "aliases": [ 00:18:52.131 "f1b7002c-574b-40ba-9388-c351ad91e82d" 00:18:52.131 ], 00:18:52.131 "product_name": "Malloc disk", 00:18:52.131 "block_size": 512, 00:18:52.131 "num_blocks": 65536, 00:18:52.131 "uuid": "f1b7002c-574b-40ba-9388-c351ad91e82d", 00:18:52.131 "assigned_rate_limits": { 00:18:52.131 "rw_ios_per_sec": 0, 00:18:52.131 "rw_mbytes_per_sec": 0, 00:18:52.131 "r_mbytes_per_sec": 0, 00:18:52.131 "w_mbytes_per_sec": 0 00:18:52.131 }, 00:18:52.131 "claimed": true, 00:18:52.131 "claim_type": "exclusive_write", 00:18:52.131 "zoned": false, 00:18:52.131 "supported_io_types": { 00:18:52.131 "read": true, 00:18:52.131 "write": true, 00:18:52.131 "unmap": true, 00:18:52.131 "flush": true, 00:18:52.131 "reset": true, 00:18:52.131 "nvme_admin": false, 00:18:52.131 "nvme_io": false, 00:18:52.131 "nvme_io_md": false, 00:18:52.131 "write_zeroes": true, 00:18:52.131 "zcopy": true, 00:18:52.131 "get_zone_info": false, 00:18:52.131 "zone_management": false, 00:18:52.131 "zone_append": false, 00:18:52.131 "compare": false, 00:18:52.131 "compare_and_write": false, 00:18:52.131 "abort": true, 00:18:52.131 "seek_hole": false, 00:18:52.131 "seek_data": false, 00:18:52.131 "copy": true, 00:18:52.131 "nvme_iov_md": 
false 00:18:52.131 }, 00:18:52.131 "memory_domains": [ 00:18:52.131 { 00:18:52.131 "dma_device_id": "system", 00:18:52.131 "dma_device_type": 1 00:18:52.131 }, 00:18:52.131 { 00:18:52.131 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:52.131 "dma_device_type": 2 00:18:52.131 } 00:18:52.131 ], 00:18:52.131 "driver_specific": {} 00:18:52.131 } 00:18:52.131 ] 00:18:52.131 08:32:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:52.131 08:32:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:52.131 08:32:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:52.131 08:32:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:52.131 08:32:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:52.131 08:32:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:52.131 08:32:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:52.131 08:32:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:52.131 08:32:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:52.131 08:32:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:52.131 08:32:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:52.132 08:32:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:52.132 08:32:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:52.132 08:32:04 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:52.132 "name": "Existed_Raid", 00:18:52.132 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:52.132 "strip_size_kb": 64, 00:18:52.132 "state": "configuring", 00:18:52.132 "raid_level": "concat", 00:18:52.132 "superblock": false, 00:18:52.132 "num_base_bdevs": 4, 00:18:52.132 "num_base_bdevs_discovered": 1, 00:18:52.132 "num_base_bdevs_operational": 4, 00:18:52.132 "base_bdevs_list": [ 00:18:52.132 { 00:18:52.132 "name": "BaseBdev1", 00:18:52.132 "uuid": "f1b7002c-574b-40ba-9388-c351ad91e82d", 00:18:52.132 "is_configured": true, 00:18:52.132 "data_offset": 0, 00:18:52.132 "data_size": 65536 00:18:52.132 }, 00:18:52.132 { 00:18:52.132 "name": "BaseBdev2", 00:18:52.132 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:52.132 "is_configured": false, 00:18:52.132 "data_offset": 0, 00:18:52.132 "data_size": 0 00:18:52.132 }, 00:18:52.132 { 00:18:52.132 "name": "BaseBdev3", 00:18:52.132 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:52.132 "is_configured": false, 00:18:52.132 "data_offset": 0, 00:18:52.132 "data_size": 0 00:18:52.132 }, 00:18:52.132 { 00:18:52.132 "name": "BaseBdev4", 00:18:52.132 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:52.132 "is_configured": false, 00:18:52.132 "data_offset": 0, 00:18:52.132 "data_size": 0 00:18:52.132 } 00:18:52.132 ] 00:18:52.132 }' 00:18:52.132 08:32:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:52.132 08:32:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:52.697 08:32:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:52.954 [2024-07-23 08:32:05.297034] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:52.954 [2024-07-23 08:32:05.297083] bdev_raid.c: 
378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034880 name Existed_Raid, state configuring 00:18:52.954 08:32:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:52.954 [2024-07-23 08:32:05.469533] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:52.954 [2024-07-23 08:32:05.471361] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:52.954 [2024-07-23 08:32:05.471399] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:52.954 [2024-07-23 08:32:05.471410] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:52.954 [2024-07-23 08:32:05.471421] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:52.954 [2024-07-23 08:32:05.471429] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:52.954 [2024-07-23 08:32:05.471442] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:53.212 08:32:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:18:53.212 08:32:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:53.212 08:32:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:53.212 08:32:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:53.212 08:32:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:53.212 08:32:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 
00:18:53.212 08:32:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:53.212 08:32:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:53.212 08:32:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:53.212 08:32:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:53.212 08:32:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:53.212 08:32:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:53.212 08:32:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:53.212 08:32:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:53.212 08:32:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:53.212 "name": "Existed_Raid", 00:18:53.212 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:53.212 "strip_size_kb": 64, 00:18:53.212 "state": "configuring", 00:18:53.212 "raid_level": "concat", 00:18:53.212 "superblock": false, 00:18:53.212 "num_base_bdevs": 4, 00:18:53.212 "num_base_bdevs_discovered": 1, 00:18:53.212 "num_base_bdevs_operational": 4, 00:18:53.212 "base_bdevs_list": [ 00:18:53.212 { 00:18:53.212 "name": "BaseBdev1", 00:18:53.212 "uuid": "f1b7002c-574b-40ba-9388-c351ad91e82d", 00:18:53.212 "is_configured": true, 00:18:53.212 "data_offset": 0, 00:18:53.212 "data_size": 65536 00:18:53.212 }, 00:18:53.212 { 00:18:53.212 "name": "BaseBdev2", 00:18:53.212 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:53.212 "is_configured": false, 00:18:53.212 "data_offset": 0, 00:18:53.212 "data_size": 0 00:18:53.212 }, 00:18:53.212 { 00:18:53.212 "name": "BaseBdev3", 
00:18:53.212 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:53.212 "is_configured": false, 00:18:53.212 "data_offset": 0, 00:18:53.212 "data_size": 0 00:18:53.212 }, 00:18:53.212 { 00:18:53.212 "name": "BaseBdev4", 00:18:53.212 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:53.212 "is_configured": false, 00:18:53.212 "data_offset": 0, 00:18:53.212 "data_size": 0 00:18:53.212 } 00:18:53.212 ] 00:18:53.212 }' 00:18:53.212 08:32:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:53.212 08:32:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:53.777 08:32:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:54.035 [2024-07-23 08:32:06.314104] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:54.035 BaseBdev2 00:18:54.035 08:32:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:18:54.035 08:32:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:54.035 08:32:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:54.035 08:32:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:54.035 08:32:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:54.035 08:32:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:54.035 08:32:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:54.035 08:32:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:54.293 [ 00:18:54.293 { 00:18:54.293 "name": "BaseBdev2", 00:18:54.293 "aliases": [ 00:18:54.293 "b06be9e3-e5da-499f-957b-7bfe9ea51551" 00:18:54.293 ], 00:18:54.293 "product_name": "Malloc disk", 00:18:54.293 "block_size": 512, 00:18:54.293 "num_blocks": 65536, 00:18:54.293 "uuid": "b06be9e3-e5da-499f-957b-7bfe9ea51551", 00:18:54.293 "assigned_rate_limits": { 00:18:54.293 "rw_ios_per_sec": 0, 00:18:54.293 "rw_mbytes_per_sec": 0, 00:18:54.293 "r_mbytes_per_sec": 0, 00:18:54.293 "w_mbytes_per_sec": 0 00:18:54.293 }, 00:18:54.293 "claimed": true, 00:18:54.293 "claim_type": "exclusive_write", 00:18:54.293 "zoned": false, 00:18:54.293 "supported_io_types": { 00:18:54.293 "read": true, 00:18:54.293 "write": true, 00:18:54.293 "unmap": true, 00:18:54.293 "flush": true, 00:18:54.293 "reset": true, 00:18:54.293 "nvme_admin": false, 00:18:54.293 "nvme_io": false, 00:18:54.293 "nvme_io_md": false, 00:18:54.293 "write_zeroes": true, 00:18:54.293 "zcopy": true, 00:18:54.294 "get_zone_info": false, 00:18:54.294 "zone_management": false, 00:18:54.294 "zone_append": false, 00:18:54.294 "compare": false, 00:18:54.294 "compare_and_write": false, 00:18:54.294 "abort": true, 00:18:54.294 "seek_hole": false, 00:18:54.294 "seek_data": false, 00:18:54.294 "copy": true, 00:18:54.294 "nvme_iov_md": false 00:18:54.294 }, 00:18:54.294 "memory_domains": [ 00:18:54.294 { 00:18:54.294 "dma_device_id": "system", 00:18:54.294 "dma_device_type": 1 00:18:54.294 }, 00:18:54.294 { 00:18:54.294 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:54.294 "dma_device_type": 2 00:18:54.294 } 00:18:54.294 ], 00:18:54.294 "driver_specific": {} 00:18:54.294 } 00:18:54.294 ] 00:18:54.294 08:32:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:54.294 08:32:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 
00:18:54.294 08:32:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:54.294 08:32:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:54.294 08:32:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:54.294 08:32:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:54.294 08:32:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:54.294 08:32:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:54.294 08:32:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:54.294 08:32:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:54.294 08:32:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:54.294 08:32:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:54.294 08:32:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:54.294 08:32:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:54.294 08:32:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:54.551 08:32:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:54.551 "name": "Existed_Raid", 00:18:54.551 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:54.551 "strip_size_kb": 64, 00:18:54.551 "state": "configuring", 00:18:54.551 "raid_level": "concat", 00:18:54.551 "superblock": false, 00:18:54.551 "num_base_bdevs": 4, 00:18:54.551 
"num_base_bdevs_discovered": 2, 00:18:54.551 "num_base_bdevs_operational": 4, 00:18:54.551 "base_bdevs_list": [ 00:18:54.551 { 00:18:54.551 "name": "BaseBdev1", 00:18:54.551 "uuid": "f1b7002c-574b-40ba-9388-c351ad91e82d", 00:18:54.551 "is_configured": true, 00:18:54.551 "data_offset": 0, 00:18:54.551 "data_size": 65536 00:18:54.551 }, 00:18:54.551 { 00:18:54.551 "name": "BaseBdev2", 00:18:54.551 "uuid": "b06be9e3-e5da-499f-957b-7bfe9ea51551", 00:18:54.551 "is_configured": true, 00:18:54.551 "data_offset": 0, 00:18:54.551 "data_size": 65536 00:18:54.551 }, 00:18:54.551 { 00:18:54.551 "name": "BaseBdev3", 00:18:54.551 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:54.551 "is_configured": false, 00:18:54.551 "data_offset": 0, 00:18:54.551 "data_size": 0 00:18:54.551 }, 00:18:54.551 { 00:18:54.551 "name": "BaseBdev4", 00:18:54.551 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:54.551 "is_configured": false, 00:18:54.551 "data_offset": 0, 00:18:54.551 "data_size": 0 00:18:54.551 } 00:18:54.551 ] 00:18:54.551 }' 00:18:54.551 08:32:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:54.551 08:32:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:54.809 08:32:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:55.067 [2024-07-23 08:32:07.480669] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:55.067 BaseBdev3 00:18:55.067 08:32:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:18:55.067 08:32:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:55.067 08:32:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:55.067 08:32:07 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:55.067 08:32:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:55.067 08:32:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:55.067 08:32:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:55.323 08:32:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:55.323 [ 00:18:55.323 { 00:18:55.323 "name": "BaseBdev3", 00:18:55.323 "aliases": [ 00:18:55.323 "98c6aa8f-e97d-4acb-a197-2717aa0501c3" 00:18:55.323 ], 00:18:55.323 "product_name": "Malloc disk", 00:18:55.323 "block_size": 512, 00:18:55.323 "num_blocks": 65536, 00:18:55.323 "uuid": "98c6aa8f-e97d-4acb-a197-2717aa0501c3", 00:18:55.323 "assigned_rate_limits": { 00:18:55.323 "rw_ios_per_sec": 0, 00:18:55.323 "rw_mbytes_per_sec": 0, 00:18:55.323 "r_mbytes_per_sec": 0, 00:18:55.323 "w_mbytes_per_sec": 0 00:18:55.323 }, 00:18:55.323 "claimed": true, 00:18:55.323 "claim_type": "exclusive_write", 00:18:55.323 "zoned": false, 00:18:55.323 "supported_io_types": { 00:18:55.323 "read": true, 00:18:55.323 "write": true, 00:18:55.323 "unmap": true, 00:18:55.323 "flush": true, 00:18:55.323 "reset": true, 00:18:55.323 "nvme_admin": false, 00:18:55.323 "nvme_io": false, 00:18:55.323 "nvme_io_md": false, 00:18:55.323 "write_zeroes": true, 00:18:55.323 "zcopy": true, 00:18:55.323 "get_zone_info": false, 00:18:55.323 "zone_management": false, 00:18:55.323 "zone_append": false, 00:18:55.323 "compare": false, 00:18:55.323 "compare_and_write": false, 00:18:55.323 "abort": true, 00:18:55.323 "seek_hole": false, 00:18:55.323 "seek_data": false, 00:18:55.323 "copy": 
true, 00:18:55.323 "nvme_iov_md": false 00:18:55.323 }, 00:18:55.323 "memory_domains": [ 00:18:55.323 { 00:18:55.323 "dma_device_id": "system", 00:18:55.323 "dma_device_type": 1 00:18:55.323 }, 00:18:55.323 { 00:18:55.323 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:55.323 "dma_device_type": 2 00:18:55.323 } 00:18:55.323 ], 00:18:55.323 "driver_specific": {} 00:18:55.323 } 00:18:55.323 ] 00:18:55.323 08:32:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:55.323 08:32:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:55.323 08:32:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:55.323 08:32:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:55.323 08:32:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:55.323 08:32:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:55.323 08:32:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:55.323 08:32:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:55.323 08:32:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:55.323 08:32:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:55.323 08:32:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:55.323 08:32:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:55.323 08:32:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:55.579 08:32:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:55.579 08:32:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:55.579 08:32:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:55.579 "name": "Existed_Raid", 00:18:55.579 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:55.579 "strip_size_kb": 64, 00:18:55.579 "state": "configuring", 00:18:55.579 "raid_level": "concat", 00:18:55.580 "superblock": false, 00:18:55.580 "num_base_bdevs": 4, 00:18:55.580 "num_base_bdevs_discovered": 3, 00:18:55.580 "num_base_bdevs_operational": 4, 00:18:55.580 "base_bdevs_list": [ 00:18:55.580 { 00:18:55.580 "name": "BaseBdev1", 00:18:55.580 "uuid": "f1b7002c-574b-40ba-9388-c351ad91e82d", 00:18:55.580 "is_configured": true, 00:18:55.580 "data_offset": 0, 00:18:55.580 "data_size": 65536 00:18:55.580 }, 00:18:55.580 { 00:18:55.580 "name": "BaseBdev2", 00:18:55.580 "uuid": "b06be9e3-e5da-499f-957b-7bfe9ea51551", 00:18:55.580 "is_configured": true, 00:18:55.580 "data_offset": 0, 00:18:55.580 "data_size": 65536 00:18:55.580 }, 00:18:55.580 { 00:18:55.580 "name": "BaseBdev3", 00:18:55.580 "uuid": "98c6aa8f-e97d-4acb-a197-2717aa0501c3", 00:18:55.580 "is_configured": true, 00:18:55.580 "data_offset": 0, 00:18:55.580 "data_size": 65536 00:18:55.580 }, 00:18:55.580 { 00:18:55.580 "name": "BaseBdev4", 00:18:55.580 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:55.580 "is_configured": false, 00:18:55.580 "data_offset": 0, 00:18:55.580 "data_size": 0 00:18:55.580 } 00:18:55.580 ] 00:18:55.580 }' 00:18:55.580 08:32:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:55.580 08:32:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:56.182 08:32:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:56.182 [2024-07-23 08:32:08.634880] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:56.182 [2024-07-23 08:32:08.634926] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000035180 00:18:56.182 [2024-07-23 08:32:08.634934] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:18:56.182 [2024-07-23 08:32:08.635167] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bf90 00:18:56.182 [2024-07-23 08:32:08.635348] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000035180 00:18:56.182 [2024-07-23 08:32:08.635360] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000035180 00:18:56.182 [2024-07-23 08:32:08.635621] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:56.182 BaseBdev4 00:18:56.182 08:32:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:18:56.182 08:32:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:18:56.183 08:32:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:56.183 08:32:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:56.183 08:32:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:56.183 08:32:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:56.183 08:32:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:56.440 08:32:08 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:56.699 [ 00:18:56.699 { 00:18:56.699 "name": "BaseBdev4", 00:18:56.699 "aliases": [ 00:18:56.699 "cd7ff1b3-3c20-48d7-91f5-de1dc4b28b35" 00:18:56.699 ], 00:18:56.699 "product_name": "Malloc disk", 00:18:56.699 "block_size": 512, 00:18:56.699 "num_blocks": 65536, 00:18:56.699 "uuid": "cd7ff1b3-3c20-48d7-91f5-de1dc4b28b35", 00:18:56.699 "assigned_rate_limits": { 00:18:56.699 "rw_ios_per_sec": 0, 00:18:56.699 "rw_mbytes_per_sec": 0, 00:18:56.699 "r_mbytes_per_sec": 0, 00:18:56.699 "w_mbytes_per_sec": 0 00:18:56.699 }, 00:18:56.699 "claimed": true, 00:18:56.699 "claim_type": "exclusive_write", 00:18:56.699 "zoned": false, 00:18:56.699 "supported_io_types": { 00:18:56.699 "read": true, 00:18:56.699 "write": true, 00:18:56.699 "unmap": true, 00:18:56.699 "flush": true, 00:18:56.699 "reset": true, 00:18:56.699 "nvme_admin": false, 00:18:56.699 "nvme_io": false, 00:18:56.699 "nvme_io_md": false, 00:18:56.699 "write_zeroes": true, 00:18:56.699 "zcopy": true, 00:18:56.699 "get_zone_info": false, 00:18:56.699 "zone_management": false, 00:18:56.699 "zone_append": false, 00:18:56.699 "compare": false, 00:18:56.699 "compare_and_write": false, 00:18:56.699 "abort": true, 00:18:56.699 "seek_hole": false, 00:18:56.699 "seek_data": false, 00:18:56.699 "copy": true, 00:18:56.699 "nvme_iov_md": false 00:18:56.699 }, 00:18:56.699 "memory_domains": [ 00:18:56.699 { 00:18:56.699 "dma_device_id": "system", 00:18:56.699 "dma_device_type": 1 00:18:56.699 }, 00:18:56.699 { 00:18:56.699 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:56.699 "dma_device_type": 2 00:18:56.699 } 00:18:56.699 ], 00:18:56.699 "driver_specific": {} 00:18:56.699 } 00:18:56.699 ] 00:18:56.699 08:32:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:56.699 08:32:08 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:56.699 08:32:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:56.699 08:32:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:18:56.699 08:32:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:56.699 08:32:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:56.699 08:32:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:56.699 08:32:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:56.699 08:32:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:56.699 08:32:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:56.699 08:32:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:56.699 08:32:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:56.699 08:32:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:56.699 08:32:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:56.699 08:32:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:56.699 08:32:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:56.699 "name": "Existed_Raid", 00:18:56.699 "uuid": "2e85a193-106e-4044-8e04-8fd962b2832d", 00:18:56.699 "strip_size_kb": 64, 00:18:56.699 "state": "online", 00:18:56.699 "raid_level": "concat", 00:18:56.699 "superblock": false, 00:18:56.699 
"num_base_bdevs": 4, 00:18:56.699 "num_base_bdevs_discovered": 4, 00:18:56.699 "num_base_bdevs_operational": 4, 00:18:56.699 "base_bdevs_list": [ 00:18:56.699 { 00:18:56.699 "name": "BaseBdev1", 00:18:56.699 "uuid": "f1b7002c-574b-40ba-9388-c351ad91e82d", 00:18:56.699 "is_configured": true, 00:18:56.699 "data_offset": 0, 00:18:56.699 "data_size": 65536 00:18:56.699 }, 00:18:56.699 { 00:18:56.699 "name": "BaseBdev2", 00:18:56.699 "uuid": "b06be9e3-e5da-499f-957b-7bfe9ea51551", 00:18:56.699 "is_configured": true, 00:18:56.699 "data_offset": 0, 00:18:56.699 "data_size": 65536 00:18:56.699 }, 00:18:56.699 { 00:18:56.699 "name": "BaseBdev3", 00:18:56.699 "uuid": "98c6aa8f-e97d-4acb-a197-2717aa0501c3", 00:18:56.699 "is_configured": true, 00:18:56.699 "data_offset": 0, 00:18:56.699 "data_size": 65536 00:18:56.699 }, 00:18:56.699 { 00:18:56.699 "name": "BaseBdev4", 00:18:56.699 "uuid": "cd7ff1b3-3c20-48d7-91f5-de1dc4b28b35", 00:18:56.699 "is_configured": true, 00:18:56.699 "data_offset": 0, 00:18:56.699 "data_size": 65536 00:18:56.699 } 00:18:56.699 ] 00:18:56.699 }' 00:18:56.699 08:32:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:56.699 08:32:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:57.264 08:32:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:18:57.264 08:32:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:57.264 08:32:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:57.264 08:32:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:57.264 08:32:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:57.264 08:32:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:57.264 08:32:09 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:57.264 08:32:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:57.264 [2024-07-23 08:32:09.762219] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:57.264 08:32:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:57.264 "name": "Existed_Raid", 00:18:57.264 "aliases": [ 00:18:57.264 "2e85a193-106e-4044-8e04-8fd962b2832d" 00:18:57.264 ], 00:18:57.264 "product_name": "Raid Volume", 00:18:57.264 "block_size": 512, 00:18:57.264 "num_blocks": 262144, 00:18:57.264 "uuid": "2e85a193-106e-4044-8e04-8fd962b2832d", 00:18:57.264 "assigned_rate_limits": { 00:18:57.264 "rw_ios_per_sec": 0, 00:18:57.264 "rw_mbytes_per_sec": 0, 00:18:57.264 "r_mbytes_per_sec": 0, 00:18:57.264 "w_mbytes_per_sec": 0 00:18:57.264 }, 00:18:57.264 "claimed": false, 00:18:57.264 "zoned": false, 00:18:57.264 "supported_io_types": { 00:18:57.264 "read": true, 00:18:57.264 "write": true, 00:18:57.264 "unmap": true, 00:18:57.264 "flush": true, 00:18:57.264 "reset": true, 00:18:57.264 "nvme_admin": false, 00:18:57.264 "nvme_io": false, 00:18:57.264 "nvme_io_md": false, 00:18:57.264 "write_zeroes": true, 00:18:57.264 "zcopy": false, 00:18:57.264 "get_zone_info": false, 00:18:57.264 "zone_management": false, 00:18:57.264 "zone_append": false, 00:18:57.264 "compare": false, 00:18:57.264 "compare_and_write": false, 00:18:57.264 "abort": false, 00:18:57.264 "seek_hole": false, 00:18:57.264 "seek_data": false, 00:18:57.264 "copy": false, 00:18:57.264 "nvme_iov_md": false 00:18:57.264 }, 00:18:57.264 "memory_domains": [ 00:18:57.264 { 00:18:57.264 "dma_device_id": "system", 00:18:57.264 "dma_device_type": 1 00:18:57.264 }, 00:18:57.264 { 00:18:57.264 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:57.264 
"dma_device_type": 2 00:18:57.264 }, 00:18:57.264 { 00:18:57.264 "dma_device_id": "system", 00:18:57.264 "dma_device_type": 1 00:18:57.264 }, 00:18:57.264 { 00:18:57.264 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:57.264 "dma_device_type": 2 00:18:57.264 }, 00:18:57.264 { 00:18:57.264 "dma_device_id": "system", 00:18:57.264 "dma_device_type": 1 00:18:57.264 }, 00:18:57.264 { 00:18:57.264 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:57.264 "dma_device_type": 2 00:18:57.264 }, 00:18:57.264 { 00:18:57.264 "dma_device_id": "system", 00:18:57.264 "dma_device_type": 1 00:18:57.264 }, 00:18:57.264 { 00:18:57.264 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:57.264 "dma_device_type": 2 00:18:57.264 } 00:18:57.264 ], 00:18:57.264 "driver_specific": { 00:18:57.264 "raid": { 00:18:57.264 "uuid": "2e85a193-106e-4044-8e04-8fd962b2832d", 00:18:57.264 "strip_size_kb": 64, 00:18:57.264 "state": "online", 00:18:57.264 "raid_level": "concat", 00:18:57.264 "superblock": false, 00:18:57.264 "num_base_bdevs": 4, 00:18:57.264 "num_base_bdevs_discovered": 4, 00:18:57.264 "num_base_bdevs_operational": 4, 00:18:57.264 "base_bdevs_list": [ 00:18:57.265 { 00:18:57.265 "name": "BaseBdev1", 00:18:57.265 "uuid": "f1b7002c-574b-40ba-9388-c351ad91e82d", 00:18:57.265 "is_configured": true, 00:18:57.265 "data_offset": 0, 00:18:57.265 "data_size": 65536 00:18:57.265 }, 00:18:57.265 { 00:18:57.265 "name": "BaseBdev2", 00:18:57.265 "uuid": "b06be9e3-e5da-499f-957b-7bfe9ea51551", 00:18:57.265 "is_configured": true, 00:18:57.265 "data_offset": 0, 00:18:57.265 "data_size": 65536 00:18:57.265 }, 00:18:57.265 { 00:18:57.265 "name": "BaseBdev3", 00:18:57.265 "uuid": "98c6aa8f-e97d-4acb-a197-2717aa0501c3", 00:18:57.265 "is_configured": true, 00:18:57.265 "data_offset": 0, 00:18:57.265 "data_size": 65536 00:18:57.265 }, 00:18:57.265 { 00:18:57.265 "name": "BaseBdev4", 00:18:57.265 "uuid": "cd7ff1b3-3c20-48d7-91f5-de1dc4b28b35", 00:18:57.265 "is_configured": true, 00:18:57.265 "data_offset": 0, 
00:18:57.265 "data_size": 65536 00:18:57.265 } 00:18:57.265 ] 00:18:57.265 } 00:18:57.265 } 00:18:57.265 }' 00:18:57.265 08:32:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:57.522 08:32:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:18:57.522 BaseBdev2 00:18:57.522 BaseBdev3 00:18:57.522 BaseBdev4' 00:18:57.522 08:32:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:57.522 08:32:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:18:57.523 08:32:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:57.523 08:32:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:57.523 "name": "BaseBdev1", 00:18:57.523 "aliases": [ 00:18:57.523 "f1b7002c-574b-40ba-9388-c351ad91e82d" 00:18:57.523 ], 00:18:57.523 "product_name": "Malloc disk", 00:18:57.523 "block_size": 512, 00:18:57.523 "num_blocks": 65536, 00:18:57.523 "uuid": "f1b7002c-574b-40ba-9388-c351ad91e82d", 00:18:57.523 "assigned_rate_limits": { 00:18:57.523 "rw_ios_per_sec": 0, 00:18:57.523 "rw_mbytes_per_sec": 0, 00:18:57.523 "r_mbytes_per_sec": 0, 00:18:57.523 "w_mbytes_per_sec": 0 00:18:57.523 }, 00:18:57.523 "claimed": true, 00:18:57.523 "claim_type": "exclusive_write", 00:18:57.523 "zoned": false, 00:18:57.523 "supported_io_types": { 00:18:57.523 "read": true, 00:18:57.523 "write": true, 00:18:57.523 "unmap": true, 00:18:57.523 "flush": true, 00:18:57.523 "reset": true, 00:18:57.523 "nvme_admin": false, 00:18:57.523 "nvme_io": false, 00:18:57.523 "nvme_io_md": false, 00:18:57.523 "write_zeroes": true, 00:18:57.523 "zcopy": true, 00:18:57.523 "get_zone_info": false, 00:18:57.523 "zone_management": 
false, 00:18:57.523 "zone_append": false, 00:18:57.523 "compare": false, 00:18:57.523 "compare_and_write": false, 00:18:57.523 "abort": true, 00:18:57.523 "seek_hole": false, 00:18:57.523 "seek_data": false, 00:18:57.523 "copy": true, 00:18:57.523 "nvme_iov_md": false 00:18:57.523 }, 00:18:57.523 "memory_domains": [ 00:18:57.523 { 00:18:57.523 "dma_device_id": "system", 00:18:57.523 "dma_device_type": 1 00:18:57.523 }, 00:18:57.523 { 00:18:57.523 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:57.523 "dma_device_type": 2 00:18:57.523 } 00:18:57.523 ], 00:18:57.523 "driver_specific": {} 00:18:57.523 }' 00:18:57.523 08:32:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:57.523 08:32:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:57.779 08:32:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:57.780 08:32:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:57.780 08:32:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:57.780 08:32:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:57.780 08:32:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:57.780 08:32:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:57.780 08:32:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:57.780 08:32:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:57.780 08:32:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:58.036 08:32:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:58.036 08:32:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:58.036 08:32:10 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:58.036 08:32:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:58.036 08:32:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:58.036 "name": "BaseBdev2", 00:18:58.036 "aliases": [ 00:18:58.036 "b06be9e3-e5da-499f-957b-7bfe9ea51551" 00:18:58.036 ], 00:18:58.036 "product_name": "Malloc disk", 00:18:58.036 "block_size": 512, 00:18:58.036 "num_blocks": 65536, 00:18:58.036 "uuid": "b06be9e3-e5da-499f-957b-7bfe9ea51551", 00:18:58.036 "assigned_rate_limits": { 00:18:58.036 "rw_ios_per_sec": 0, 00:18:58.036 "rw_mbytes_per_sec": 0, 00:18:58.036 "r_mbytes_per_sec": 0, 00:18:58.036 "w_mbytes_per_sec": 0 00:18:58.036 }, 00:18:58.036 "claimed": true, 00:18:58.036 "claim_type": "exclusive_write", 00:18:58.036 "zoned": false, 00:18:58.036 "supported_io_types": { 00:18:58.036 "read": true, 00:18:58.036 "write": true, 00:18:58.036 "unmap": true, 00:18:58.036 "flush": true, 00:18:58.036 "reset": true, 00:18:58.036 "nvme_admin": false, 00:18:58.036 "nvme_io": false, 00:18:58.036 "nvme_io_md": false, 00:18:58.036 "write_zeroes": true, 00:18:58.036 "zcopy": true, 00:18:58.036 "get_zone_info": false, 00:18:58.036 "zone_management": false, 00:18:58.036 "zone_append": false, 00:18:58.036 "compare": false, 00:18:58.036 "compare_and_write": false, 00:18:58.036 "abort": true, 00:18:58.036 "seek_hole": false, 00:18:58.036 "seek_data": false, 00:18:58.036 "copy": true, 00:18:58.036 "nvme_iov_md": false 00:18:58.036 }, 00:18:58.036 "memory_domains": [ 00:18:58.036 { 00:18:58.036 "dma_device_id": "system", 00:18:58.036 "dma_device_type": 1 00:18:58.036 }, 00:18:58.036 { 00:18:58.036 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:58.036 "dma_device_type": 2 00:18:58.036 } 00:18:58.036 ], 00:18:58.036 "driver_specific": {} 00:18:58.036 
}' 00:18:58.036 08:32:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:58.036 08:32:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:58.294 08:32:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:58.294 08:32:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:58.294 08:32:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:58.294 08:32:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:58.294 08:32:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:58.294 08:32:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:58.294 08:32:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:58.294 08:32:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:58.294 08:32:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:58.294 08:32:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:58.294 08:32:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:58.294 08:32:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:58.294 08:32:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:58.552 08:32:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:58.552 "name": "BaseBdev3", 00:18:58.552 "aliases": [ 00:18:58.552 "98c6aa8f-e97d-4acb-a197-2717aa0501c3" 00:18:58.552 ], 00:18:58.552 "product_name": "Malloc disk", 00:18:58.552 "block_size": 512, 00:18:58.552 "num_blocks": 65536, 
00:18:58.552 "uuid": "98c6aa8f-e97d-4acb-a197-2717aa0501c3", 00:18:58.552 "assigned_rate_limits": { 00:18:58.552 "rw_ios_per_sec": 0, 00:18:58.552 "rw_mbytes_per_sec": 0, 00:18:58.552 "r_mbytes_per_sec": 0, 00:18:58.552 "w_mbytes_per_sec": 0 00:18:58.552 }, 00:18:58.552 "claimed": true, 00:18:58.552 "claim_type": "exclusive_write", 00:18:58.552 "zoned": false, 00:18:58.552 "supported_io_types": { 00:18:58.552 "read": true, 00:18:58.552 "write": true, 00:18:58.552 "unmap": true, 00:18:58.552 "flush": true, 00:18:58.552 "reset": true, 00:18:58.552 "nvme_admin": false, 00:18:58.552 "nvme_io": false, 00:18:58.552 "nvme_io_md": false, 00:18:58.552 "write_zeroes": true, 00:18:58.552 "zcopy": true, 00:18:58.552 "get_zone_info": false, 00:18:58.552 "zone_management": false, 00:18:58.552 "zone_append": false, 00:18:58.552 "compare": false, 00:18:58.552 "compare_and_write": false, 00:18:58.552 "abort": true, 00:18:58.552 "seek_hole": false, 00:18:58.552 "seek_data": false, 00:18:58.552 "copy": true, 00:18:58.552 "nvme_iov_md": false 00:18:58.552 }, 00:18:58.552 "memory_domains": [ 00:18:58.552 { 00:18:58.552 "dma_device_id": "system", 00:18:58.552 "dma_device_type": 1 00:18:58.552 }, 00:18:58.552 { 00:18:58.552 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:58.552 "dma_device_type": 2 00:18:58.552 } 00:18:58.552 ], 00:18:58.552 "driver_specific": {} 00:18:58.552 }' 00:18:58.552 08:32:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:58.552 08:32:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:58.552 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:58.552 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:58.810 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:58.810 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:18:58.810 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:58.810 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:58.810 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:58.810 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:58.810 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:58.810 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:58.810 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:58.810 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:58.810 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:59.067 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:59.067 "name": "BaseBdev4", 00:18:59.067 "aliases": [ 00:18:59.067 "cd7ff1b3-3c20-48d7-91f5-de1dc4b28b35" 00:18:59.067 ], 00:18:59.067 "product_name": "Malloc disk", 00:18:59.067 "block_size": 512, 00:18:59.067 "num_blocks": 65536, 00:18:59.067 "uuid": "cd7ff1b3-3c20-48d7-91f5-de1dc4b28b35", 00:18:59.067 "assigned_rate_limits": { 00:18:59.067 "rw_ios_per_sec": 0, 00:18:59.067 "rw_mbytes_per_sec": 0, 00:18:59.067 "r_mbytes_per_sec": 0, 00:18:59.067 "w_mbytes_per_sec": 0 00:18:59.067 }, 00:18:59.067 "claimed": true, 00:18:59.067 "claim_type": "exclusive_write", 00:18:59.067 "zoned": false, 00:18:59.067 "supported_io_types": { 00:18:59.067 "read": true, 00:18:59.067 "write": true, 00:18:59.067 "unmap": true, 00:18:59.067 "flush": true, 00:18:59.067 "reset": true, 00:18:59.067 "nvme_admin": false, 00:18:59.067 "nvme_io": false, 00:18:59.067 
"nvme_io_md": false, 00:18:59.067 "write_zeroes": true, 00:18:59.067 "zcopy": true, 00:18:59.067 "get_zone_info": false, 00:18:59.067 "zone_management": false, 00:18:59.067 "zone_append": false, 00:18:59.067 "compare": false, 00:18:59.067 "compare_and_write": false, 00:18:59.067 "abort": true, 00:18:59.067 "seek_hole": false, 00:18:59.067 "seek_data": false, 00:18:59.067 "copy": true, 00:18:59.067 "nvme_iov_md": false 00:18:59.067 }, 00:18:59.067 "memory_domains": [ 00:18:59.067 { 00:18:59.067 "dma_device_id": "system", 00:18:59.067 "dma_device_type": 1 00:18:59.067 }, 00:18:59.067 { 00:18:59.067 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:59.067 "dma_device_type": 2 00:18:59.067 } 00:18:59.067 ], 00:18:59.067 "driver_specific": {} 00:18:59.067 }' 00:18:59.067 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:59.067 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:59.067 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:59.067 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:59.067 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:59.324 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:59.324 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:59.324 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:59.324 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:59.325 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:59.325 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:59.325 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 
00:18:59.325 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:59.582 [2024-07-23 08:32:11.899588] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:59.582 [2024-07-23 08:32:11.899623] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:59.582 [2024-07-23 08:32:11.899673] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:59.582 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:18:59.582 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:18:59.582 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:59.582 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:18:59.582 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:18:59.582 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:18:59.582 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:59.582 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:18:59.582 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:59.582 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:59.582 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:59.582 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:59.582 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # 
local num_base_bdevs 00:18:59.582 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:59.582 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:59.582 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:59.582 08:32:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:59.839 08:32:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:59.839 "name": "Existed_Raid", 00:18:59.839 "uuid": "2e85a193-106e-4044-8e04-8fd962b2832d", 00:18:59.839 "strip_size_kb": 64, 00:18:59.839 "state": "offline", 00:18:59.839 "raid_level": "concat", 00:18:59.839 "superblock": false, 00:18:59.839 "num_base_bdevs": 4, 00:18:59.839 "num_base_bdevs_discovered": 3, 00:18:59.839 "num_base_bdevs_operational": 3, 00:18:59.839 "base_bdevs_list": [ 00:18:59.839 { 00:18:59.840 "name": null, 00:18:59.840 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:59.840 "is_configured": false, 00:18:59.840 "data_offset": 0, 00:18:59.840 "data_size": 65536 00:18:59.840 }, 00:18:59.840 { 00:18:59.840 "name": "BaseBdev2", 00:18:59.840 "uuid": "b06be9e3-e5da-499f-957b-7bfe9ea51551", 00:18:59.840 "is_configured": true, 00:18:59.840 "data_offset": 0, 00:18:59.840 "data_size": 65536 00:18:59.840 }, 00:18:59.840 { 00:18:59.840 "name": "BaseBdev3", 00:18:59.840 "uuid": "98c6aa8f-e97d-4acb-a197-2717aa0501c3", 00:18:59.840 "is_configured": true, 00:18:59.840 "data_offset": 0, 00:18:59.840 "data_size": 65536 00:18:59.840 }, 00:18:59.840 { 00:18:59.840 "name": "BaseBdev4", 00:18:59.840 "uuid": "cd7ff1b3-3c20-48d7-91f5-de1dc4b28b35", 00:18:59.840 "is_configured": true, 00:18:59.840 "data_offset": 0, 00:18:59.840 "data_size": 65536 00:18:59.840 } 00:18:59.840 ] 00:18:59.840 }' 
00:18:59.840 08:32:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:59.840 08:32:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:00.097 08:32:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:19:00.097 08:32:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:00.097 08:32:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:00.097 08:32:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:00.355 08:32:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:00.355 08:32:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:00.355 08:32:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:19:00.613 [2024-07-23 08:32:12.904828] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:00.613 08:32:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:00.613 08:32:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:00.613 08:32:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:00.613 08:32:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:00.870 08:32:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:00.870 08:32:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' 
Existed_Raid '!=' Existed_Raid ']' 00:19:00.870 08:32:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:19:00.870 [2024-07-23 08:32:13.325606] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:01.126 08:32:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:01.126 08:32:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:01.126 08:32:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:01.126 08:32:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:01.126 08:32:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:01.126 08:32:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:01.126 08:32:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:19:01.383 [2024-07-23 08:32:13.770206] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:19:01.383 [2024-07-23 08:32:13.770255] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000035180 name Existed_Raid, state offline 00:19:01.383 08:32:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:01.383 08:32:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:01.383 08:32:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:01.383 
08:32:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:19:01.640 08:32:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:19:01.640 08:32:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:19:01.640 08:32:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:19:01.640 08:32:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:19:01.640 08:32:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:01.640 08:32:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:01.896 BaseBdev2 00:19:01.896 08:32:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:19:01.896 08:32:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:01.896 08:32:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:01.896 08:32:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:01.896 08:32:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:01.896 08:32:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:01.897 08:32:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:02.154 08:32:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:02.154 [ 00:19:02.154 { 00:19:02.154 
"name": "BaseBdev2", 00:19:02.154 "aliases": [ 00:19:02.154 "028bb3da-5804-4da1-b46f-a76983714635" 00:19:02.154 ], 00:19:02.154 "product_name": "Malloc disk", 00:19:02.154 "block_size": 512, 00:19:02.154 "num_blocks": 65536, 00:19:02.154 "uuid": "028bb3da-5804-4da1-b46f-a76983714635", 00:19:02.154 "assigned_rate_limits": { 00:19:02.154 "rw_ios_per_sec": 0, 00:19:02.154 "rw_mbytes_per_sec": 0, 00:19:02.154 "r_mbytes_per_sec": 0, 00:19:02.154 "w_mbytes_per_sec": 0 00:19:02.154 }, 00:19:02.154 "claimed": false, 00:19:02.154 "zoned": false, 00:19:02.154 "supported_io_types": { 00:19:02.154 "read": true, 00:19:02.154 "write": true, 00:19:02.154 "unmap": true, 00:19:02.154 "flush": true, 00:19:02.154 "reset": true, 00:19:02.154 "nvme_admin": false, 00:19:02.154 "nvme_io": false, 00:19:02.154 "nvme_io_md": false, 00:19:02.154 "write_zeroes": true, 00:19:02.154 "zcopy": true, 00:19:02.154 "get_zone_info": false, 00:19:02.154 "zone_management": false, 00:19:02.154 "zone_append": false, 00:19:02.154 "compare": false, 00:19:02.154 "compare_and_write": false, 00:19:02.154 "abort": true, 00:19:02.154 "seek_hole": false, 00:19:02.154 "seek_data": false, 00:19:02.154 "copy": true, 00:19:02.154 "nvme_iov_md": false 00:19:02.154 }, 00:19:02.154 "memory_domains": [ 00:19:02.154 { 00:19:02.154 "dma_device_id": "system", 00:19:02.154 "dma_device_type": 1 00:19:02.154 }, 00:19:02.154 { 00:19:02.154 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:02.154 "dma_device_type": 2 00:19:02.154 } 00:19:02.154 ], 00:19:02.154 "driver_specific": {} 00:19:02.154 } 00:19:02.154 ] 00:19:02.154 08:32:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:02.154 08:32:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:02.154 08:32:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:02.154 08:32:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:02.412 BaseBdev3 00:19:02.412 08:32:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:19:02.412 08:32:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:02.412 08:32:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:02.412 08:32:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:02.412 08:32:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:02.412 08:32:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:02.412 08:32:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:02.670 08:32:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:02.670 [ 00:19:02.670 { 00:19:02.670 "name": "BaseBdev3", 00:19:02.670 "aliases": [ 00:19:02.670 "9acfc122-8129-48bf-943b-b948c854331d" 00:19:02.670 ], 00:19:02.670 "product_name": "Malloc disk", 00:19:02.670 "block_size": 512, 00:19:02.670 "num_blocks": 65536, 00:19:02.670 "uuid": "9acfc122-8129-48bf-943b-b948c854331d", 00:19:02.670 "assigned_rate_limits": { 00:19:02.670 "rw_ios_per_sec": 0, 00:19:02.670 "rw_mbytes_per_sec": 0, 00:19:02.670 "r_mbytes_per_sec": 0, 00:19:02.670 "w_mbytes_per_sec": 0 00:19:02.670 }, 00:19:02.670 "claimed": false, 00:19:02.670 "zoned": false, 00:19:02.670 "supported_io_types": { 00:19:02.670 "read": true, 00:19:02.670 "write": true, 00:19:02.670 "unmap": true, 00:19:02.670 "flush": true, 00:19:02.670 
"reset": true, 00:19:02.670 "nvme_admin": false, 00:19:02.670 "nvme_io": false, 00:19:02.670 "nvme_io_md": false, 00:19:02.670 "write_zeroes": true, 00:19:02.670 "zcopy": true, 00:19:02.670 "get_zone_info": false, 00:19:02.670 "zone_management": false, 00:19:02.670 "zone_append": false, 00:19:02.670 "compare": false, 00:19:02.670 "compare_and_write": false, 00:19:02.670 "abort": true, 00:19:02.670 "seek_hole": false, 00:19:02.670 "seek_data": false, 00:19:02.670 "copy": true, 00:19:02.670 "nvme_iov_md": false 00:19:02.670 }, 00:19:02.670 "memory_domains": [ 00:19:02.670 { 00:19:02.670 "dma_device_id": "system", 00:19:02.670 "dma_device_type": 1 00:19:02.670 }, 00:19:02.670 { 00:19:02.670 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:02.670 "dma_device_type": 2 00:19:02.670 } 00:19:02.670 ], 00:19:02.670 "driver_specific": {} 00:19:02.670 } 00:19:02.670 ] 00:19:02.670 08:32:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:02.670 08:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:02.670 08:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:02.670 08:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:02.928 BaseBdev4 00:19:02.928 08:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:19:02.928 08:32:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:19:02.928 08:32:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:02.928 08:32:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:02.928 08:32:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:02.928 08:32:15 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:02.928 08:32:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:03.185 08:32:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:03.443 [ 00:19:03.443 { 00:19:03.443 "name": "BaseBdev4", 00:19:03.443 "aliases": [ 00:19:03.443 "0b185a8d-5905-4a7b-acaa-5794c398c005" 00:19:03.443 ], 00:19:03.443 "product_name": "Malloc disk", 00:19:03.443 "block_size": 512, 00:19:03.443 "num_blocks": 65536, 00:19:03.443 "uuid": "0b185a8d-5905-4a7b-acaa-5794c398c005", 00:19:03.443 "assigned_rate_limits": { 00:19:03.443 "rw_ios_per_sec": 0, 00:19:03.443 "rw_mbytes_per_sec": 0, 00:19:03.443 "r_mbytes_per_sec": 0, 00:19:03.443 "w_mbytes_per_sec": 0 00:19:03.443 }, 00:19:03.443 "claimed": false, 00:19:03.443 "zoned": false, 00:19:03.443 "supported_io_types": { 00:19:03.443 "read": true, 00:19:03.443 "write": true, 00:19:03.443 "unmap": true, 00:19:03.443 "flush": true, 00:19:03.443 "reset": true, 00:19:03.443 "nvme_admin": false, 00:19:03.443 "nvme_io": false, 00:19:03.443 "nvme_io_md": false, 00:19:03.443 "write_zeroes": true, 00:19:03.443 "zcopy": true, 00:19:03.443 "get_zone_info": false, 00:19:03.443 "zone_management": false, 00:19:03.443 "zone_append": false, 00:19:03.443 "compare": false, 00:19:03.443 "compare_and_write": false, 00:19:03.443 "abort": true, 00:19:03.443 "seek_hole": false, 00:19:03.443 "seek_data": false, 00:19:03.443 "copy": true, 00:19:03.443 "nvme_iov_md": false 00:19:03.443 }, 00:19:03.443 "memory_domains": [ 00:19:03.443 { 00:19:03.443 "dma_device_id": "system", 00:19:03.443 "dma_device_type": 1 00:19:03.443 }, 00:19:03.443 { 00:19:03.443 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:19:03.443 "dma_device_type": 2 00:19:03.443 } 00:19:03.443 ], 00:19:03.443 "driver_specific": {} 00:19:03.443 } 00:19:03.443 ] 00:19:03.443 08:32:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:03.443 08:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:03.443 08:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:03.443 08:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:03.443 [2024-07-23 08:32:15.873482] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:03.443 [2024-07-23 08:32:15.873520] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:03.443 [2024-07-23 08:32:15.873544] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:03.443 [2024-07-23 08:32:15.875182] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:03.443 [2024-07-23 08:32:15.875230] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:03.443 08:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:03.443 08:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:03.443 08:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:03.443 08:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:03.443 08:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:03.443 
08:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:03.443 08:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:03.443 08:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:03.443 08:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:03.443 08:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:03.443 08:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:03.443 08:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:03.701 08:32:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:03.701 "name": "Existed_Raid", 00:19:03.701 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:03.701 "strip_size_kb": 64, 00:19:03.701 "state": "configuring", 00:19:03.701 "raid_level": "concat", 00:19:03.701 "superblock": false, 00:19:03.701 "num_base_bdevs": 4, 00:19:03.701 "num_base_bdevs_discovered": 3, 00:19:03.701 "num_base_bdevs_operational": 4, 00:19:03.701 "base_bdevs_list": [ 00:19:03.701 { 00:19:03.701 "name": "BaseBdev1", 00:19:03.701 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:03.701 "is_configured": false, 00:19:03.701 "data_offset": 0, 00:19:03.701 "data_size": 0 00:19:03.701 }, 00:19:03.701 { 00:19:03.701 "name": "BaseBdev2", 00:19:03.701 "uuid": "028bb3da-5804-4da1-b46f-a76983714635", 00:19:03.701 "is_configured": true, 00:19:03.701 "data_offset": 0, 00:19:03.701 "data_size": 65536 00:19:03.701 }, 00:19:03.701 { 00:19:03.701 "name": "BaseBdev3", 00:19:03.701 "uuid": "9acfc122-8129-48bf-943b-b948c854331d", 00:19:03.701 "is_configured": true, 00:19:03.701 "data_offset": 
0, 00:19:03.701 "data_size": 65536 00:19:03.701 }, 00:19:03.701 { 00:19:03.701 "name": "BaseBdev4", 00:19:03.701 "uuid": "0b185a8d-5905-4a7b-acaa-5794c398c005", 00:19:03.701 "is_configured": true, 00:19:03.701 "data_offset": 0, 00:19:03.701 "data_size": 65536 00:19:03.701 } 00:19:03.701 ] 00:19:03.701 }' 00:19:03.701 08:32:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:03.701 08:32:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:04.266 08:32:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:19:04.266 [2024-07-23 08:32:16.695639] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:04.266 08:32:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:04.266 08:32:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:04.266 08:32:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:04.266 08:32:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:04.266 08:32:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:04.266 08:32:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:04.266 08:32:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:04.266 08:32:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:04.266 08:32:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:04.266 08:32:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:19:04.266 08:32:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:04.266 08:32:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:04.524 08:32:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:04.524 "name": "Existed_Raid", 00:19:04.524 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:04.524 "strip_size_kb": 64, 00:19:04.524 "state": "configuring", 00:19:04.524 "raid_level": "concat", 00:19:04.524 "superblock": false, 00:19:04.524 "num_base_bdevs": 4, 00:19:04.524 "num_base_bdevs_discovered": 2, 00:19:04.524 "num_base_bdevs_operational": 4, 00:19:04.524 "base_bdevs_list": [ 00:19:04.524 { 00:19:04.524 "name": "BaseBdev1", 00:19:04.524 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:04.524 "is_configured": false, 00:19:04.524 "data_offset": 0, 00:19:04.524 "data_size": 0 00:19:04.524 }, 00:19:04.524 { 00:19:04.524 "name": null, 00:19:04.524 "uuid": "028bb3da-5804-4da1-b46f-a76983714635", 00:19:04.524 "is_configured": false, 00:19:04.524 "data_offset": 0, 00:19:04.524 "data_size": 65536 00:19:04.524 }, 00:19:04.524 { 00:19:04.524 "name": "BaseBdev3", 00:19:04.524 "uuid": "9acfc122-8129-48bf-943b-b948c854331d", 00:19:04.524 "is_configured": true, 00:19:04.524 "data_offset": 0, 00:19:04.524 "data_size": 65536 00:19:04.524 }, 00:19:04.524 { 00:19:04.524 "name": "BaseBdev4", 00:19:04.524 "uuid": "0b185a8d-5905-4a7b-acaa-5794c398c005", 00:19:04.524 "is_configured": true, 00:19:04.524 "data_offset": 0, 00:19:04.524 "data_size": 65536 00:19:04.524 } 00:19:04.524 ] 00:19:04.524 }' 00:19:04.524 08:32:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:04.524 08:32:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:05.088 08:32:17 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:05.088 08:32:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:05.088 08:32:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:19:05.088 08:32:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:05.345 [2024-07-23 08:32:17.759509] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:05.345 BaseBdev1 00:19:05.345 08:32:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:19:05.345 08:32:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:05.345 08:32:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:05.345 08:32:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:05.345 08:32:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:05.345 08:32:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:05.345 08:32:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:05.602 08:32:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:05.602 [ 00:19:05.602 { 00:19:05.602 "name": "BaseBdev1", 00:19:05.602 "aliases": [ 00:19:05.602 
"86efaa74-df99-4710-bca9-f9177c1c5388" 00:19:05.602 ], 00:19:05.602 "product_name": "Malloc disk", 00:19:05.602 "block_size": 512, 00:19:05.602 "num_blocks": 65536, 00:19:05.602 "uuid": "86efaa74-df99-4710-bca9-f9177c1c5388", 00:19:05.602 "assigned_rate_limits": { 00:19:05.602 "rw_ios_per_sec": 0, 00:19:05.602 "rw_mbytes_per_sec": 0, 00:19:05.602 "r_mbytes_per_sec": 0, 00:19:05.602 "w_mbytes_per_sec": 0 00:19:05.602 }, 00:19:05.602 "claimed": true, 00:19:05.602 "claim_type": "exclusive_write", 00:19:05.602 "zoned": false, 00:19:05.602 "supported_io_types": { 00:19:05.602 "read": true, 00:19:05.602 "write": true, 00:19:05.602 "unmap": true, 00:19:05.602 "flush": true, 00:19:05.602 "reset": true, 00:19:05.602 "nvme_admin": false, 00:19:05.602 "nvme_io": false, 00:19:05.602 "nvme_io_md": false, 00:19:05.602 "write_zeroes": true, 00:19:05.602 "zcopy": true, 00:19:05.602 "get_zone_info": false, 00:19:05.602 "zone_management": false, 00:19:05.602 "zone_append": false, 00:19:05.602 "compare": false, 00:19:05.602 "compare_and_write": false, 00:19:05.602 "abort": true, 00:19:05.602 "seek_hole": false, 00:19:05.602 "seek_data": false, 00:19:05.602 "copy": true, 00:19:05.602 "nvme_iov_md": false 00:19:05.602 }, 00:19:05.602 "memory_domains": [ 00:19:05.602 { 00:19:05.602 "dma_device_id": "system", 00:19:05.602 "dma_device_type": 1 00:19:05.602 }, 00:19:05.602 { 00:19:05.602 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:05.602 "dma_device_type": 2 00:19:05.602 } 00:19:05.602 ], 00:19:05.602 "driver_specific": {} 00:19:05.602 } 00:19:05.602 ] 00:19:05.860 08:32:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:05.860 08:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:05.860 08:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:05.860 08:32:18 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:05.860 08:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:05.860 08:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:05.860 08:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:05.860 08:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:05.860 08:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:05.860 08:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:05.860 08:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:05.860 08:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:05.860 08:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:05.860 08:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:05.860 "name": "Existed_Raid", 00:19:05.860 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:05.860 "strip_size_kb": 64, 00:19:05.860 "state": "configuring", 00:19:05.860 "raid_level": "concat", 00:19:05.860 "superblock": false, 00:19:05.860 "num_base_bdevs": 4, 00:19:05.860 "num_base_bdevs_discovered": 3, 00:19:05.860 "num_base_bdevs_operational": 4, 00:19:05.860 "base_bdevs_list": [ 00:19:05.860 { 00:19:05.860 "name": "BaseBdev1", 00:19:05.860 "uuid": "86efaa74-df99-4710-bca9-f9177c1c5388", 00:19:05.860 "is_configured": true, 00:19:05.860 "data_offset": 0, 00:19:05.860 "data_size": 65536 00:19:05.860 }, 00:19:05.860 { 00:19:05.860 "name": null, 00:19:05.860 "uuid": "028bb3da-5804-4da1-b46f-a76983714635", 
00:19:05.860 "is_configured": false, 00:19:05.860 "data_offset": 0, 00:19:05.860 "data_size": 65536 00:19:05.860 }, 00:19:05.860 { 00:19:05.860 "name": "BaseBdev3", 00:19:05.860 "uuid": "9acfc122-8129-48bf-943b-b948c854331d", 00:19:05.860 "is_configured": true, 00:19:05.860 "data_offset": 0, 00:19:05.860 "data_size": 65536 00:19:05.860 }, 00:19:05.860 { 00:19:05.860 "name": "BaseBdev4", 00:19:05.860 "uuid": "0b185a8d-5905-4a7b-acaa-5794c398c005", 00:19:05.860 "is_configured": true, 00:19:05.860 "data_offset": 0, 00:19:05.860 "data_size": 65536 00:19:05.860 } 00:19:05.860 ] 00:19:05.860 }' 00:19:05.860 08:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:05.860 08:32:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:06.425 08:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:06.425 08:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:06.683 08:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:19:06.683 08:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:19:06.683 [2024-07-23 08:32:19.123158] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:06.683 08:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:06.683 08:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:06.683 08:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:06.683 08:32:19 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:06.683 08:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:06.683 08:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:06.683 08:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:06.683 08:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:06.683 08:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:06.683 08:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:06.683 08:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:06.683 08:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:06.941 08:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:06.941 "name": "Existed_Raid", 00:19:06.941 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:06.941 "strip_size_kb": 64, 00:19:06.941 "state": "configuring", 00:19:06.941 "raid_level": "concat", 00:19:06.941 "superblock": false, 00:19:06.941 "num_base_bdevs": 4, 00:19:06.941 "num_base_bdevs_discovered": 2, 00:19:06.941 "num_base_bdevs_operational": 4, 00:19:06.941 "base_bdevs_list": [ 00:19:06.941 { 00:19:06.941 "name": "BaseBdev1", 00:19:06.941 "uuid": "86efaa74-df99-4710-bca9-f9177c1c5388", 00:19:06.941 "is_configured": true, 00:19:06.941 "data_offset": 0, 00:19:06.941 "data_size": 65536 00:19:06.941 }, 00:19:06.941 { 00:19:06.941 "name": null, 00:19:06.941 "uuid": "028bb3da-5804-4da1-b46f-a76983714635", 00:19:06.941 "is_configured": false, 00:19:06.941 "data_offset": 0, 00:19:06.941 
"data_size": 65536 00:19:06.941 }, 00:19:06.941 { 00:19:06.941 "name": null, 00:19:06.941 "uuid": "9acfc122-8129-48bf-943b-b948c854331d", 00:19:06.941 "is_configured": false, 00:19:06.941 "data_offset": 0, 00:19:06.941 "data_size": 65536 00:19:06.941 }, 00:19:06.941 { 00:19:06.941 "name": "BaseBdev4", 00:19:06.941 "uuid": "0b185a8d-5905-4a7b-acaa-5794c398c005", 00:19:06.942 "is_configured": true, 00:19:06.942 "data_offset": 0, 00:19:06.942 "data_size": 65536 00:19:06.942 } 00:19:06.942 ] 00:19:06.942 }' 00:19:06.942 08:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:06.942 08:32:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:07.507 08:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:07.507 08:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:07.507 08:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:19:07.507 08:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:19:07.764 [2024-07-23 08:32:20.117814] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:07.764 08:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:07.764 08:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:07.764 08:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:07.764 08:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=concat 00:19:07.764 08:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:07.764 08:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:07.764 08:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:07.764 08:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:07.764 08:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:07.764 08:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:07.764 08:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:07.764 08:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:08.023 08:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:08.023 "name": "Existed_Raid", 00:19:08.023 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:08.023 "strip_size_kb": 64, 00:19:08.023 "state": "configuring", 00:19:08.023 "raid_level": "concat", 00:19:08.023 "superblock": false, 00:19:08.023 "num_base_bdevs": 4, 00:19:08.023 "num_base_bdevs_discovered": 3, 00:19:08.023 "num_base_bdevs_operational": 4, 00:19:08.023 "base_bdevs_list": [ 00:19:08.023 { 00:19:08.023 "name": "BaseBdev1", 00:19:08.023 "uuid": "86efaa74-df99-4710-bca9-f9177c1c5388", 00:19:08.023 "is_configured": true, 00:19:08.023 "data_offset": 0, 00:19:08.023 "data_size": 65536 00:19:08.023 }, 00:19:08.023 { 00:19:08.023 "name": null, 00:19:08.023 "uuid": "028bb3da-5804-4da1-b46f-a76983714635", 00:19:08.023 "is_configured": false, 00:19:08.023 "data_offset": 0, 00:19:08.023 "data_size": 65536 00:19:08.023 }, 00:19:08.023 { 00:19:08.023 "name": 
"BaseBdev3", 00:19:08.023 "uuid": "9acfc122-8129-48bf-943b-b948c854331d", 00:19:08.023 "is_configured": true, 00:19:08.023 "data_offset": 0, 00:19:08.023 "data_size": 65536 00:19:08.023 }, 00:19:08.023 { 00:19:08.023 "name": "BaseBdev4", 00:19:08.023 "uuid": "0b185a8d-5905-4a7b-acaa-5794c398c005", 00:19:08.023 "is_configured": true, 00:19:08.023 "data_offset": 0, 00:19:08.023 "data_size": 65536 00:19:08.023 } 00:19:08.023 ] 00:19:08.023 }' 00:19:08.023 08:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:08.023 08:32:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:08.299 08:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:08.299 08:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:08.569 08:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:19:08.569 08:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:08.828 [2024-07-23 08:32:21.120464] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:08.828 08:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:08.828 08:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:08.828 08:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:08.828 08:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:08.828 08:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:19:08.828 08:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:08.828 08:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:08.828 08:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:08.828 08:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:08.828 08:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:08.828 08:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:08.828 08:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:09.087 08:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:09.087 "name": "Existed_Raid", 00:19:09.087 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:09.087 "strip_size_kb": 64, 00:19:09.087 "state": "configuring", 00:19:09.087 "raid_level": "concat", 00:19:09.087 "superblock": false, 00:19:09.087 "num_base_bdevs": 4, 00:19:09.087 "num_base_bdevs_discovered": 2, 00:19:09.087 "num_base_bdevs_operational": 4, 00:19:09.087 "base_bdevs_list": [ 00:19:09.087 { 00:19:09.087 "name": null, 00:19:09.087 "uuid": "86efaa74-df99-4710-bca9-f9177c1c5388", 00:19:09.087 "is_configured": false, 00:19:09.087 "data_offset": 0, 00:19:09.087 "data_size": 65536 00:19:09.087 }, 00:19:09.087 { 00:19:09.087 "name": null, 00:19:09.087 "uuid": "028bb3da-5804-4da1-b46f-a76983714635", 00:19:09.087 "is_configured": false, 00:19:09.087 "data_offset": 0, 00:19:09.087 "data_size": 65536 00:19:09.087 }, 00:19:09.087 { 00:19:09.087 "name": "BaseBdev3", 00:19:09.087 "uuid": "9acfc122-8129-48bf-943b-b948c854331d", 00:19:09.087 "is_configured": true, 
00:19:09.087 "data_offset": 0, 00:19:09.087 "data_size": 65536 00:19:09.087 }, 00:19:09.087 { 00:19:09.087 "name": "BaseBdev4", 00:19:09.087 "uuid": "0b185a8d-5905-4a7b-acaa-5794c398c005", 00:19:09.087 "is_configured": true, 00:19:09.087 "data_offset": 0, 00:19:09.087 "data_size": 65536 00:19:09.087 } 00:19:09.087 ] 00:19:09.087 }' 00:19:09.087 08:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:09.087 08:32:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:09.654 08:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:09.654 08:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:09.654 08:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:19:09.654 08:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:19:09.912 [2024-07-23 08:32:22.218032] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:09.912 08:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:09.912 08:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:09.912 08:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:09.912 08:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:09.912 08:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:09.912 08:32:22 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:09.912 08:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:09.912 08:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:09.912 08:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:09.912 08:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:09.912 08:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:09.912 08:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:09.912 08:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:09.912 "name": "Existed_Raid", 00:19:09.912 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:09.912 "strip_size_kb": 64, 00:19:09.912 "state": "configuring", 00:19:09.912 "raid_level": "concat", 00:19:09.912 "superblock": false, 00:19:09.912 "num_base_bdevs": 4, 00:19:09.912 "num_base_bdevs_discovered": 3, 00:19:09.912 "num_base_bdevs_operational": 4, 00:19:09.912 "base_bdevs_list": [ 00:19:09.912 { 00:19:09.912 "name": null, 00:19:09.912 "uuid": "86efaa74-df99-4710-bca9-f9177c1c5388", 00:19:09.912 "is_configured": false, 00:19:09.912 "data_offset": 0, 00:19:09.912 "data_size": 65536 00:19:09.912 }, 00:19:09.912 { 00:19:09.912 "name": "BaseBdev2", 00:19:09.912 "uuid": "028bb3da-5804-4da1-b46f-a76983714635", 00:19:09.912 "is_configured": true, 00:19:09.912 "data_offset": 0, 00:19:09.912 "data_size": 65536 00:19:09.912 }, 00:19:09.912 { 00:19:09.912 "name": "BaseBdev3", 00:19:09.912 "uuid": "9acfc122-8129-48bf-943b-b948c854331d", 00:19:09.912 "is_configured": true, 00:19:09.912 "data_offset": 0, 00:19:09.912 "data_size": 65536 00:19:09.912 
}, 00:19:09.912 { 00:19:09.912 "name": "BaseBdev4", 00:19:09.912 "uuid": "0b185a8d-5905-4a7b-acaa-5794c398c005", 00:19:09.912 "is_configured": true, 00:19:09.912 "data_offset": 0, 00:19:09.912 "data_size": 65536 00:19:09.912 } 00:19:09.912 ] 00:19:09.912 }' 00:19:09.912 08:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:09.912 08:32:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:10.479 08:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:10.479 08:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:10.737 08:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:19:10.737 08:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:19:10.737 08:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:10.737 08:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 86efaa74-df99-4710-bca9-f9177c1c5388 00:19:10.996 [2024-07-23 08:32:23.400378] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:19:10.996 [2024-07-23 08:32:23.400421] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000037280 00:19:10.997 [2024-07-23 08:32:23.400428] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:19:10.997 [2024-07-23 08:32:23.400671] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c3a0 
00:19:10.997 [2024-07-23 08:32:23.400843] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000037280 00:19:10.997 [2024-07-23 08:32:23.400855] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000037280 00:19:10.997 [2024-07-23 08:32:23.401118] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:10.997 NewBaseBdev 00:19:10.997 08:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:19:10.997 08:32:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:19:10.997 08:32:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:10.997 08:32:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:10.997 08:32:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:10.997 08:32:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:10.997 08:32:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:11.255 08:32:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:19:11.255 [ 00:19:11.255 { 00:19:11.255 "name": "NewBaseBdev", 00:19:11.255 "aliases": [ 00:19:11.255 "86efaa74-df99-4710-bca9-f9177c1c5388" 00:19:11.255 ], 00:19:11.255 "product_name": "Malloc disk", 00:19:11.255 "block_size": 512, 00:19:11.256 "num_blocks": 65536, 00:19:11.256 "uuid": "86efaa74-df99-4710-bca9-f9177c1c5388", 00:19:11.256 "assigned_rate_limits": { 00:19:11.256 "rw_ios_per_sec": 0, 00:19:11.256 "rw_mbytes_per_sec": 0, 00:19:11.256 
"r_mbytes_per_sec": 0, 00:19:11.256 "w_mbytes_per_sec": 0 00:19:11.256 }, 00:19:11.256 "claimed": true, 00:19:11.256 "claim_type": "exclusive_write", 00:19:11.256 "zoned": false, 00:19:11.256 "supported_io_types": { 00:19:11.256 "read": true, 00:19:11.256 "write": true, 00:19:11.256 "unmap": true, 00:19:11.256 "flush": true, 00:19:11.256 "reset": true, 00:19:11.256 "nvme_admin": false, 00:19:11.256 "nvme_io": false, 00:19:11.256 "nvme_io_md": false, 00:19:11.256 "write_zeroes": true, 00:19:11.256 "zcopy": true, 00:19:11.256 "get_zone_info": false, 00:19:11.256 "zone_management": false, 00:19:11.256 "zone_append": false, 00:19:11.256 "compare": false, 00:19:11.256 "compare_and_write": false, 00:19:11.256 "abort": true, 00:19:11.256 "seek_hole": false, 00:19:11.256 "seek_data": false, 00:19:11.256 "copy": true, 00:19:11.256 "nvme_iov_md": false 00:19:11.256 }, 00:19:11.256 "memory_domains": [ 00:19:11.256 { 00:19:11.256 "dma_device_id": "system", 00:19:11.256 "dma_device_type": 1 00:19:11.256 }, 00:19:11.256 { 00:19:11.256 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:11.256 "dma_device_type": 2 00:19:11.256 } 00:19:11.256 ], 00:19:11.256 "driver_specific": {} 00:19:11.256 } 00:19:11.256 ] 00:19:11.256 08:32:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:11.256 08:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:19:11.256 08:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:11.256 08:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:11.256 08:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:11.256 08:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:11.256 08:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # 
local num_base_bdevs_operational=4 00:19:11.256 08:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:11.256 08:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:11.256 08:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:11.256 08:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:11.256 08:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:11.256 08:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:11.514 08:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:11.514 "name": "Existed_Raid", 00:19:11.514 "uuid": "6b5391ca-72d2-4fa7-898d-bec70256bb41", 00:19:11.514 "strip_size_kb": 64, 00:19:11.514 "state": "online", 00:19:11.514 "raid_level": "concat", 00:19:11.514 "superblock": false, 00:19:11.514 "num_base_bdevs": 4, 00:19:11.515 "num_base_bdevs_discovered": 4, 00:19:11.515 "num_base_bdevs_operational": 4, 00:19:11.515 "base_bdevs_list": [ 00:19:11.515 { 00:19:11.515 "name": "NewBaseBdev", 00:19:11.515 "uuid": "86efaa74-df99-4710-bca9-f9177c1c5388", 00:19:11.515 "is_configured": true, 00:19:11.515 "data_offset": 0, 00:19:11.515 "data_size": 65536 00:19:11.515 }, 00:19:11.515 { 00:19:11.515 "name": "BaseBdev2", 00:19:11.515 "uuid": "028bb3da-5804-4da1-b46f-a76983714635", 00:19:11.515 "is_configured": true, 00:19:11.515 "data_offset": 0, 00:19:11.515 "data_size": 65536 00:19:11.515 }, 00:19:11.515 { 00:19:11.515 "name": "BaseBdev3", 00:19:11.515 "uuid": "9acfc122-8129-48bf-943b-b948c854331d", 00:19:11.515 "is_configured": true, 00:19:11.515 "data_offset": 0, 00:19:11.515 "data_size": 65536 00:19:11.515 }, 00:19:11.515 { 
00:19:11.515 "name": "BaseBdev4", 00:19:11.515 "uuid": "0b185a8d-5905-4a7b-acaa-5794c398c005", 00:19:11.515 "is_configured": true, 00:19:11.515 "data_offset": 0, 00:19:11.515 "data_size": 65536 00:19:11.515 } 00:19:11.515 ] 00:19:11.515 }' 00:19:11.515 08:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:11.515 08:32:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:12.081 08:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:19:12.081 08:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:12.081 08:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:12.081 08:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:12.081 08:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:12.081 08:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:12.081 08:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:12.082 08:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:12.082 [2024-07-23 08:32:24.555763] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:12.082 08:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:12.082 "name": "Existed_Raid", 00:19:12.082 "aliases": [ 00:19:12.082 "6b5391ca-72d2-4fa7-898d-bec70256bb41" 00:19:12.082 ], 00:19:12.082 "product_name": "Raid Volume", 00:19:12.082 "block_size": 512, 00:19:12.082 "num_blocks": 262144, 00:19:12.082 "uuid": "6b5391ca-72d2-4fa7-898d-bec70256bb41", 00:19:12.082 "assigned_rate_limits": { 00:19:12.082 
"rw_ios_per_sec": 0, 00:19:12.082 "rw_mbytes_per_sec": 0, 00:19:12.082 "r_mbytes_per_sec": 0, 00:19:12.082 "w_mbytes_per_sec": 0 00:19:12.082 }, 00:19:12.082 "claimed": false, 00:19:12.082 "zoned": false, 00:19:12.082 "supported_io_types": { 00:19:12.082 "read": true, 00:19:12.082 "write": true, 00:19:12.082 "unmap": true, 00:19:12.082 "flush": true, 00:19:12.082 "reset": true, 00:19:12.082 "nvme_admin": false, 00:19:12.082 "nvme_io": false, 00:19:12.082 "nvme_io_md": false, 00:19:12.082 "write_zeroes": true, 00:19:12.082 "zcopy": false, 00:19:12.082 "get_zone_info": false, 00:19:12.082 "zone_management": false, 00:19:12.082 "zone_append": false, 00:19:12.082 "compare": false, 00:19:12.082 "compare_and_write": false, 00:19:12.082 "abort": false, 00:19:12.082 "seek_hole": false, 00:19:12.082 "seek_data": false, 00:19:12.082 "copy": false, 00:19:12.082 "nvme_iov_md": false 00:19:12.082 }, 00:19:12.082 "memory_domains": [ 00:19:12.082 { 00:19:12.082 "dma_device_id": "system", 00:19:12.082 "dma_device_type": 1 00:19:12.082 }, 00:19:12.082 { 00:19:12.082 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:12.082 "dma_device_type": 2 00:19:12.082 }, 00:19:12.082 { 00:19:12.082 "dma_device_id": "system", 00:19:12.082 "dma_device_type": 1 00:19:12.082 }, 00:19:12.082 { 00:19:12.082 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:12.082 "dma_device_type": 2 00:19:12.082 }, 00:19:12.082 { 00:19:12.082 "dma_device_id": "system", 00:19:12.082 "dma_device_type": 1 00:19:12.082 }, 00:19:12.082 { 00:19:12.082 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:12.082 "dma_device_type": 2 00:19:12.082 }, 00:19:12.082 { 00:19:12.082 "dma_device_id": "system", 00:19:12.082 "dma_device_type": 1 00:19:12.082 }, 00:19:12.082 { 00:19:12.082 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:12.082 "dma_device_type": 2 00:19:12.082 } 00:19:12.082 ], 00:19:12.082 "driver_specific": { 00:19:12.082 "raid": { 00:19:12.082 "uuid": "6b5391ca-72d2-4fa7-898d-bec70256bb41", 00:19:12.082 "strip_size_kb": 64, 
00:19:12.082 "state": "online", 00:19:12.082 "raid_level": "concat", 00:19:12.082 "superblock": false, 00:19:12.082 "num_base_bdevs": 4, 00:19:12.082 "num_base_bdevs_discovered": 4, 00:19:12.082 "num_base_bdevs_operational": 4, 00:19:12.082 "base_bdevs_list": [ 00:19:12.082 { 00:19:12.082 "name": "NewBaseBdev", 00:19:12.082 "uuid": "86efaa74-df99-4710-bca9-f9177c1c5388", 00:19:12.082 "is_configured": true, 00:19:12.082 "data_offset": 0, 00:19:12.082 "data_size": 65536 00:19:12.082 }, 00:19:12.082 { 00:19:12.082 "name": "BaseBdev2", 00:19:12.082 "uuid": "028bb3da-5804-4da1-b46f-a76983714635", 00:19:12.082 "is_configured": true, 00:19:12.082 "data_offset": 0, 00:19:12.082 "data_size": 65536 00:19:12.082 }, 00:19:12.082 { 00:19:12.082 "name": "BaseBdev3", 00:19:12.082 "uuid": "9acfc122-8129-48bf-943b-b948c854331d", 00:19:12.082 "is_configured": true, 00:19:12.082 "data_offset": 0, 00:19:12.082 "data_size": 65536 00:19:12.082 }, 00:19:12.082 { 00:19:12.082 "name": "BaseBdev4", 00:19:12.082 "uuid": "0b185a8d-5905-4a7b-acaa-5794c398c005", 00:19:12.082 "is_configured": true, 00:19:12.082 "data_offset": 0, 00:19:12.082 "data_size": 65536 00:19:12.082 } 00:19:12.082 ] 00:19:12.082 } 00:19:12.082 } 00:19:12.082 }' 00:19:12.082 08:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:12.341 08:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:19:12.341 BaseBdev2 00:19:12.341 BaseBdev3 00:19:12.341 BaseBdev4' 00:19:12.341 08:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:12.341 08:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:12.341 08:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
NewBaseBdev 00:19:12.341 08:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:12.341 "name": "NewBaseBdev", 00:19:12.341 "aliases": [ 00:19:12.341 "86efaa74-df99-4710-bca9-f9177c1c5388" 00:19:12.341 ], 00:19:12.341 "product_name": "Malloc disk", 00:19:12.341 "block_size": 512, 00:19:12.341 "num_blocks": 65536, 00:19:12.341 "uuid": "86efaa74-df99-4710-bca9-f9177c1c5388", 00:19:12.341 "assigned_rate_limits": { 00:19:12.341 "rw_ios_per_sec": 0, 00:19:12.341 "rw_mbytes_per_sec": 0, 00:19:12.341 "r_mbytes_per_sec": 0, 00:19:12.341 "w_mbytes_per_sec": 0 00:19:12.341 }, 00:19:12.341 "claimed": true, 00:19:12.341 "claim_type": "exclusive_write", 00:19:12.341 "zoned": false, 00:19:12.341 "supported_io_types": { 00:19:12.341 "read": true, 00:19:12.341 "write": true, 00:19:12.341 "unmap": true, 00:19:12.341 "flush": true, 00:19:12.341 "reset": true, 00:19:12.341 "nvme_admin": false, 00:19:12.341 "nvme_io": false, 00:19:12.341 "nvme_io_md": false, 00:19:12.341 "write_zeroes": true, 00:19:12.341 "zcopy": true, 00:19:12.341 "get_zone_info": false, 00:19:12.341 "zone_management": false, 00:19:12.341 "zone_append": false, 00:19:12.341 "compare": false, 00:19:12.341 "compare_and_write": false, 00:19:12.341 "abort": true, 00:19:12.341 "seek_hole": false, 00:19:12.341 "seek_data": false, 00:19:12.341 "copy": true, 00:19:12.341 "nvme_iov_md": false 00:19:12.341 }, 00:19:12.341 "memory_domains": [ 00:19:12.341 { 00:19:12.341 "dma_device_id": "system", 00:19:12.341 "dma_device_type": 1 00:19:12.341 }, 00:19:12.341 { 00:19:12.341 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:12.341 "dma_device_type": 2 00:19:12.341 } 00:19:12.341 ], 00:19:12.341 "driver_specific": {} 00:19:12.341 }' 00:19:12.341 08:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:12.341 08:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:12.341 08:32:24 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:12.341 08:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:12.600 08:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:12.600 08:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:12.600 08:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:12.600 08:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:12.600 08:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:12.600 08:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:12.600 08:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:12.600 08:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:12.600 08:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:12.600 08:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:12.600 08:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:12.859 08:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:12.859 "name": "BaseBdev2", 00:19:12.859 "aliases": [ 00:19:12.859 "028bb3da-5804-4da1-b46f-a76983714635" 00:19:12.859 ], 00:19:12.859 "product_name": "Malloc disk", 00:19:12.859 "block_size": 512, 00:19:12.859 "num_blocks": 65536, 00:19:12.859 "uuid": "028bb3da-5804-4da1-b46f-a76983714635", 00:19:12.859 "assigned_rate_limits": { 00:19:12.859 "rw_ios_per_sec": 0, 00:19:12.859 "rw_mbytes_per_sec": 0, 00:19:12.859 "r_mbytes_per_sec": 0, 00:19:12.859 "w_mbytes_per_sec": 0 00:19:12.859 }, 
00:19:12.859 "claimed": true, 00:19:12.859 "claim_type": "exclusive_write", 00:19:12.859 "zoned": false, 00:19:12.859 "supported_io_types": { 00:19:12.859 "read": true, 00:19:12.859 "write": true, 00:19:12.859 "unmap": true, 00:19:12.859 "flush": true, 00:19:12.859 "reset": true, 00:19:12.859 "nvme_admin": false, 00:19:12.859 "nvme_io": false, 00:19:12.859 "nvme_io_md": false, 00:19:12.859 "write_zeroes": true, 00:19:12.859 "zcopy": true, 00:19:12.859 "get_zone_info": false, 00:19:12.859 "zone_management": false, 00:19:12.859 "zone_append": false, 00:19:12.859 "compare": false, 00:19:12.859 "compare_and_write": false, 00:19:12.859 "abort": true, 00:19:12.859 "seek_hole": false, 00:19:12.859 "seek_data": false, 00:19:12.859 "copy": true, 00:19:12.859 "nvme_iov_md": false 00:19:12.859 }, 00:19:12.859 "memory_domains": [ 00:19:12.859 { 00:19:12.859 "dma_device_id": "system", 00:19:12.859 "dma_device_type": 1 00:19:12.859 }, 00:19:12.859 { 00:19:12.859 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:12.859 "dma_device_type": 2 00:19:12.859 } 00:19:12.859 ], 00:19:12.859 "driver_specific": {} 00:19:12.859 }' 00:19:12.859 08:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:12.859 08:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:12.859 08:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:12.859 08:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:13.119 08:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:13.119 08:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:13.119 08:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:13.119 08:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:13.119 08:32:25 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:13.119 08:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:13.119 08:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:13.119 08:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:13.119 08:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:13.119 08:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:13.119 08:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:13.378 08:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:13.378 "name": "BaseBdev3", 00:19:13.378 "aliases": [ 00:19:13.378 "9acfc122-8129-48bf-943b-b948c854331d" 00:19:13.378 ], 00:19:13.378 "product_name": "Malloc disk", 00:19:13.378 "block_size": 512, 00:19:13.378 "num_blocks": 65536, 00:19:13.378 "uuid": "9acfc122-8129-48bf-943b-b948c854331d", 00:19:13.378 "assigned_rate_limits": { 00:19:13.378 "rw_ios_per_sec": 0, 00:19:13.378 "rw_mbytes_per_sec": 0, 00:19:13.378 "r_mbytes_per_sec": 0, 00:19:13.378 "w_mbytes_per_sec": 0 00:19:13.378 }, 00:19:13.378 "claimed": true, 00:19:13.378 "claim_type": "exclusive_write", 00:19:13.378 "zoned": false, 00:19:13.378 "supported_io_types": { 00:19:13.378 "read": true, 00:19:13.378 "write": true, 00:19:13.378 "unmap": true, 00:19:13.378 "flush": true, 00:19:13.378 "reset": true, 00:19:13.378 "nvme_admin": false, 00:19:13.378 "nvme_io": false, 00:19:13.378 "nvme_io_md": false, 00:19:13.378 "write_zeroes": true, 00:19:13.378 "zcopy": true, 00:19:13.378 "get_zone_info": false, 00:19:13.378 "zone_management": false, 00:19:13.378 "zone_append": false, 00:19:13.378 "compare": false, 00:19:13.378 "compare_and_write": false, 
00:19:13.378 "abort": true, 00:19:13.378 "seek_hole": false, 00:19:13.378 "seek_data": false, 00:19:13.378 "copy": true, 00:19:13.378 "nvme_iov_md": false 00:19:13.378 }, 00:19:13.378 "memory_domains": [ 00:19:13.378 { 00:19:13.378 "dma_device_id": "system", 00:19:13.378 "dma_device_type": 1 00:19:13.378 }, 00:19:13.378 { 00:19:13.379 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:13.379 "dma_device_type": 2 00:19:13.379 } 00:19:13.379 ], 00:19:13.379 "driver_specific": {} 00:19:13.379 }' 00:19:13.379 08:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:13.379 08:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:13.379 08:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:13.379 08:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:13.379 08:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:13.379 08:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:13.379 08:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:13.637 08:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:13.637 08:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:13.637 08:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:13.637 08:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:13.637 08:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:13.637 08:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:13.637 08:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:13.637 08:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:13.896 08:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:13.896 "name": "BaseBdev4", 00:19:13.896 "aliases": [ 00:19:13.896 "0b185a8d-5905-4a7b-acaa-5794c398c005" 00:19:13.896 ], 00:19:13.896 "product_name": "Malloc disk", 00:19:13.896 "block_size": 512, 00:19:13.896 "num_blocks": 65536, 00:19:13.896 "uuid": "0b185a8d-5905-4a7b-acaa-5794c398c005", 00:19:13.896 "assigned_rate_limits": { 00:19:13.896 "rw_ios_per_sec": 0, 00:19:13.896 "rw_mbytes_per_sec": 0, 00:19:13.896 "r_mbytes_per_sec": 0, 00:19:13.896 "w_mbytes_per_sec": 0 00:19:13.896 }, 00:19:13.896 "claimed": true, 00:19:13.896 "claim_type": "exclusive_write", 00:19:13.896 "zoned": false, 00:19:13.896 "supported_io_types": { 00:19:13.896 "read": true, 00:19:13.896 "write": true, 00:19:13.896 "unmap": true, 00:19:13.896 "flush": true, 00:19:13.896 "reset": true, 00:19:13.896 "nvme_admin": false, 00:19:13.896 "nvme_io": false, 00:19:13.896 "nvme_io_md": false, 00:19:13.896 "write_zeroes": true, 00:19:13.896 "zcopy": true, 00:19:13.896 "get_zone_info": false, 00:19:13.896 "zone_management": false, 00:19:13.896 "zone_append": false, 00:19:13.896 "compare": false, 00:19:13.896 "compare_and_write": false, 00:19:13.896 "abort": true, 00:19:13.896 "seek_hole": false, 00:19:13.896 "seek_data": false, 00:19:13.897 "copy": true, 00:19:13.897 "nvme_iov_md": false 00:19:13.897 }, 00:19:13.897 "memory_domains": [ 00:19:13.897 { 00:19:13.897 "dma_device_id": "system", 00:19:13.897 "dma_device_type": 1 00:19:13.897 }, 00:19:13.897 { 00:19:13.897 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:13.897 "dma_device_type": 2 00:19:13.897 } 00:19:13.897 ], 00:19:13.897 "driver_specific": {} 00:19:13.897 }' 00:19:13.897 08:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:13.897 08:32:26 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:13.897 08:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:13.897 08:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:13.897 08:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:13.897 08:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:13.897 08:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:13.897 08:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:14.156 08:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:14.156 08:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:14.156 08:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:14.156 08:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:14.156 08:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:14.415 [2024-07-23 08:32:26.681068] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:14.415 [2024-07-23 08:32:26.681097] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:14.415 [2024-07-23 08:32:26.681168] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:14.415 [2024-07-23 08:32:26.681234] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:14.415 [2024-07-23 08:32:26.681245] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000037280 name Existed_Raid, state offline 
00:19:14.415 08:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1487549 00:19:14.415 08:32:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1487549 ']' 00:19:14.415 08:32:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1487549 00:19:14.415 08:32:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:19:14.415 08:32:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:14.415 08:32:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1487549 00:19:14.415 08:32:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:14.415 08:32:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:14.415 08:32:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1487549' 00:19:14.415 killing process with pid 1487549 00:19:14.415 08:32:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1487549 00:19:14.415 [2024-07-23 08:32:26.730275] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:14.415 08:32:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1487549 00:19:14.673 [2024-07-23 08:32:27.051618] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:16.050 08:32:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:19:16.050 00:19:16.050 real 0m26.369s 00:19:16.050 user 0m47.164s 00:19:16.050 sys 0m3.894s 00:19:16.050 08:32:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:16.050 08:32:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:16.050 ************************************ 00:19:16.050 END TEST 
raid_state_function_test 00:19:16.050 ************************************ 00:19:16.050 08:32:28 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:16.050 08:32:28 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 4 true 00:19:16.050 08:32:28 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:16.050 08:32:28 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:16.050 08:32:28 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:16.050 ************************************ 00:19:16.050 START TEST raid_state_function_test_sb 00:19:16.050 ************************************ 00:19:16.050 08:32:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 true 00:19:16.050 08:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:19:16.050 08:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:19:16.050 08:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:19:16.050 08:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:19:16.050 08:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:19:16.050 08:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:16.050 08:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:19:16.050 08:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:16.050 08:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:16.050 08:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:19:16.050 08:32:28 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:16.050 08:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:16.050 08:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:19:16.050 08:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:16.050 08:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:16.050 08:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:19:16.050 08:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:16.051 08:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:16.051 08:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:16.051 08:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:19:16.051 08:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:19:16.051 08:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:19:16.051 08:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:19:16.051 08:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:19:16.051 08:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:19:16.051 08:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:19:16.051 08:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:19:16.051 08:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:19:16.051 08:32:28 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:19:16.051 08:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1493029 00:19:16.051 08:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1493029' 00:19:16.051 Process raid pid: 1493029 00:19:16.051 08:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:19:16.051 08:32:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1493029 /var/tmp/spdk-raid.sock 00:19:16.051 08:32:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1493029 ']' 00:19:16.051 08:32:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:16.051 08:32:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:16.051 08:32:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:16.051 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:16.051 08:32:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:16.051 08:32:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:16.051 [2024-07-23 08:32:28.479087] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:19:16.051 [2024-07-23 08:32:28.479171] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:19:16.310 [2024-07-23 08:32:28.602622] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:16.310 [2024-07-23 08:32:28.822531] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:16.877 [2024-07-23 08:32:29.092677] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:16.877 [2024-07-23 08:32:29.092705] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:16.877 08:32:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:16.877 08:32:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:19:16.877 08:32:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:17.136 [2024-07-23 08:32:29.405365] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:17.136 [2024-07-23 08:32:29.405411] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:17.136 [2024-07-23 08:32:29.405421] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:17.136 [2024-07-23 08:32:29.405448] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:17.136 [2024-07-23 08:32:29.405456] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:17.136 [2024-07-23 08:32:29.405465] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist 
now 00:19:17.136 [2024-07-23 08:32:29.405472] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:17.136 [2024-07-23 08:32:29.405480] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:17.136 08:32:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:17.136 08:32:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:17.136 08:32:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:17.136 08:32:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:17.136 08:32:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:17.136 08:32:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:17.136 08:32:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:17.136 08:32:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:17.136 08:32:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:17.136 08:32:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:17.136 08:32:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:17.136 08:32:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:17.136 08:32:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:17.136 "name": "Existed_Raid", 00:19:17.136 "uuid": 
"807d852a-b3ba-44a9-a550-2b6358f46835", 00:19:17.136 "strip_size_kb": 64, 00:19:17.136 "state": "configuring", 00:19:17.136 "raid_level": "concat", 00:19:17.136 "superblock": true, 00:19:17.136 "num_base_bdevs": 4, 00:19:17.136 "num_base_bdevs_discovered": 0, 00:19:17.136 "num_base_bdevs_operational": 4, 00:19:17.136 "base_bdevs_list": [ 00:19:17.136 { 00:19:17.136 "name": "BaseBdev1", 00:19:17.136 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:17.136 "is_configured": false, 00:19:17.136 "data_offset": 0, 00:19:17.136 "data_size": 0 00:19:17.136 }, 00:19:17.136 { 00:19:17.136 "name": "BaseBdev2", 00:19:17.136 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:17.136 "is_configured": false, 00:19:17.136 "data_offset": 0, 00:19:17.136 "data_size": 0 00:19:17.136 }, 00:19:17.136 { 00:19:17.136 "name": "BaseBdev3", 00:19:17.136 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:17.136 "is_configured": false, 00:19:17.136 "data_offset": 0, 00:19:17.136 "data_size": 0 00:19:17.136 }, 00:19:17.136 { 00:19:17.136 "name": "BaseBdev4", 00:19:17.136 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:17.136 "is_configured": false, 00:19:17.136 "data_offset": 0, 00:19:17.136 "data_size": 0 00:19:17.136 } 00:19:17.136 ] 00:19:17.136 }' 00:19:17.136 08:32:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:17.136 08:32:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:17.703 08:32:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:17.703 [2024-07-23 08:32:30.199331] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:17.703 [2024-07-23 08:32:30.199371] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034580 name Existed_Raid, state configuring 00:19:17.961 08:32:30 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:17.961 [2024-07-23 08:32:30.379827] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:17.961 [2024-07-23 08:32:30.379870] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:17.961 [2024-07-23 08:32:30.379879] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:17.961 [2024-07-23 08:32:30.379903] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:17.961 [2024-07-23 08:32:30.379911] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:17.961 [2024-07-23 08:32:30.379919] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:17.962 [2024-07-23 08:32:30.379926] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:17.962 [2024-07-23 08:32:30.379936] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:17.962 08:32:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:18.220 [2024-07-23 08:32:30.592312] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:18.220 BaseBdev1 00:19:18.220 08:32:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:19:18.220 08:32:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:18.220 08:32:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 
-- # local bdev_timeout= 00:19:18.220 08:32:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:18.220 08:32:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:18.220 08:32:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:18.220 08:32:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:18.479 08:32:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:18.479 [ 00:19:18.479 { 00:19:18.479 "name": "BaseBdev1", 00:19:18.479 "aliases": [ 00:19:18.479 "11c62467-8038-4aac-9113-15a6f7668142" 00:19:18.479 ], 00:19:18.479 "product_name": "Malloc disk", 00:19:18.479 "block_size": 512, 00:19:18.479 "num_blocks": 65536, 00:19:18.479 "uuid": "11c62467-8038-4aac-9113-15a6f7668142", 00:19:18.479 "assigned_rate_limits": { 00:19:18.479 "rw_ios_per_sec": 0, 00:19:18.479 "rw_mbytes_per_sec": 0, 00:19:18.479 "r_mbytes_per_sec": 0, 00:19:18.479 "w_mbytes_per_sec": 0 00:19:18.479 }, 00:19:18.479 "claimed": true, 00:19:18.479 "claim_type": "exclusive_write", 00:19:18.479 "zoned": false, 00:19:18.479 "supported_io_types": { 00:19:18.479 "read": true, 00:19:18.479 "write": true, 00:19:18.479 "unmap": true, 00:19:18.479 "flush": true, 00:19:18.479 "reset": true, 00:19:18.479 "nvme_admin": false, 00:19:18.479 "nvme_io": false, 00:19:18.479 "nvme_io_md": false, 00:19:18.479 "write_zeroes": true, 00:19:18.479 "zcopy": true, 00:19:18.479 "get_zone_info": false, 00:19:18.479 "zone_management": false, 00:19:18.479 "zone_append": false, 00:19:18.479 "compare": false, 00:19:18.479 "compare_and_write": false, 00:19:18.479 "abort": true, 00:19:18.479 "seek_hole": 
false, 00:19:18.479 "seek_data": false, 00:19:18.479 "copy": true, 00:19:18.479 "nvme_iov_md": false 00:19:18.479 }, 00:19:18.479 "memory_domains": [ 00:19:18.479 { 00:19:18.479 "dma_device_id": "system", 00:19:18.479 "dma_device_type": 1 00:19:18.479 }, 00:19:18.479 { 00:19:18.479 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:18.479 "dma_device_type": 2 00:19:18.479 } 00:19:18.479 ], 00:19:18.479 "driver_specific": {} 00:19:18.479 } 00:19:18.479 ] 00:19:18.479 08:32:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:18.479 08:32:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:18.479 08:32:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:18.479 08:32:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:18.479 08:32:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:18.479 08:32:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:18.479 08:32:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:18.479 08:32:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:18.479 08:32:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:18.479 08:32:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:18.479 08:32:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:18.479 08:32:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:18.479 08:32:30 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:18.738 08:32:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:18.738 "name": "Existed_Raid", 00:19:18.738 "uuid": "a765adf6-5c3a-4678-b3c0-71682284fc63", 00:19:18.738 "strip_size_kb": 64, 00:19:18.738 "state": "configuring", 00:19:18.738 "raid_level": "concat", 00:19:18.738 "superblock": true, 00:19:18.738 "num_base_bdevs": 4, 00:19:18.738 "num_base_bdevs_discovered": 1, 00:19:18.738 "num_base_bdevs_operational": 4, 00:19:18.738 "base_bdevs_list": [ 00:19:18.738 { 00:19:18.738 "name": "BaseBdev1", 00:19:18.738 "uuid": "11c62467-8038-4aac-9113-15a6f7668142", 00:19:18.738 "is_configured": true, 00:19:18.738 "data_offset": 2048, 00:19:18.738 "data_size": 63488 00:19:18.738 }, 00:19:18.738 { 00:19:18.738 "name": "BaseBdev2", 00:19:18.738 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:18.738 "is_configured": false, 00:19:18.738 "data_offset": 0, 00:19:18.738 "data_size": 0 00:19:18.738 }, 00:19:18.738 { 00:19:18.738 "name": "BaseBdev3", 00:19:18.738 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:18.738 "is_configured": false, 00:19:18.738 "data_offset": 0, 00:19:18.738 "data_size": 0 00:19:18.738 }, 00:19:18.738 { 00:19:18.738 "name": "BaseBdev4", 00:19:18.738 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:18.738 "is_configured": false, 00:19:18.738 "data_offset": 0, 00:19:18.738 "data_size": 0 00:19:18.738 } 00:19:18.738 ] 00:19:18.738 }' 00:19:18.738 08:32:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:18.738 08:32:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:19.306 08:32:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:19.306 [2024-07-23 
08:32:31.779502] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:19.306 [2024-07-23 08:32:31.779554] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034880 name Existed_Raid, state configuring 00:19:19.306 08:32:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:19.565 [2024-07-23 08:32:31.935946] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:19.565 [2024-07-23 08:32:31.937509] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:19.565 [2024-07-23 08:32:31.937542] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:19.565 [2024-07-23 08:32:31.937551] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:19.565 [2024-07-23 08:32:31.937559] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:19.565 [2024-07-23 08:32:31.937582] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:19.566 [2024-07-23 08:32:31.937593] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:19.566 08:32:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:19:19.566 08:32:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:19.566 08:32:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:19.566 08:32:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:19.566 08:32:31 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:19.566 08:32:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:19.566 08:32:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:19.566 08:32:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:19.566 08:32:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:19.566 08:32:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:19.566 08:32:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:19.566 08:32:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:19.566 08:32:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:19.566 08:32:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:19.824 08:32:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:19.824 "name": "Existed_Raid", 00:19:19.824 "uuid": "9648a1ca-c48d-4268-b6c7-dd5a8e8d7f10", 00:19:19.824 "strip_size_kb": 64, 00:19:19.824 "state": "configuring", 00:19:19.824 "raid_level": "concat", 00:19:19.824 "superblock": true, 00:19:19.824 "num_base_bdevs": 4, 00:19:19.824 "num_base_bdevs_discovered": 1, 00:19:19.824 "num_base_bdevs_operational": 4, 00:19:19.824 "base_bdevs_list": [ 00:19:19.824 { 00:19:19.824 "name": "BaseBdev1", 00:19:19.824 "uuid": "11c62467-8038-4aac-9113-15a6f7668142", 00:19:19.824 "is_configured": true, 00:19:19.824 "data_offset": 2048, 00:19:19.824 "data_size": 63488 00:19:19.824 }, 00:19:19.824 { 00:19:19.824 "name": "BaseBdev2", 00:19:19.824 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:19:19.824 "is_configured": false, 00:19:19.824 "data_offset": 0, 00:19:19.824 "data_size": 0 00:19:19.824 }, 00:19:19.824 { 00:19:19.824 "name": "BaseBdev3", 00:19:19.824 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:19.824 "is_configured": false, 00:19:19.824 "data_offset": 0, 00:19:19.824 "data_size": 0 00:19:19.824 }, 00:19:19.824 { 00:19:19.824 "name": "BaseBdev4", 00:19:19.825 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:19.825 "is_configured": false, 00:19:19.825 "data_offset": 0, 00:19:19.825 "data_size": 0 00:19:19.825 } 00:19:19.825 ] 00:19:19.825 }' 00:19:19.825 08:32:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:19.825 08:32:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:20.083 08:32:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:20.342 [2024-07-23 08:32:32.794795] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:20.342 BaseBdev2 00:19:20.342 08:32:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:19:20.342 08:32:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:20.342 08:32:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:20.342 08:32:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:20.342 08:32:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:20.342 08:32:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:20.342 08:32:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:20.599 08:32:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:20.858 [ 00:19:20.858 { 00:19:20.858 "name": "BaseBdev2", 00:19:20.858 "aliases": [ 00:19:20.858 "9fcdc948-4a3a-47df-8161-1988e338c376" 00:19:20.858 ], 00:19:20.858 "product_name": "Malloc disk", 00:19:20.858 "block_size": 512, 00:19:20.858 "num_blocks": 65536, 00:19:20.858 "uuid": "9fcdc948-4a3a-47df-8161-1988e338c376", 00:19:20.858 "assigned_rate_limits": { 00:19:20.858 "rw_ios_per_sec": 0, 00:19:20.858 "rw_mbytes_per_sec": 0, 00:19:20.858 "r_mbytes_per_sec": 0, 00:19:20.858 "w_mbytes_per_sec": 0 00:19:20.858 }, 00:19:20.858 "claimed": true, 00:19:20.858 "claim_type": "exclusive_write", 00:19:20.858 "zoned": false, 00:19:20.858 "supported_io_types": { 00:19:20.858 "read": true, 00:19:20.858 "write": true, 00:19:20.858 "unmap": true, 00:19:20.858 "flush": true, 00:19:20.858 "reset": true, 00:19:20.858 "nvme_admin": false, 00:19:20.858 "nvme_io": false, 00:19:20.858 "nvme_io_md": false, 00:19:20.858 "write_zeroes": true, 00:19:20.858 "zcopy": true, 00:19:20.858 "get_zone_info": false, 00:19:20.858 "zone_management": false, 00:19:20.858 "zone_append": false, 00:19:20.858 "compare": false, 00:19:20.858 "compare_and_write": false, 00:19:20.858 "abort": true, 00:19:20.858 "seek_hole": false, 00:19:20.858 "seek_data": false, 00:19:20.858 "copy": true, 00:19:20.858 "nvme_iov_md": false 00:19:20.858 }, 00:19:20.858 "memory_domains": [ 00:19:20.858 { 00:19:20.858 "dma_device_id": "system", 00:19:20.858 "dma_device_type": 1 00:19:20.858 }, 00:19:20.858 { 00:19:20.858 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:20.858 "dma_device_type": 2 00:19:20.858 } 00:19:20.858 ], 00:19:20.858 "driver_specific": {} 00:19:20.858 } 00:19:20.858 ] 
00:19:20.858 08:32:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:20.858 08:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:20.858 08:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:20.858 08:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:20.859 08:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:20.859 08:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:20.859 08:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:20.859 08:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:20.859 08:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:20.859 08:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:20.859 08:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:20.859 08:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:20.859 08:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:20.859 08:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:20.859 08:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:20.859 08:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:20.859 "name": "Existed_Raid", 
00:19:20.859 "uuid": "9648a1ca-c48d-4268-b6c7-dd5a8e8d7f10", 00:19:20.859 "strip_size_kb": 64, 00:19:20.859 "state": "configuring", 00:19:20.859 "raid_level": "concat", 00:19:20.859 "superblock": true, 00:19:20.859 "num_base_bdevs": 4, 00:19:20.859 "num_base_bdevs_discovered": 2, 00:19:20.859 "num_base_bdevs_operational": 4, 00:19:20.859 "base_bdevs_list": [ 00:19:20.859 { 00:19:20.859 "name": "BaseBdev1", 00:19:20.859 "uuid": "11c62467-8038-4aac-9113-15a6f7668142", 00:19:20.859 "is_configured": true, 00:19:20.859 "data_offset": 2048, 00:19:20.859 "data_size": 63488 00:19:20.859 }, 00:19:20.859 { 00:19:20.859 "name": "BaseBdev2", 00:19:20.859 "uuid": "9fcdc948-4a3a-47df-8161-1988e338c376", 00:19:20.859 "is_configured": true, 00:19:20.859 "data_offset": 2048, 00:19:20.859 "data_size": 63488 00:19:20.859 }, 00:19:20.859 { 00:19:20.859 "name": "BaseBdev3", 00:19:20.859 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:20.859 "is_configured": false, 00:19:20.859 "data_offset": 0, 00:19:20.859 "data_size": 0 00:19:20.859 }, 00:19:20.859 { 00:19:20.859 "name": "BaseBdev4", 00:19:20.859 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:20.859 "is_configured": false, 00:19:20.859 "data_offset": 0, 00:19:20.859 "data_size": 0 00:19:20.859 } 00:19:20.859 ] 00:19:20.859 }' 00:19:20.859 08:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:20.859 08:32:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:21.470 08:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:21.470 [2024-07-23 08:32:33.943621] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:21.470 BaseBdev3 00:19:21.733 08:32:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:19:21.733 
08:32:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:21.733 08:32:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:21.733 08:32:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:21.733 08:32:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:21.733 08:32:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:21.733 08:32:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:21.733 08:32:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:21.993 [ 00:19:21.993 { 00:19:21.993 "name": "BaseBdev3", 00:19:21.993 "aliases": [ 00:19:21.993 "93f6c7d9-698f-431c-bb77-ee035e245b9f" 00:19:21.993 ], 00:19:21.993 "product_name": "Malloc disk", 00:19:21.993 "block_size": 512, 00:19:21.993 "num_blocks": 65536, 00:19:21.993 "uuid": "93f6c7d9-698f-431c-bb77-ee035e245b9f", 00:19:21.993 "assigned_rate_limits": { 00:19:21.993 "rw_ios_per_sec": 0, 00:19:21.993 "rw_mbytes_per_sec": 0, 00:19:21.993 "r_mbytes_per_sec": 0, 00:19:21.993 "w_mbytes_per_sec": 0 00:19:21.993 }, 00:19:21.993 "claimed": true, 00:19:21.993 "claim_type": "exclusive_write", 00:19:21.993 "zoned": false, 00:19:21.993 "supported_io_types": { 00:19:21.993 "read": true, 00:19:21.993 "write": true, 00:19:21.993 "unmap": true, 00:19:21.993 "flush": true, 00:19:21.993 "reset": true, 00:19:21.993 "nvme_admin": false, 00:19:21.993 "nvme_io": false, 00:19:21.993 "nvme_io_md": false, 00:19:21.993 "write_zeroes": true, 00:19:21.993 "zcopy": true, 00:19:21.993 "get_zone_info": 
false, 00:19:21.993 "zone_management": false, 00:19:21.993 "zone_append": false, 00:19:21.993 "compare": false, 00:19:21.993 "compare_and_write": false, 00:19:21.993 "abort": true, 00:19:21.993 "seek_hole": false, 00:19:21.993 "seek_data": false, 00:19:21.993 "copy": true, 00:19:21.993 "nvme_iov_md": false 00:19:21.993 }, 00:19:21.993 "memory_domains": [ 00:19:21.993 { 00:19:21.993 "dma_device_id": "system", 00:19:21.993 "dma_device_type": 1 00:19:21.993 }, 00:19:21.993 { 00:19:21.993 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:21.993 "dma_device_type": 2 00:19:21.993 } 00:19:21.993 ], 00:19:21.993 "driver_specific": {} 00:19:21.993 } 00:19:21.993 ] 00:19:21.993 08:32:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:21.993 08:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:21.993 08:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:21.993 08:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:21.993 08:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:21.993 08:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:21.993 08:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:21.993 08:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:21.993 08:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:21.993 08:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:21.993 08:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:21.993 08:32:34 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:21.993 08:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:21.993 08:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:21.993 08:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:21.993 08:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:21.993 "name": "Existed_Raid", 00:19:21.993 "uuid": "9648a1ca-c48d-4268-b6c7-dd5a8e8d7f10", 00:19:21.993 "strip_size_kb": 64, 00:19:21.993 "state": "configuring", 00:19:21.993 "raid_level": "concat", 00:19:21.993 "superblock": true, 00:19:21.993 "num_base_bdevs": 4, 00:19:21.993 "num_base_bdevs_discovered": 3, 00:19:21.993 "num_base_bdevs_operational": 4, 00:19:21.993 "base_bdevs_list": [ 00:19:21.993 { 00:19:21.993 "name": "BaseBdev1", 00:19:21.993 "uuid": "11c62467-8038-4aac-9113-15a6f7668142", 00:19:21.993 "is_configured": true, 00:19:21.993 "data_offset": 2048, 00:19:21.993 "data_size": 63488 00:19:21.993 }, 00:19:21.993 { 00:19:21.993 "name": "BaseBdev2", 00:19:21.993 "uuid": "9fcdc948-4a3a-47df-8161-1988e338c376", 00:19:21.993 "is_configured": true, 00:19:21.993 "data_offset": 2048, 00:19:21.993 "data_size": 63488 00:19:21.993 }, 00:19:21.993 { 00:19:21.993 "name": "BaseBdev3", 00:19:21.993 "uuid": "93f6c7d9-698f-431c-bb77-ee035e245b9f", 00:19:21.993 "is_configured": true, 00:19:21.993 "data_offset": 2048, 00:19:21.993 "data_size": 63488 00:19:21.993 }, 00:19:21.993 { 00:19:21.993 "name": "BaseBdev4", 00:19:21.993 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:21.993 "is_configured": false, 00:19:21.993 "data_offset": 0, 00:19:21.994 "data_size": 0 00:19:21.994 } 00:19:21.994 ] 00:19:21.994 }' 00:19:21.994 
08:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:21.994 08:32:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:22.561 08:32:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:22.821 [2024-07-23 08:32:35.121085] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:22.821 [2024-07-23 08:32:35.121317] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000035180 00:19:22.821 [2024-07-23 08:32:35.121333] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:22.821 [2024-07-23 08:32:35.121562] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bf90 00:19:22.821 [2024-07-23 08:32:35.121740] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000035180 00:19:22.821 [2024-07-23 08:32:35.121754] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000035180 00:19:22.821 [2024-07-23 08:32:35.121914] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:22.821 BaseBdev4 00:19:22.821 08:32:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:19:22.821 08:32:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:19:22.821 08:32:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:22.821 08:32:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:22.821 08:32:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:22.821 08:32:35 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:22.821 08:32:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:22.821 08:32:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:23.080 [ 00:19:23.080 { 00:19:23.080 "name": "BaseBdev4", 00:19:23.080 "aliases": [ 00:19:23.080 "ee46f1d9-f8ca-47c6-825a-81cfea2ee072" 00:19:23.080 ], 00:19:23.080 "product_name": "Malloc disk", 00:19:23.080 "block_size": 512, 00:19:23.080 "num_blocks": 65536, 00:19:23.080 "uuid": "ee46f1d9-f8ca-47c6-825a-81cfea2ee072", 00:19:23.080 "assigned_rate_limits": { 00:19:23.080 "rw_ios_per_sec": 0, 00:19:23.080 "rw_mbytes_per_sec": 0, 00:19:23.080 "r_mbytes_per_sec": 0, 00:19:23.080 "w_mbytes_per_sec": 0 00:19:23.080 }, 00:19:23.080 "claimed": true, 00:19:23.080 "claim_type": "exclusive_write", 00:19:23.080 "zoned": false, 00:19:23.080 "supported_io_types": { 00:19:23.080 "read": true, 00:19:23.080 "write": true, 00:19:23.080 "unmap": true, 00:19:23.080 "flush": true, 00:19:23.080 "reset": true, 00:19:23.080 "nvme_admin": false, 00:19:23.080 "nvme_io": false, 00:19:23.080 "nvme_io_md": false, 00:19:23.080 "write_zeroes": true, 00:19:23.080 "zcopy": true, 00:19:23.080 "get_zone_info": false, 00:19:23.080 "zone_management": false, 00:19:23.080 "zone_append": false, 00:19:23.080 "compare": false, 00:19:23.080 "compare_and_write": false, 00:19:23.080 "abort": true, 00:19:23.080 "seek_hole": false, 00:19:23.080 "seek_data": false, 00:19:23.080 "copy": true, 00:19:23.080 "nvme_iov_md": false 00:19:23.080 }, 00:19:23.080 "memory_domains": [ 00:19:23.081 { 00:19:23.081 "dma_device_id": "system", 00:19:23.081 "dma_device_type": 1 00:19:23.081 }, 00:19:23.081 { 00:19:23.081 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:19:23.081 "dma_device_type": 2 00:19:23.081 } 00:19:23.081 ], 00:19:23.081 "driver_specific": {} 00:19:23.081 } 00:19:23.081 ] 00:19:23.081 08:32:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:23.081 08:32:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:23.081 08:32:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:23.081 08:32:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:19:23.081 08:32:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:23.081 08:32:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:23.081 08:32:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:23.081 08:32:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:23.081 08:32:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:23.081 08:32:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:23.081 08:32:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:23.081 08:32:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:23.081 08:32:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:23.081 08:32:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:23.081 08:32:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:19:23.340 08:32:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:23.340 "name": "Existed_Raid", 00:19:23.340 "uuid": "9648a1ca-c48d-4268-b6c7-dd5a8e8d7f10", 00:19:23.340 "strip_size_kb": 64, 00:19:23.340 "state": "online", 00:19:23.340 "raid_level": "concat", 00:19:23.340 "superblock": true, 00:19:23.340 "num_base_bdevs": 4, 00:19:23.340 "num_base_bdevs_discovered": 4, 00:19:23.340 "num_base_bdevs_operational": 4, 00:19:23.340 "base_bdevs_list": [ 00:19:23.340 { 00:19:23.340 "name": "BaseBdev1", 00:19:23.340 "uuid": "11c62467-8038-4aac-9113-15a6f7668142", 00:19:23.340 "is_configured": true, 00:19:23.340 "data_offset": 2048, 00:19:23.340 "data_size": 63488 00:19:23.340 }, 00:19:23.340 { 00:19:23.340 "name": "BaseBdev2", 00:19:23.340 "uuid": "9fcdc948-4a3a-47df-8161-1988e338c376", 00:19:23.340 "is_configured": true, 00:19:23.340 "data_offset": 2048, 00:19:23.340 "data_size": 63488 00:19:23.340 }, 00:19:23.340 { 00:19:23.340 "name": "BaseBdev3", 00:19:23.340 "uuid": "93f6c7d9-698f-431c-bb77-ee035e245b9f", 00:19:23.340 "is_configured": true, 00:19:23.340 "data_offset": 2048, 00:19:23.340 "data_size": 63488 00:19:23.340 }, 00:19:23.340 { 00:19:23.340 "name": "BaseBdev4", 00:19:23.340 "uuid": "ee46f1d9-f8ca-47c6-825a-81cfea2ee072", 00:19:23.340 "is_configured": true, 00:19:23.340 "data_offset": 2048, 00:19:23.340 "data_size": 63488 00:19:23.340 } 00:19:23.340 ] 00:19:23.340 }' 00:19:23.340 08:32:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:23.340 08:32:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:23.600 08:32:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:19:23.600 08:32:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:23.600 08:32:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # 
local raid_bdev_info 00:19:23.600 08:32:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:23.600 08:32:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:23.600 08:32:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:19:23.600 08:32:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:23.600 08:32:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:23.859 [2024-07-23 08:32:36.256443] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:23.859 08:32:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:23.859 "name": "Existed_Raid", 00:19:23.859 "aliases": [ 00:19:23.860 "9648a1ca-c48d-4268-b6c7-dd5a8e8d7f10" 00:19:23.860 ], 00:19:23.860 "product_name": "Raid Volume", 00:19:23.860 "block_size": 512, 00:19:23.860 "num_blocks": 253952, 00:19:23.860 "uuid": "9648a1ca-c48d-4268-b6c7-dd5a8e8d7f10", 00:19:23.860 "assigned_rate_limits": { 00:19:23.860 "rw_ios_per_sec": 0, 00:19:23.860 "rw_mbytes_per_sec": 0, 00:19:23.860 "r_mbytes_per_sec": 0, 00:19:23.860 "w_mbytes_per_sec": 0 00:19:23.860 }, 00:19:23.860 "claimed": false, 00:19:23.860 "zoned": false, 00:19:23.860 "supported_io_types": { 00:19:23.860 "read": true, 00:19:23.860 "write": true, 00:19:23.860 "unmap": true, 00:19:23.860 "flush": true, 00:19:23.860 "reset": true, 00:19:23.860 "nvme_admin": false, 00:19:23.860 "nvme_io": false, 00:19:23.860 "nvme_io_md": false, 00:19:23.860 "write_zeroes": true, 00:19:23.860 "zcopy": false, 00:19:23.860 "get_zone_info": false, 00:19:23.860 "zone_management": false, 00:19:23.860 "zone_append": false, 00:19:23.860 "compare": false, 00:19:23.860 "compare_and_write": false, 00:19:23.860 "abort": false, 
00:19:23.860 "seek_hole": false, 00:19:23.860 "seek_data": false, 00:19:23.860 "copy": false, 00:19:23.860 "nvme_iov_md": false 00:19:23.860 }, 00:19:23.860 "memory_domains": [ 00:19:23.860 { 00:19:23.860 "dma_device_id": "system", 00:19:23.860 "dma_device_type": 1 00:19:23.860 }, 00:19:23.860 { 00:19:23.860 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:23.860 "dma_device_type": 2 00:19:23.860 }, 00:19:23.860 { 00:19:23.860 "dma_device_id": "system", 00:19:23.860 "dma_device_type": 1 00:19:23.860 }, 00:19:23.860 { 00:19:23.860 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:23.860 "dma_device_type": 2 00:19:23.860 }, 00:19:23.860 { 00:19:23.860 "dma_device_id": "system", 00:19:23.860 "dma_device_type": 1 00:19:23.860 }, 00:19:23.860 { 00:19:23.860 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:23.860 "dma_device_type": 2 00:19:23.860 }, 00:19:23.860 { 00:19:23.860 "dma_device_id": "system", 00:19:23.860 "dma_device_type": 1 00:19:23.860 }, 00:19:23.860 { 00:19:23.860 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:23.860 "dma_device_type": 2 00:19:23.860 } 00:19:23.860 ], 00:19:23.860 "driver_specific": { 00:19:23.860 "raid": { 00:19:23.860 "uuid": "9648a1ca-c48d-4268-b6c7-dd5a8e8d7f10", 00:19:23.860 "strip_size_kb": 64, 00:19:23.860 "state": "online", 00:19:23.860 "raid_level": "concat", 00:19:23.860 "superblock": true, 00:19:23.860 "num_base_bdevs": 4, 00:19:23.860 "num_base_bdevs_discovered": 4, 00:19:23.860 "num_base_bdevs_operational": 4, 00:19:23.860 "base_bdevs_list": [ 00:19:23.860 { 00:19:23.860 "name": "BaseBdev1", 00:19:23.860 "uuid": "11c62467-8038-4aac-9113-15a6f7668142", 00:19:23.860 "is_configured": true, 00:19:23.860 "data_offset": 2048, 00:19:23.860 "data_size": 63488 00:19:23.860 }, 00:19:23.860 { 00:19:23.860 "name": "BaseBdev2", 00:19:23.860 "uuid": "9fcdc948-4a3a-47df-8161-1988e338c376", 00:19:23.860 "is_configured": true, 00:19:23.860 "data_offset": 2048, 00:19:23.860 "data_size": 63488 00:19:23.860 }, 00:19:23.860 { 00:19:23.860 "name": 
"BaseBdev3", 00:19:23.860 "uuid": "93f6c7d9-698f-431c-bb77-ee035e245b9f", 00:19:23.860 "is_configured": true, 00:19:23.860 "data_offset": 2048, 00:19:23.860 "data_size": 63488 00:19:23.860 }, 00:19:23.860 { 00:19:23.860 "name": "BaseBdev4", 00:19:23.860 "uuid": "ee46f1d9-f8ca-47c6-825a-81cfea2ee072", 00:19:23.860 "is_configured": true, 00:19:23.860 "data_offset": 2048, 00:19:23.860 "data_size": 63488 00:19:23.860 } 00:19:23.860 ] 00:19:23.860 } 00:19:23.860 } 00:19:23.860 }' 00:19:23.860 08:32:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:23.860 08:32:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:19:23.860 BaseBdev2 00:19:23.860 BaseBdev3 00:19:23.860 BaseBdev4' 00:19:23.860 08:32:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:23.860 08:32:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:19:23.860 08:32:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:24.120 08:32:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:24.120 "name": "BaseBdev1", 00:19:24.120 "aliases": [ 00:19:24.120 "11c62467-8038-4aac-9113-15a6f7668142" 00:19:24.120 ], 00:19:24.120 "product_name": "Malloc disk", 00:19:24.120 "block_size": 512, 00:19:24.120 "num_blocks": 65536, 00:19:24.120 "uuid": "11c62467-8038-4aac-9113-15a6f7668142", 00:19:24.120 "assigned_rate_limits": { 00:19:24.120 "rw_ios_per_sec": 0, 00:19:24.120 "rw_mbytes_per_sec": 0, 00:19:24.120 "r_mbytes_per_sec": 0, 00:19:24.120 "w_mbytes_per_sec": 0 00:19:24.120 }, 00:19:24.120 "claimed": true, 00:19:24.120 "claim_type": "exclusive_write", 00:19:24.120 "zoned": false, 00:19:24.120 
"supported_io_types": { 00:19:24.120 "read": true, 00:19:24.120 "write": true, 00:19:24.120 "unmap": true, 00:19:24.120 "flush": true, 00:19:24.120 "reset": true, 00:19:24.120 "nvme_admin": false, 00:19:24.120 "nvme_io": false, 00:19:24.120 "nvme_io_md": false, 00:19:24.120 "write_zeroes": true, 00:19:24.120 "zcopy": true, 00:19:24.120 "get_zone_info": false, 00:19:24.120 "zone_management": false, 00:19:24.120 "zone_append": false, 00:19:24.120 "compare": false, 00:19:24.120 "compare_and_write": false, 00:19:24.120 "abort": true, 00:19:24.120 "seek_hole": false, 00:19:24.120 "seek_data": false, 00:19:24.120 "copy": true, 00:19:24.120 "nvme_iov_md": false 00:19:24.120 }, 00:19:24.120 "memory_domains": [ 00:19:24.120 { 00:19:24.120 "dma_device_id": "system", 00:19:24.120 "dma_device_type": 1 00:19:24.120 }, 00:19:24.120 { 00:19:24.120 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:24.120 "dma_device_type": 2 00:19:24.120 } 00:19:24.120 ], 00:19:24.120 "driver_specific": {} 00:19:24.120 }' 00:19:24.120 08:32:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:24.120 08:32:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:24.120 08:32:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:24.120 08:32:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:24.120 08:32:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:24.380 08:32:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:24.380 08:32:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:24.380 08:32:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:24.380 08:32:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:24.380 08:32:36 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:24.380 08:32:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:24.380 08:32:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:24.380 08:32:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:24.380 08:32:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:24.380 08:32:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:24.639 08:32:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:24.639 "name": "BaseBdev2", 00:19:24.639 "aliases": [ 00:19:24.639 "9fcdc948-4a3a-47df-8161-1988e338c376" 00:19:24.639 ], 00:19:24.639 "product_name": "Malloc disk", 00:19:24.639 "block_size": 512, 00:19:24.639 "num_blocks": 65536, 00:19:24.639 "uuid": "9fcdc948-4a3a-47df-8161-1988e338c376", 00:19:24.639 "assigned_rate_limits": { 00:19:24.639 "rw_ios_per_sec": 0, 00:19:24.639 "rw_mbytes_per_sec": 0, 00:19:24.639 "r_mbytes_per_sec": 0, 00:19:24.639 "w_mbytes_per_sec": 0 00:19:24.639 }, 00:19:24.639 "claimed": true, 00:19:24.639 "claim_type": "exclusive_write", 00:19:24.639 "zoned": false, 00:19:24.639 "supported_io_types": { 00:19:24.639 "read": true, 00:19:24.639 "write": true, 00:19:24.639 "unmap": true, 00:19:24.639 "flush": true, 00:19:24.639 "reset": true, 00:19:24.639 "nvme_admin": false, 00:19:24.639 "nvme_io": false, 00:19:24.639 "nvme_io_md": false, 00:19:24.639 "write_zeroes": true, 00:19:24.639 "zcopy": true, 00:19:24.639 "get_zone_info": false, 00:19:24.639 "zone_management": false, 00:19:24.639 "zone_append": false, 00:19:24.639 "compare": false, 00:19:24.639 "compare_and_write": false, 00:19:24.639 "abort": true, 00:19:24.639 
"seek_hole": false, 00:19:24.639 "seek_data": false, 00:19:24.639 "copy": true, 00:19:24.639 "nvme_iov_md": false 00:19:24.639 }, 00:19:24.639 "memory_domains": [ 00:19:24.639 { 00:19:24.639 "dma_device_id": "system", 00:19:24.639 "dma_device_type": 1 00:19:24.639 }, 00:19:24.639 { 00:19:24.639 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:24.639 "dma_device_type": 2 00:19:24.639 } 00:19:24.639 ], 00:19:24.639 "driver_specific": {} 00:19:24.639 }' 00:19:24.639 08:32:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:24.639 08:32:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:24.639 08:32:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:24.639 08:32:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:24.639 08:32:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:24.639 08:32:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:24.639 08:32:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:24.898 08:32:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:24.898 08:32:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:24.898 08:32:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:24.898 08:32:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:24.898 08:32:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:24.898 08:32:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:24.898 08:32:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:24.898 08:32:37 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:25.157 08:32:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:25.157 "name": "BaseBdev3", 00:19:25.157 "aliases": [ 00:19:25.157 "93f6c7d9-698f-431c-bb77-ee035e245b9f" 00:19:25.157 ], 00:19:25.157 "product_name": "Malloc disk", 00:19:25.157 "block_size": 512, 00:19:25.157 "num_blocks": 65536, 00:19:25.157 "uuid": "93f6c7d9-698f-431c-bb77-ee035e245b9f", 00:19:25.157 "assigned_rate_limits": { 00:19:25.157 "rw_ios_per_sec": 0, 00:19:25.157 "rw_mbytes_per_sec": 0, 00:19:25.157 "r_mbytes_per_sec": 0, 00:19:25.157 "w_mbytes_per_sec": 0 00:19:25.157 }, 00:19:25.157 "claimed": true, 00:19:25.157 "claim_type": "exclusive_write", 00:19:25.157 "zoned": false, 00:19:25.157 "supported_io_types": { 00:19:25.157 "read": true, 00:19:25.157 "write": true, 00:19:25.157 "unmap": true, 00:19:25.157 "flush": true, 00:19:25.157 "reset": true, 00:19:25.157 "nvme_admin": false, 00:19:25.157 "nvme_io": false, 00:19:25.157 "nvme_io_md": false, 00:19:25.157 "write_zeroes": true, 00:19:25.157 "zcopy": true, 00:19:25.157 "get_zone_info": false, 00:19:25.157 "zone_management": false, 00:19:25.157 "zone_append": false, 00:19:25.157 "compare": false, 00:19:25.157 "compare_and_write": false, 00:19:25.157 "abort": true, 00:19:25.157 "seek_hole": false, 00:19:25.157 "seek_data": false, 00:19:25.157 "copy": true, 00:19:25.157 "nvme_iov_md": false 00:19:25.157 }, 00:19:25.157 "memory_domains": [ 00:19:25.157 { 00:19:25.157 "dma_device_id": "system", 00:19:25.157 "dma_device_type": 1 00:19:25.157 }, 00:19:25.157 { 00:19:25.157 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:25.157 "dma_device_type": 2 00:19:25.157 } 00:19:25.157 ], 00:19:25.157 "driver_specific": {} 00:19:25.157 }' 00:19:25.157 08:32:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- 
# jq .block_size 00:19:25.157 08:32:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:25.157 08:32:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:25.157 08:32:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:25.157 08:32:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:25.157 08:32:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:25.157 08:32:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:25.157 08:32:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:25.416 08:32:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:25.416 08:32:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:25.416 08:32:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:25.416 08:32:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:25.416 08:32:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:25.416 08:32:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:25.416 08:32:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:25.677 08:32:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:25.677 "name": "BaseBdev4", 00:19:25.677 "aliases": [ 00:19:25.677 "ee46f1d9-f8ca-47c6-825a-81cfea2ee072" 00:19:25.677 ], 00:19:25.677 "product_name": "Malloc disk", 00:19:25.677 "block_size": 512, 00:19:25.677 "num_blocks": 65536, 00:19:25.677 "uuid": 
"ee46f1d9-f8ca-47c6-825a-81cfea2ee072", 00:19:25.677 "assigned_rate_limits": { 00:19:25.677 "rw_ios_per_sec": 0, 00:19:25.677 "rw_mbytes_per_sec": 0, 00:19:25.677 "r_mbytes_per_sec": 0, 00:19:25.677 "w_mbytes_per_sec": 0 00:19:25.677 }, 00:19:25.677 "claimed": true, 00:19:25.677 "claim_type": "exclusive_write", 00:19:25.677 "zoned": false, 00:19:25.677 "supported_io_types": { 00:19:25.677 "read": true, 00:19:25.677 "write": true, 00:19:25.677 "unmap": true, 00:19:25.677 "flush": true, 00:19:25.677 "reset": true, 00:19:25.677 "nvme_admin": false, 00:19:25.677 "nvme_io": false, 00:19:25.677 "nvme_io_md": false, 00:19:25.677 "write_zeroes": true, 00:19:25.677 "zcopy": true, 00:19:25.677 "get_zone_info": false, 00:19:25.677 "zone_management": false, 00:19:25.677 "zone_append": false, 00:19:25.677 "compare": false, 00:19:25.677 "compare_and_write": false, 00:19:25.677 "abort": true, 00:19:25.677 "seek_hole": false, 00:19:25.677 "seek_data": false, 00:19:25.677 "copy": true, 00:19:25.677 "nvme_iov_md": false 00:19:25.677 }, 00:19:25.677 "memory_domains": [ 00:19:25.677 { 00:19:25.677 "dma_device_id": "system", 00:19:25.677 "dma_device_type": 1 00:19:25.677 }, 00:19:25.677 { 00:19:25.677 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:25.677 "dma_device_type": 2 00:19:25.677 } 00:19:25.677 ], 00:19:25.677 "driver_specific": {} 00:19:25.677 }' 00:19:25.677 08:32:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:25.677 08:32:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:25.677 08:32:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:25.677 08:32:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:25.677 08:32:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:25.677 08:32:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:19:25.677 08:32:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:25.677 08:32:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:25.677 08:32:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:25.677 08:32:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:25.936 08:32:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:25.936 08:32:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:25.936 08:32:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:25.936 [2024-07-23 08:32:38.410123] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:25.936 [2024-07-23 08:32:38.410154] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:25.936 [2024-07-23 08:32:38.410208] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:26.195 08:32:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:19:26.195 08:32:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:19:26.195 08:32:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:26.195 08:32:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:19:26.195 08:32:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:19:26.195 08:32:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:19:26.195 08:32:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:19:26.195 08:32:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:19:26.195 08:32:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:26.195 08:32:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:26.195 08:32:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:26.195 08:32:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:26.195 08:32:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:26.195 08:32:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:26.195 08:32:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:26.195 08:32:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:26.195 08:32:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:26.195 08:32:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:26.195 "name": "Existed_Raid", 00:19:26.195 "uuid": "9648a1ca-c48d-4268-b6c7-dd5a8e8d7f10", 00:19:26.195 "strip_size_kb": 64, 00:19:26.195 "state": "offline", 00:19:26.195 "raid_level": "concat", 00:19:26.195 "superblock": true, 00:19:26.195 "num_base_bdevs": 4, 00:19:26.195 "num_base_bdevs_discovered": 3, 00:19:26.195 "num_base_bdevs_operational": 3, 00:19:26.195 "base_bdevs_list": [ 00:19:26.195 { 00:19:26.195 "name": null, 00:19:26.195 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:26.195 "is_configured": false, 00:19:26.195 "data_offset": 2048, 00:19:26.195 "data_size": 63488 
00:19:26.195 }, 00:19:26.195 { 00:19:26.195 "name": "BaseBdev2", 00:19:26.195 "uuid": "9fcdc948-4a3a-47df-8161-1988e338c376", 00:19:26.195 "is_configured": true, 00:19:26.195 "data_offset": 2048, 00:19:26.195 "data_size": 63488 00:19:26.195 }, 00:19:26.195 { 00:19:26.195 "name": "BaseBdev3", 00:19:26.195 "uuid": "93f6c7d9-698f-431c-bb77-ee035e245b9f", 00:19:26.195 "is_configured": true, 00:19:26.195 "data_offset": 2048, 00:19:26.195 "data_size": 63488 00:19:26.195 }, 00:19:26.195 { 00:19:26.195 "name": "BaseBdev4", 00:19:26.195 "uuid": "ee46f1d9-f8ca-47c6-825a-81cfea2ee072", 00:19:26.195 "is_configured": true, 00:19:26.195 "data_offset": 2048, 00:19:26.195 "data_size": 63488 00:19:26.195 } 00:19:26.195 ] 00:19:26.195 }' 00:19:26.195 08:32:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:26.195 08:32:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:26.763 08:32:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:19:26.763 08:32:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:26.763 08:32:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:26.763 08:32:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:26.763 08:32:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:26.763 08:32:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:26.763 08:32:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:19:27.022 [2024-07-23 08:32:39.424074] 
bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:27.022 08:32:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:27.022 08:32:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:27.022 08:32:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:27.022 08:32:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:27.280 08:32:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:27.280 08:32:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:27.280 08:32:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:19:27.539 [2024-07-23 08:32:39.861195] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:27.539 08:32:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:27.539 08:32:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:27.539 08:32:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:27.539 08:32:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:27.797 08:32:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:27.797 08:32:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:27.797 08:32:40 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:19:27.797 [2024-07-23 08:32:40.312060] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:19:27.797 [2024-07-23 08:32:40.312110] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000035180 name Existed_Raid, state offline 00:19:28.056 08:32:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:28.056 08:32:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:28.056 08:32:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:28.056 08:32:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:19:28.315 08:32:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:19:28.315 08:32:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:19:28.315 08:32:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:19:28.315 08:32:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:19:28.315 08:32:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:28.315 08:32:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:28.315 BaseBdev2 00:19:28.315 08:32:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:19:28.315 08:32:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:28.315 08:32:40 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:28.315 08:32:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:28.315 08:32:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:28.315 08:32:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:28.315 08:32:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:28.573 08:32:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:28.832 [ 00:19:28.832 { 00:19:28.832 "name": "BaseBdev2", 00:19:28.832 "aliases": [ 00:19:28.832 "c0aa2a5f-2b74-47ef-a756-0e68470c2973" 00:19:28.832 ], 00:19:28.832 "product_name": "Malloc disk", 00:19:28.832 "block_size": 512, 00:19:28.832 "num_blocks": 65536, 00:19:28.832 "uuid": "c0aa2a5f-2b74-47ef-a756-0e68470c2973", 00:19:28.832 "assigned_rate_limits": { 00:19:28.832 "rw_ios_per_sec": 0, 00:19:28.832 "rw_mbytes_per_sec": 0, 00:19:28.832 "r_mbytes_per_sec": 0, 00:19:28.832 "w_mbytes_per_sec": 0 00:19:28.832 }, 00:19:28.832 "claimed": false, 00:19:28.832 "zoned": false, 00:19:28.832 "supported_io_types": { 00:19:28.832 "read": true, 00:19:28.832 "write": true, 00:19:28.832 "unmap": true, 00:19:28.832 "flush": true, 00:19:28.832 "reset": true, 00:19:28.832 "nvme_admin": false, 00:19:28.832 "nvme_io": false, 00:19:28.832 "nvme_io_md": false, 00:19:28.832 "write_zeroes": true, 00:19:28.832 "zcopy": true, 00:19:28.832 "get_zone_info": false, 00:19:28.832 "zone_management": false, 00:19:28.832 "zone_append": false, 00:19:28.832 "compare": false, 00:19:28.832 "compare_and_write": false, 00:19:28.832 "abort": true, 
00:19:28.832 "seek_hole": false, 00:19:28.832 "seek_data": false, 00:19:28.832 "copy": true, 00:19:28.832 "nvme_iov_md": false 00:19:28.832 }, 00:19:28.832 "memory_domains": [ 00:19:28.832 { 00:19:28.832 "dma_device_id": "system", 00:19:28.832 "dma_device_type": 1 00:19:28.832 }, 00:19:28.832 { 00:19:28.832 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:28.832 "dma_device_type": 2 00:19:28.832 } 00:19:28.832 ], 00:19:28.832 "driver_specific": {} 00:19:28.832 } 00:19:28.832 ] 00:19:28.832 08:32:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:28.832 08:32:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:28.832 08:32:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:28.832 08:32:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:29.091 BaseBdev3 00:19:29.091 08:32:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:19:29.091 08:32:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:29.091 08:32:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:29.091 08:32:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:29.091 08:32:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:29.091 08:32:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:29.091 08:32:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:29.091 08:32:41 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:29.351 [ 00:19:29.351 { 00:19:29.351 "name": "BaseBdev3", 00:19:29.351 "aliases": [ 00:19:29.351 "6bb7b649-2f0e-4513-80db-033209a60916" 00:19:29.351 ], 00:19:29.351 "product_name": "Malloc disk", 00:19:29.351 "block_size": 512, 00:19:29.351 "num_blocks": 65536, 00:19:29.351 "uuid": "6bb7b649-2f0e-4513-80db-033209a60916", 00:19:29.351 "assigned_rate_limits": { 00:19:29.351 "rw_ios_per_sec": 0, 00:19:29.351 "rw_mbytes_per_sec": 0, 00:19:29.351 "r_mbytes_per_sec": 0, 00:19:29.351 "w_mbytes_per_sec": 0 00:19:29.351 }, 00:19:29.351 "claimed": false, 00:19:29.351 "zoned": false, 00:19:29.351 "supported_io_types": { 00:19:29.351 "read": true, 00:19:29.351 "write": true, 00:19:29.351 "unmap": true, 00:19:29.351 "flush": true, 00:19:29.351 "reset": true, 00:19:29.351 "nvme_admin": false, 00:19:29.351 "nvme_io": false, 00:19:29.351 "nvme_io_md": false, 00:19:29.351 "write_zeroes": true, 00:19:29.351 "zcopy": true, 00:19:29.351 "get_zone_info": false, 00:19:29.351 "zone_management": false, 00:19:29.351 "zone_append": false, 00:19:29.351 "compare": false, 00:19:29.351 "compare_and_write": false, 00:19:29.351 "abort": true, 00:19:29.351 "seek_hole": false, 00:19:29.351 "seek_data": false, 00:19:29.351 "copy": true, 00:19:29.351 "nvme_iov_md": false 00:19:29.351 }, 00:19:29.351 "memory_domains": [ 00:19:29.351 { 00:19:29.351 "dma_device_id": "system", 00:19:29.351 "dma_device_type": 1 00:19:29.351 }, 00:19:29.351 { 00:19:29.351 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.351 "dma_device_type": 2 00:19:29.351 } 00:19:29.351 ], 00:19:29.351 "driver_specific": {} 00:19:29.351 } 00:19:29.351 ] 00:19:29.351 08:32:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:29.351 08:32:41 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:29.351 08:32:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:29.351 08:32:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:29.611 BaseBdev4 00:19:29.611 08:32:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:19:29.611 08:32:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:19:29.611 08:32:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:29.611 08:32:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:29.611 08:32:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:29.611 08:32:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:29.611 08:32:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:29.611 08:32:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:29.870 [ 00:19:29.870 { 00:19:29.870 "name": "BaseBdev4", 00:19:29.870 "aliases": [ 00:19:29.870 "e89b3ff4-a41e-44a8-acff-0fe348eb1e3d" 00:19:29.870 ], 00:19:29.870 "product_name": "Malloc disk", 00:19:29.870 "block_size": 512, 00:19:29.870 "num_blocks": 65536, 00:19:29.870 "uuid": "e89b3ff4-a41e-44a8-acff-0fe348eb1e3d", 00:19:29.870 "assigned_rate_limits": { 00:19:29.870 "rw_ios_per_sec": 0, 00:19:29.870 "rw_mbytes_per_sec": 0, 00:19:29.870 "r_mbytes_per_sec": 0, 
00:19:29.870 "w_mbytes_per_sec": 0 00:19:29.870 }, 00:19:29.870 "claimed": false, 00:19:29.870 "zoned": false, 00:19:29.870 "supported_io_types": { 00:19:29.870 "read": true, 00:19:29.870 "write": true, 00:19:29.870 "unmap": true, 00:19:29.871 "flush": true, 00:19:29.871 "reset": true, 00:19:29.871 "nvme_admin": false, 00:19:29.871 "nvme_io": false, 00:19:29.871 "nvme_io_md": false, 00:19:29.871 "write_zeroes": true, 00:19:29.871 "zcopy": true, 00:19:29.871 "get_zone_info": false, 00:19:29.871 "zone_management": false, 00:19:29.871 "zone_append": false, 00:19:29.871 "compare": false, 00:19:29.871 "compare_and_write": false, 00:19:29.871 "abort": true, 00:19:29.871 "seek_hole": false, 00:19:29.871 "seek_data": false, 00:19:29.871 "copy": true, 00:19:29.871 "nvme_iov_md": false 00:19:29.871 }, 00:19:29.871 "memory_domains": [ 00:19:29.871 { 00:19:29.871 "dma_device_id": "system", 00:19:29.871 "dma_device_type": 1 00:19:29.871 }, 00:19:29.871 { 00:19:29.871 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.871 "dma_device_type": 2 00:19:29.871 } 00:19:29.871 ], 00:19:29.871 "driver_specific": {} 00:19:29.871 } 00:19:29.871 ] 00:19:29.871 08:32:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:29.871 08:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:29.871 08:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:29.871 08:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:29.871 [2024-07-23 08:32:42.388318] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:29.871 [2024-07-23 08:32:42.388358] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist 
now 00:19:29.871 [2024-07-23 08:32:42.388384] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:30.130 [2024-07-23 08:32:42.390194] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:30.130 [2024-07-23 08:32:42.390247] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:30.130 08:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:30.130 08:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:30.130 08:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:30.130 08:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:30.130 08:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:30.130 08:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:30.130 08:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:30.130 08:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:30.130 08:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:30.130 08:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:30.130 08:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:30.130 08:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:30.130 08:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:19:30.130 "name": "Existed_Raid", 00:19:30.130 "uuid": "598acd93-dfe8-4e0a-a237-d03b3c0ce1a7", 00:19:30.130 "strip_size_kb": 64, 00:19:30.130 "state": "configuring", 00:19:30.130 "raid_level": "concat", 00:19:30.130 "superblock": true, 00:19:30.130 "num_base_bdevs": 4, 00:19:30.130 "num_base_bdevs_discovered": 3, 00:19:30.130 "num_base_bdevs_operational": 4, 00:19:30.130 "base_bdevs_list": [ 00:19:30.130 { 00:19:30.130 "name": "BaseBdev1", 00:19:30.130 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:30.130 "is_configured": false, 00:19:30.130 "data_offset": 0, 00:19:30.130 "data_size": 0 00:19:30.130 }, 00:19:30.130 { 00:19:30.130 "name": "BaseBdev2", 00:19:30.130 "uuid": "c0aa2a5f-2b74-47ef-a756-0e68470c2973", 00:19:30.130 "is_configured": true, 00:19:30.130 "data_offset": 2048, 00:19:30.130 "data_size": 63488 00:19:30.130 }, 00:19:30.130 { 00:19:30.130 "name": "BaseBdev3", 00:19:30.130 "uuid": "6bb7b649-2f0e-4513-80db-033209a60916", 00:19:30.130 "is_configured": true, 00:19:30.130 "data_offset": 2048, 00:19:30.130 "data_size": 63488 00:19:30.130 }, 00:19:30.130 { 00:19:30.130 "name": "BaseBdev4", 00:19:30.130 "uuid": "e89b3ff4-a41e-44a8-acff-0fe348eb1e3d", 00:19:30.130 "is_configured": true, 00:19:30.130 "data_offset": 2048, 00:19:30.130 "data_size": 63488 00:19:30.130 } 00:19:30.130 ] 00:19:30.130 }' 00:19:30.130 08:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:30.130 08:32:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:30.698 08:32:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:19:30.698 [2024-07-23 08:32:43.178339] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:30.698 08:32:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state 
Existed_Raid configuring concat 64 4 00:19:30.698 08:32:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:30.698 08:32:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:30.698 08:32:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:30.698 08:32:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:30.698 08:32:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:30.698 08:32:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:30.698 08:32:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:30.698 08:32:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:30.698 08:32:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:30.698 08:32:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:30.698 08:32:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:30.960 08:32:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:30.960 "name": "Existed_Raid", 00:19:30.960 "uuid": "598acd93-dfe8-4e0a-a237-d03b3c0ce1a7", 00:19:30.960 "strip_size_kb": 64, 00:19:30.960 "state": "configuring", 00:19:30.960 "raid_level": "concat", 00:19:30.960 "superblock": true, 00:19:30.960 "num_base_bdevs": 4, 00:19:30.960 "num_base_bdevs_discovered": 2, 00:19:30.960 "num_base_bdevs_operational": 4, 00:19:30.960 "base_bdevs_list": [ 00:19:30.960 { 00:19:30.960 "name": "BaseBdev1", 00:19:30.960 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:19:30.960 "is_configured": false, 00:19:30.960 "data_offset": 0, 00:19:30.960 "data_size": 0 00:19:30.960 }, 00:19:30.960 { 00:19:30.960 "name": null, 00:19:30.960 "uuid": "c0aa2a5f-2b74-47ef-a756-0e68470c2973", 00:19:30.960 "is_configured": false, 00:19:30.960 "data_offset": 2048, 00:19:30.960 "data_size": 63488 00:19:30.960 }, 00:19:30.960 { 00:19:30.960 "name": "BaseBdev3", 00:19:30.960 "uuid": "6bb7b649-2f0e-4513-80db-033209a60916", 00:19:30.960 "is_configured": true, 00:19:30.960 "data_offset": 2048, 00:19:30.960 "data_size": 63488 00:19:30.960 }, 00:19:30.960 { 00:19:30.960 "name": "BaseBdev4", 00:19:30.960 "uuid": "e89b3ff4-a41e-44a8-acff-0fe348eb1e3d", 00:19:30.960 "is_configured": true, 00:19:30.960 "data_offset": 2048, 00:19:30.960 "data_size": 63488 00:19:30.960 } 00:19:30.960 ] 00:19:30.960 }' 00:19:30.960 08:32:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:30.960 08:32:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:31.528 08:32:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:31.528 08:32:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:31.528 08:32:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:19:31.528 08:32:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:31.787 [2024-07-23 08:32:44.209628] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:31.787 BaseBdev1 00:19:31.787 08:32:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev 
BaseBdev1 00:19:31.787 08:32:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:31.787 08:32:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:31.787 08:32:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:31.787 08:32:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:31.787 08:32:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:31.787 08:32:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:32.046 08:32:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:32.046 [ 00:19:32.046 { 00:19:32.046 "name": "BaseBdev1", 00:19:32.046 "aliases": [ 00:19:32.046 "d51f8796-d524-4b51-9d80-2691642ee314" 00:19:32.046 ], 00:19:32.046 "product_name": "Malloc disk", 00:19:32.046 "block_size": 512, 00:19:32.046 "num_blocks": 65536, 00:19:32.046 "uuid": "d51f8796-d524-4b51-9d80-2691642ee314", 00:19:32.046 "assigned_rate_limits": { 00:19:32.046 "rw_ios_per_sec": 0, 00:19:32.046 "rw_mbytes_per_sec": 0, 00:19:32.046 "r_mbytes_per_sec": 0, 00:19:32.046 "w_mbytes_per_sec": 0 00:19:32.046 }, 00:19:32.046 "claimed": true, 00:19:32.046 "claim_type": "exclusive_write", 00:19:32.047 "zoned": false, 00:19:32.047 "supported_io_types": { 00:19:32.047 "read": true, 00:19:32.047 "write": true, 00:19:32.047 "unmap": true, 00:19:32.047 "flush": true, 00:19:32.047 "reset": true, 00:19:32.047 "nvme_admin": false, 00:19:32.047 "nvme_io": false, 00:19:32.047 "nvme_io_md": false, 00:19:32.047 "write_zeroes": true, 00:19:32.047 "zcopy": true, 00:19:32.047 
"get_zone_info": false, 00:19:32.047 "zone_management": false, 00:19:32.047 "zone_append": false, 00:19:32.047 "compare": false, 00:19:32.047 "compare_and_write": false, 00:19:32.047 "abort": true, 00:19:32.047 "seek_hole": false, 00:19:32.047 "seek_data": false, 00:19:32.047 "copy": true, 00:19:32.047 "nvme_iov_md": false 00:19:32.047 }, 00:19:32.047 "memory_domains": [ 00:19:32.047 { 00:19:32.047 "dma_device_id": "system", 00:19:32.047 "dma_device_type": 1 00:19:32.047 }, 00:19:32.047 { 00:19:32.047 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:32.047 "dma_device_type": 2 00:19:32.047 } 00:19:32.047 ], 00:19:32.047 "driver_specific": {} 00:19:32.047 } 00:19:32.047 ] 00:19:32.047 08:32:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:32.047 08:32:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:32.047 08:32:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:32.047 08:32:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:32.047 08:32:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:32.047 08:32:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:32.047 08:32:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:32.047 08:32:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:32.047 08:32:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:32.047 08:32:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:32.047 08:32:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:32.047 08:32:44 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:32.047 08:32:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:32.306 08:32:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:32.306 "name": "Existed_Raid", 00:19:32.306 "uuid": "598acd93-dfe8-4e0a-a237-d03b3c0ce1a7", 00:19:32.306 "strip_size_kb": 64, 00:19:32.306 "state": "configuring", 00:19:32.306 "raid_level": "concat", 00:19:32.306 "superblock": true, 00:19:32.306 "num_base_bdevs": 4, 00:19:32.306 "num_base_bdevs_discovered": 3, 00:19:32.306 "num_base_bdevs_operational": 4, 00:19:32.306 "base_bdevs_list": [ 00:19:32.306 { 00:19:32.306 "name": "BaseBdev1", 00:19:32.306 "uuid": "d51f8796-d524-4b51-9d80-2691642ee314", 00:19:32.306 "is_configured": true, 00:19:32.306 "data_offset": 2048, 00:19:32.306 "data_size": 63488 00:19:32.306 }, 00:19:32.306 { 00:19:32.306 "name": null, 00:19:32.306 "uuid": "c0aa2a5f-2b74-47ef-a756-0e68470c2973", 00:19:32.306 "is_configured": false, 00:19:32.306 "data_offset": 2048, 00:19:32.306 "data_size": 63488 00:19:32.306 }, 00:19:32.306 { 00:19:32.306 "name": "BaseBdev3", 00:19:32.306 "uuid": "6bb7b649-2f0e-4513-80db-033209a60916", 00:19:32.306 "is_configured": true, 00:19:32.306 "data_offset": 2048, 00:19:32.306 "data_size": 63488 00:19:32.306 }, 00:19:32.306 { 00:19:32.306 "name": "BaseBdev4", 00:19:32.306 "uuid": "e89b3ff4-a41e-44a8-acff-0fe348eb1e3d", 00:19:32.306 "is_configured": true, 00:19:32.306 "data_offset": 2048, 00:19:32.306 "data_size": 63488 00:19:32.306 } 00:19:32.306 ] 00:19:32.306 }' 00:19:32.306 08:32:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:32.306 08:32:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:32.872 08:32:45 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:32.872 08:32:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:32.872 08:32:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:19:32.872 08:32:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:19:33.130 [2024-07-23 08:32:45.505085] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:33.130 08:32:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:33.130 08:32:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:33.130 08:32:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:33.130 08:32:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:33.130 08:32:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:33.130 08:32:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:33.130 08:32:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:33.130 08:32:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:33.130 08:32:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:33.130 08:32:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:33.130 08:32:45 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:33.130 08:32:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:33.388 08:32:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:33.388 "name": "Existed_Raid", 00:19:33.389 "uuid": "598acd93-dfe8-4e0a-a237-d03b3c0ce1a7", 00:19:33.389 "strip_size_kb": 64, 00:19:33.389 "state": "configuring", 00:19:33.389 "raid_level": "concat", 00:19:33.389 "superblock": true, 00:19:33.389 "num_base_bdevs": 4, 00:19:33.389 "num_base_bdevs_discovered": 2, 00:19:33.389 "num_base_bdevs_operational": 4, 00:19:33.389 "base_bdevs_list": [ 00:19:33.389 { 00:19:33.389 "name": "BaseBdev1", 00:19:33.389 "uuid": "d51f8796-d524-4b51-9d80-2691642ee314", 00:19:33.389 "is_configured": true, 00:19:33.389 "data_offset": 2048, 00:19:33.389 "data_size": 63488 00:19:33.389 }, 00:19:33.389 { 00:19:33.389 "name": null, 00:19:33.389 "uuid": "c0aa2a5f-2b74-47ef-a756-0e68470c2973", 00:19:33.389 "is_configured": false, 00:19:33.389 "data_offset": 2048, 00:19:33.389 "data_size": 63488 00:19:33.389 }, 00:19:33.389 { 00:19:33.389 "name": null, 00:19:33.389 "uuid": "6bb7b649-2f0e-4513-80db-033209a60916", 00:19:33.389 "is_configured": false, 00:19:33.389 "data_offset": 2048, 00:19:33.389 "data_size": 63488 00:19:33.389 }, 00:19:33.389 { 00:19:33.389 "name": "BaseBdev4", 00:19:33.389 "uuid": "e89b3ff4-a41e-44a8-acff-0fe348eb1e3d", 00:19:33.389 "is_configured": true, 00:19:33.389 "data_offset": 2048, 00:19:33.389 "data_size": 63488 00:19:33.389 } 00:19:33.389 ] 00:19:33.389 }' 00:19:33.389 08:32:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:33.389 08:32:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:33.647 08:32:46 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:33.647 08:32:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:33.906 08:32:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:19:33.906 08:32:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:19:34.171 [2024-07-23 08:32:46.487717] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:34.171 08:32:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:34.171 08:32:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:34.171 08:32:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:34.171 08:32:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:34.171 08:32:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:34.171 08:32:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:34.171 08:32:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:34.171 08:32:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:34.171 08:32:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:34.171 08:32:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:34.171 08:32:46 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:34.171 08:32:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:34.171 08:32:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:34.171 "name": "Existed_Raid", 00:19:34.171 "uuid": "598acd93-dfe8-4e0a-a237-d03b3c0ce1a7", 00:19:34.171 "strip_size_kb": 64, 00:19:34.171 "state": "configuring", 00:19:34.171 "raid_level": "concat", 00:19:34.171 "superblock": true, 00:19:34.171 "num_base_bdevs": 4, 00:19:34.171 "num_base_bdevs_discovered": 3, 00:19:34.171 "num_base_bdevs_operational": 4, 00:19:34.171 "base_bdevs_list": [ 00:19:34.171 { 00:19:34.171 "name": "BaseBdev1", 00:19:34.171 "uuid": "d51f8796-d524-4b51-9d80-2691642ee314", 00:19:34.171 "is_configured": true, 00:19:34.171 "data_offset": 2048, 00:19:34.171 "data_size": 63488 00:19:34.171 }, 00:19:34.171 { 00:19:34.171 "name": null, 00:19:34.171 "uuid": "c0aa2a5f-2b74-47ef-a756-0e68470c2973", 00:19:34.171 "is_configured": false, 00:19:34.171 "data_offset": 2048, 00:19:34.171 "data_size": 63488 00:19:34.171 }, 00:19:34.171 { 00:19:34.171 "name": "BaseBdev3", 00:19:34.171 "uuid": "6bb7b649-2f0e-4513-80db-033209a60916", 00:19:34.171 "is_configured": true, 00:19:34.171 "data_offset": 2048, 00:19:34.171 "data_size": 63488 00:19:34.171 }, 00:19:34.171 { 00:19:34.171 "name": "BaseBdev4", 00:19:34.171 "uuid": "e89b3ff4-a41e-44a8-acff-0fe348eb1e3d", 00:19:34.171 "is_configured": true, 00:19:34.171 "data_offset": 2048, 00:19:34.171 "data_size": 63488 00:19:34.171 } 00:19:34.171 ] 00:19:34.171 }' 00:19:34.171 08:32:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:34.171 08:32:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:34.782 08:32:47 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:34.782 08:32:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:35.040 08:32:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:19:35.040 08:32:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:35.040 [2024-07-23 08:32:47.486366] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:35.299 08:32:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:35.299 08:32:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:35.299 08:32:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:35.299 08:32:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:35.299 08:32:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:35.299 08:32:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:35.299 08:32:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:35.299 08:32:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:35.299 08:32:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:35.299 08:32:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:35.299 08:32:47 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:35.299 08:32:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:35.299 08:32:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:35.299 "name": "Existed_Raid", 00:19:35.299 "uuid": "598acd93-dfe8-4e0a-a237-d03b3c0ce1a7", 00:19:35.299 "strip_size_kb": 64, 00:19:35.299 "state": "configuring", 00:19:35.299 "raid_level": "concat", 00:19:35.299 "superblock": true, 00:19:35.299 "num_base_bdevs": 4, 00:19:35.299 "num_base_bdevs_discovered": 2, 00:19:35.299 "num_base_bdevs_operational": 4, 00:19:35.299 "base_bdevs_list": [ 00:19:35.299 { 00:19:35.299 "name": null, 00:19:35.299 "uuid": "d51f8796-d524-4b51-9d80-2691642ee314", 00:19:35.299 "is_configured": false, 00:19:35.299 "data_offset": 2048, 00:19:35.299 "data_size": 63488 00:19:35.299 }, 00:19:35.299 { 00:19:35.299 "name": null, 00:19:35.299 "uuid": "c0aa2a5f-2b74-47ef-a756-0e68470c2973", 00:19:35.299 "is_configured": false, 00:19:35.299 "data_offset": 2048, 00:19:35.299 "data_size": 63488 00:19:35.299 }, 00:19:35.299 { 00:19:35.299 "name": "BaseBdev3", 00:19:35.299 "uuid": "6bb7b649-2f0e-4513-80db-033209a60916", 00:19:35.299 "is_configured": true, 00:19:35.299 "data_offset": 2048, 00:19:35.299 "data_size": 63488 00:19:35.299 }, 00:19:35.299 { 00:19:35.299 "name": "BaseBdev4", 00:19:35.299 "uuid": "e89b3ff4-a41e-44a8-acff-0fe348eb1e3d", 00:19:35.299 "is_configured": true, 00:19:35.299 "data_offset": 2048, 00:19:35.299 "data_size": 63488 00:19:35.299 } 00:19:35.299 ] 00:19:35.299 }' 00:19:35.299 08:32:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:35.299 08:32:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:35.866 08:32:48 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:35.866 08:32:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:36.124 08:32:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:19:36.124 08:32:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:19:36.124 [2024-07-23 08:32:48.579021] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:36.124 08:32:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:36.124 08:32:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:36.124 08:32:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:36.124 08:32:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:36.124 08:32:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:36.124 08:32:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:36.124 08:32:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:36.124 08:32:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:36.124 08:32:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:36.124 08:32:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:36.124 08:32:48 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:36.124 08:32:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:36.381 08:32:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:36.381 "name": "Existed_Raid", 00:19:36.381 "uuid": "598acd93-dfe8-4e0a-a237-d03b3c0ce1a7", 00:19:36.381 "strip_size_kb": 64, 00:19:36.381 "state": "configuring", 00:19:36.381 "raid_level": "concat", 00:19:36.381 "superblock": true, 00:19:36.381 "num_base_bdevs": 4, 00:19:36.381 "num_base_bdevs_discovered": 3, 00:19:36.381 "num_base_bdevs_operational": 4, 00:19:36.381 "base_bdevs_list": [ 00:19:36.381 { 00:19:36.381 "name": null, 00:19:36.381 "uuid": "d51f8796-d524-4b51-9d80-2691642ee314", 00:19:36.381 "is_configured": false, 00:19:36.381 "data_offset": 2048, 00:19:36.381 "data_size": 63488 00:19:36.381 }, 00:19:36.381 { 00:19:36.381 "name": "BaseBdev2", 00:19:36.381 "uuid": "c0aa2a5f-2b74-47ef-a756-0e68470c2973", 00:19:36.381 "is_configured": true, 00:19:36.381 "data_offset": 2048, 00:19:36.381 "data_size": 63488 00:19:36.381 }, 00:19:36.381 { 00:19:36.381 "name": "BaseBdev3", 00:19:36.381 "uuid": "6bb7b649-2f0e-4513-80db-033209a60916", 00:19:36.381 "is_configured": true, 00:19:36.381 "data_offset": 2048, 00:19:36.381 "data_size": 63488 00:19:36.381 }, 00:19:36.381 { 00:19:36.381 "name": "BaseBdev4", 00:19:36.381 "uuid": "e89b3ff4-a41e-44a8-acff-0fe348eb1e3d", 00:19:36.381 "is_configured": true, 00:19:36.381 "data_offset": 2048, 00:19:36.381 "data_size": 63488 00:19:36.382 } 00:19:36.382 ] 00:19:36.382 }' 00:19:36.382 08:32:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:36.382 08:32:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:36.948 08:32:49 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:36.948 08:32:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:36.948 08:32:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:19:36.948 08:32:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:36.948 08:32:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:19:37.206 08:32:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u d51f8796-d524-4b51-9d80-2691642ee314 00:19:37.464 [2024-07-23 08:32:49.738314] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:19:37.464 [2024-07-23 08:32:49.738532] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000037280 00:19:37.464 [2024-07-23 08:32:49.738548] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:37.464 [2024-07-23 08:32:49.738785] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c3a0 00:19:37.464 [2024-07-23 08:32:49.738955] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000037280 00:19:37.464 [2024-07-23 08:32:49.738967] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000037280 00:19:37.464 [2024-07-23 08:32:49.739095] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:37.464 NewBaseBdev 00:19:37.464 08:32:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # 
waitforbdev NewBaseBdev 00:19:37.464 08:32:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:19:37.464 08:32:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:37.464 08:32:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:37.464 08:32:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:37.464 08:32:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:37.464 08:32:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:37.464 08:32:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:19:37.722 [ 00:19:37.722 { 00:19:37.722 "name": "NewBaseBdev", 00:19:37.722 "aliases": [ 00:19:37.722 "d51f8796-d524-4b51-9d80-2691642ee314" 00:19:37.722 ], 00:19:37.722 "product_name": "Malloc disk", 00:19:37.722 "block_size": 512, 00:19:37.722 "num_blocks": 65536, 00:19:37.722 "uuid": "d51f8796-d524-4b51-9d80-2691642ee314", 00:19:37.722 "assigned_rate_limits": { 00:19:37.722 "rw_ios_per_sec": 0, 00:19:37.722 "rw_mbytes_per_sec": 0, 00:19:37.722 "r_mbytes_per_sec": 0, 00:19:37.722 "w_mbytes_per_sec": 0 00:19:37.722 }, 00:19:37.722 "claimed": true, 00:19:37.722 "claim_type": "exclusive_write", 00:19:37.722 "zoned": false, 00:19:37.722 "supported_io_types": { 00:19:37.722 "read": true, 00:19:37.722 "write": true, 00:19:37.722 "unmap": true, 00:19:37.722 "flush": true, 00:19:37.722 "reset": true, 00:19:37.722 "nvme_admin": false, 00:19:37.722 "nvme_io": false, 00:19:37.722 "nvme_io_md": false, 00:19:37.722 "write_zeroes": true, 00:19:37.722 
"zcopy": true, 00:19:37.722 "get_zone_info": false, 00:19:37.722 "zone_management": false, 00:19:37.722 "zone_append": false, 00:19:37.722 "compare": false, 00:19:37.722 "compare_and_write": false, 00:19:37.722 "abort": true, 00:19:37.722 "seek_hole": false, 00:19:37.722 "seek_data": false, 00:19:37.722 "copy": true, 00:19:37.722 "nvme_iov_md": false 00:19:37.722 }, 00:19:37.722 "memory_domains": [ 00:19:37.722 { 00:19:37.722 "dma_device_id": "system", 00:19:37.722 "dma_device_type": 1 00:19:37.722 }, 00:19:37.722 { 00:19:37.722 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:37.722 "dma_device_type": 2 00:19:37.722 } 00:19:37.722 ], 00:19:37.722 "driver_specific": {} 00:19:37.722 } 00:19:37.722 ] 00:19:37.722 08:32:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:37.722 08:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:19:37.722 08:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:37.722 08:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:37.722 08:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:37.722 08:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:37.722 08:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:37.722 08:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:37.722 08:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:37.722 08:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:37.722 08:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 
00:19:37.722 08:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:37.722 08:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:37.980 08:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:37.980 "name": "Existed_Raid", 00:19:37.980 "uuid": "598acd93-dfe8-4e0a-a237-d03b3c0ce1a7", 00:19:37.980 "strip_size_kb": 64, 00:19:37.980 "state": "online", 00:19:37.980 "raid_level": "concat", 00:19:37.980 "superblock": true, 00:19:37.980 "num_base_bdevs": 4, 00:19:37.980 "num_base_bdevs_discovered": 4, 00:19:37.980 "num_base_bdevs_operational": 4, 00:19:37.980 "base_bdevs_list": [ 00:19:37.980 { 00:19:37.980 "name": "NewBaseBdev", 00:19:37.980 "uuid": "d51f8796-d524-4b51-9d80-2691642ee314", 00:19:37.980 "is_configured": true, 00:19:37.980 "data_offset": 2048, 00:19:37.980 "data_size": 63488 00:19:37.980 }, 00:19:37.980 { 00:19:37.980 "name": "BaseBdev2", 00:19:37.980 "uuid": "c0aa2a5f-2b74-47ef-a756-0e68470c2973", 00:19:37.980 "is_configured": true, 00:19:37.980 "data_offset": 2048, 00:19:37.980 "data_size": 63488 00:19:37.980 }, 00:19:37.980 { 00:19:37.980 "name": "BaseBdev3", 00:19:37.980 "uuid": "6bb7b649-2f0e-4513-80db-033209a60916", 00:19:37.980 "is_configured": true, 00:19:37.980 "data_offset": 2048, 00:19:37.980 "data_size": 63488 00:19:37.980 }, 00:19:37.980 { 00:19:37.980 "name": "BaseBdev4", 00:19:37.980 "uuid": "e89b3ff4-a41e-44a8-acff-0fe348eb1e3d", 00:19:37.981 "is_configured": true, 00:19:37.981 "data_offset": 2048, 00:19:37.981 "data_size": 63488 00:19:37.981 } 00:19:37.981 ] 00:19:37.981 }' 00:19:37.981 08:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:37.981 08:32:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 
00:19:38.239 08:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:19:38.239 08:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:38.239 08:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:38.239 08:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:38.239 08:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:38.239 08:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:19:38.239 08:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:38.239 08:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:38.497 [2024-07-23 08:32:50.869654] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:38.497 08:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:38.497 "name": "Existed_Raid", 00:19:38.497 "aliases": [ 00:19:38.497 "598acd93-dfe8-4e0a-a237-d03b3c0ce1a7" 00:19:38.497 ], 00:19:38.497 "product_name": "Raid Volume", 00:19:38.497 "block_size": 512, 00:19:38.497 "num_blocks": 253952, 00:19:38.497 "uuid": "598acd93-dfe8-4e0a-a237-d03b3c0ce1a7", 00:19:38.497 "assigned_rate_limits": { 00:19:38.497 "rw_ios_per_sec": 0, 00:19:38.497 "rw_mbytes_per_sec": 0, 00:19:38.497 "r_mbytes_per_sec": 0, 00:19:38.497 "w_mbytes_per_sec": 0 00:19:38.497 }, 00:19:38.497 "claimed": false, 00:19:38.497 "zoned": false, 00:19:38.497 "supported_io_types": { 00:19:38.497 "read": true, 00:19:38.497 "write": true, 00:19:38.497 "unmap": true, 00:19:38.497 "flush": true, 00:19:38.497 "reset": true, 00:19:38.497 "nvme_admin": false, 
00:19:38.497 "nvme_io": false, 00:19:38.497 "nvme_io_md": false, 00:19:38.497 "write_zeroes": true, 00:19:38.497 "zcopy": false, 00:19:38.497 "get_zone_info": false, 00:19:38.497 "zone_management": false, 00:19:38.497 "zone_append": false, 00:19:38.497 "compare": false, 00:19:38.497 "compare_and_write": false, 00:19:38.497 "abort": false, 00:19:38.497 "seek_hole": false, 00:19:38.497 "seek_data": false, 00:19:38.497 "copy": false, 00:19:38.497 "nvme_iov_md": false 00:19:38.497 }, 00:19:38.497 "memory_domains": [ 00:19:38.497 { 00:19:38.497 "dma_device_id": "system", 00:19:38.497 "dma_device_type": 1 00:19:38.497 }, 00:19:38.497 { 00:19:38.497 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:38.497 "dma_device_type": 2 00:19:38.497 }, 00:19:38.497 { 00:19:38.497 "dma_device_id": "system", 00:19:38.497 "dma_device_type": 1 00:19:38.497 }, 00:19:38.497 { 00:19:38.497 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:38.497 "dma_device_type": 2 00:19:38.497 }, 00:19:38.497 { 00:19:38.497 "dma_device_id": "system", 00:19:38.497 "dma_device_type": 1 00:19:38.497 }, 00:19:38.497 { 00:19:38.497 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:38.497 "dma_device_type": 2 00:19:38.497 }, 00:19:38.497 { 00:19:38.497 "dma_device_id": "system", 00:19:38.497 "dma_device_type": 1 00:19:38.497 }, 00:19:38.497 { 00:19:38.497 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:38.497 "dma_device_type": 2 00:19:38.497 } 00:19:38.497 ], 00:19:38.497 "driver_specific": { 00:19:38.497 "raid": { 00:19:38.497 "uuid": "598acd93-dfe8-4e0a-a237-d03b3c0ce1a7", 00:19:38.497 "strip_size_kb": 64, 00:19:38.498 "state": "online", 00:19:38.498 "raid_level": "concat", 00:19:38.498 "superblock": true, 00:19:38.498 "num_base_bdevs": 4, 00:19:38.498 "num_base_bdevs_discovered": 4, 00:19:38.498 "num_base_bdevs_operational": 4, 00:19:38.498 "base_bdevs_list": [ 00:19:38.498 { 00:19:38.498 "name": "NewBaseBdev", 00:19:38.498 "uuid": "d51f8796-d524-4b51-9d80-2691642ee314", 00:19:38.498 "is_configured": true, 
00:19:38.498 "data_offset": 2048, 00:19:38.498 "data_size": 63488 00:19:38.498 }, 00:19:38.498 { 00:19:38.498 "name": "BaseBdev2", 00:19:38.498 "uuid": "c0aa2a5f-2b74-47ef-a756-0e68470c2973", 00:19:38.498 "is_configured": true, 00:19:38.498 "data_offset": 2048, 00:19:38.498 "data_size": 63488 00:19:38.498 }, 00:19:38.498 { 00:19:38.498 "name": "BaseBdev3", 00:19:38.498 "uuid": "6bb7b649-2f0e-4513-80db-033209a60916", 00:19:38.498 "is_configured": true, 00:19:38.498 "data_offset": 2048, 00:19:38.498 "data_size": 63488 00:19:38.498 }, 00:19:38.498 { 00:19:38.498 "name": "BaseBdev4", 00:19:38.498 "uuid": "e89b3ff4-a41e-44a8-acff-0fe348eb1e3d", 00:19:38.498 "is_configured": true, 00:19:38.498 "data_offset": 2048, 00:19:38.498 "data_size": 63488 00:19:38.498 } 00:19:38.498 ] 00:19:38.498 } 00:19:38.498 } 00:19:38.498 }' 00:19:38.498 08:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:38.498 08:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:19:38.498 BaseBdev2 00:19:38.498 BaseBdev3 00:19:38.498 BaseBdev4' 00:19:38.498 08:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:38.498 08:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:19:38.498 08:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:38.756 08:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:38.756 "name": "NewBaseBdev", 00:19:38.756 "aliases": [ 00:19:38.756 "d51f8796-d524-4b51-9d80-2691642ee314" 00:19:38.756 ], 00:19:38.756 "product_name": "Malloc disk", 00:19:38.756 "block_size": 512, 00:19:38.756 "num_blocks": 65536, 00:19:38.756 "uuid": 
"d51f8796-d524-4b51-9d80-2691642ee314", 00:19:38.756 "assigned_rate_limits": { 00:19:38.756 "rw_ios_per_sec": 0, 00:19:38.756 "rw_mbytes_per_sec": 0, 00:19:38.756 "r_mbytes_per_sec": 0, 00:19:38.756 "w_mbytes_per_sec": 0 00:19:38.756 }, 00:19:38.756 "claimed": true, 00:19:38.756 "claim_type": "exclusive_write", 00:19:38.756 "zoned": false, 00:19:38.756 "supported_io_types": { 00:19:38.756 "read": true, 00:19:38.756 "write": true, 00:19:38.756 "unmap": true, 00:19:38.756 "flush": true, 00:19:38.756 "reset": true, 00:19:38.756 "nvme_admin": false, 00:19:38.756 "nvme_io": false, 00:19:38.756 "nvme_io_md": false, 00:19:38.756 "write_zeroes": true, 00:19:38.756 "zcopy": true, 00:19:38.756 "get_zone_info": false, 00:19:38.756 "zone_management": false, 00:19:38.756 "zone_append": false, 00:19:38.756 "compare": false, 00:19:38.756 "compare_and_write": false, 00:19:38.756 "abort": true, 00:19:38.756 "seek_hole": false, 00:19:38.756 "seek_data": false, 00:19:38.756 "copy": true, 00:19:38.756 "nvme_iov_md": false 00:19:38.756 }, 00:19:38.756 "memory_domains": [ 00:19:38.756 { 00:19:38.756 "dma_device_id": "system", 00:19:38.756 "dma_device_type": 1 00:19:38.756 }, 00:19:38.756 { 00:19:38.756 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:38.756 "dma_device_type": 2 00:19:38.756 } 00:19:38.756 ], 00:19:38.756 "driver_specific": {} 00:19:38.756 }' 00:19:38.756 08:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:38.756 08:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:38.756 08:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:38.756 08:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:38.756 08:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:38.756 08:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:19:38.756 08:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:39.014 08:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:39.014 08:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:39.014 08:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:39.014 08:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:39.014 08:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:39.014 08:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:39.014 08:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:39.015 08:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:39.273 08:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:39.273 "name": "BaseBdev2", 00:19:39.273 "aliases": [ 00:19:39.273 "c0aa2a5f-2b74-47ef-a756-0e68470c2973" 00:19:39.273 ], 00:19:39.273 "product_name": "Malloc disk", 00:19:39.273 "block_size": 512, 00:19:39.273 "num_blocks": 65536, 00:19:39.273 "uuid": "c0aa2a5f-2b74-47ef-a756-0e68470c2973", 00:19:39.273 "assigned_rate_limits": { 00:19:39.273 "rw_ios_per_sec": 0, 00:19:39.273 "rw_mbytes_per_sec": 0, 00:19:39.273 "r_mbytes_per_sec": 0, 00:19:39.273 "w_mbytes_per_sec": 0 00:19:39.273 }, 00:19:39.273 "claimed": true, 00:19:39.273 "claim_type": "exclusive_write", 00:19:39.273 "zoned": false, 00:19:39.273 "supported_io_types": { 00:19:39.273 "read": true, 00:19:39.273 "write": true, 00:19:39.273 "unmap": true, 00:19:39.273 "flush": true, 00:19:39.273 "reset": true, 00:19:39.273 "nvme_admin": false, 00:19:39.273 
"nvme_io": false, 00:19:39.273 "nvme_io_md": false, 00:19:39.273 "write_zeroes": true, 00:19:39.273 "zcopy": true, 00:19:39.273 "get_zone_info": false, 00:19:39.273 "zone_management": false, 00:19:39.273 "zone_append": false, 00:19:39.273 "compare": false, 00:19:39.273 "compare_and_write": false, 00:19:39.273 "abort": true, 00:19:39.273 "seek_hole": false, 00:19:39.273 "seek_data": false, 00:19:39.273 "copy": true, 00:19:39.273 "nvme_iov_md": false 00:19:39.273 }, 00:19:39.273 "memory_domains": [ 00:19:39.273 { 00:19:39.273 "dma_device_id": "system", 00:19:39.273 "dma_device_type": 1 00:19:39.273 }, 00:19:39.273 { 00:19:39.273 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:39.273 "dma_device_type": 2 00:19:39.273 } 00:19:39.273 ], 00:19:39.273 "driver_specific": {} 00:19:39.273 }' 00:19:39.273 08:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:39.273 08:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:39.273 08:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:39.273 08:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:39.274 08:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:39.274 08:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:39.274 08:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:39.274 08:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:39.531 08:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:39.531 08:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:39.531 08:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:39.531 08:32:51 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:39.531 08:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:39.531 08:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:39.531 08:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:39.789 08:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:39.789 "name": "BaseBdev3", 00:19:39.789 "aliases": [ 00:19:39.789 "6bb7b649-2f0e-4513-80db-033209a60916" 00:19:39.789 ], 00:19:39.789 "product_name": "Malloc disk", 00:19:39.789 "block_size": 512, 00:19:39.789 "num_blocks": 65536, 00:19:39.789 "uuid": "6bb7b649-2f0e-4513-80db-033209a60916", 00:19:39.789 "assigned_rate_limits": { 00:19:39.789 "rw_ios_per_sec": 0, 00:19:39.789 "rw_mbytes_per_sec": 0, 00:19:39.789 "r_mbytes_per_sec": 0, 00:19:39.789 "w_mbytes_per_sec": 0 00:19:39.789 }, 00:19:39.789 "claimed": true, 00:19:39.789 "claim_type": "exclusive_write", 00:19:39.789 "zoned": false, 00:19:39.789 "supported_io_types": { 00:19:39.789 "read": true, 00:19:39.789 "write": true, 00:19:39.789 "unmap": true, 00:19:39.789 "flush": true, 00:19:39.789 "reset": true, 00:19:39.789 "nvme_admin": false, 00:19:39.789 "nvme_io": false, 00:19:39.789 "nvme_io_md": false, 00:19:39.789 "write_zeroes": true, 00:19:39.789 "zcopy": true, 00:19:39.789 "get_zone_info": false, 00:19:39.789 "zone_management": false, 00:19:39.789 "zone_append": false, 00:19:39.789 "compare": false, 00:19:39.790 "compare_and_write": false, 00:19:39.790 "abort": true, 00:19:39.790 "seek_hole": false, 00:19:39.790 "seek_data": false, 00:19:39.790 "copy": true, 00:19:39.790 "nvme_iov_md": false 00:19:39.790 }, 00:19:39.790 "memory_domains": [ 00:19:39.790 { 00:19:39.790 "dma_device_id": 
"system", 00:19:39.790 "dma_device_type": 1 00:19:39.790 }, 00:19:39.790 { 00:19:39.790 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:39.790 "dma_device_type": 2 00:19:39.790 } 00:19:39.790 ], 00:19:39.790 "driver_specific": {} 00:19:39.790 }' 00:19:39.790 08:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:39.790 08:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:39.790 08:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:39.790 08:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:39.790 08:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:39.790 08:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:39.790 08:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:39.790 08:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:39.790 08:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:39.790 08:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:40.048 08:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:40.048 08:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:40.048 08:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:40.048 08:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:40.048 08:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:40.048 08:32:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:40.048 "name": "BaseBdev4", 00:19:40.048 "aliases": [ 00:19:40.048 "e89b3ff4-a41e-44a8-acff-0fe348eb1e3d" 00:19:40.048 ], 00:19:40.048 "product_name": "Malloc disk", 00:19:40.048 "block_size": 512, 00:19:40.048 "num_blocks": 65536, 00:19:40.048 "uuid": "e89b3ff4-a41e-44a8-acff-0fe348eb1e3d", 00:19:40.048 "assigned_rate_limits": { 00:19:40.048 "rw_ios_per_sec": 0, 00:19:40.048 "rw_mbytes_per_sec": 0, 00:19:40.048 "r_mbytes_per_sec": 0, 00:19:40.048 "w_mbytes_per_sec": 0 00:19:40.048 }, 00:19:40.048 "claimed": true, 00:19:40.048 "claim_type": "exclusive_write", 00:19:40.048 "zoned": false, 00:19:40.048 "supported_io_types": { 00:19:40.048 "read": true, 00:19:40.048 "write": true, 00:19:40.048 "unmap": true, 00:19:40.048 "flush": true, 00:19:40.048 "reset": true, 00:19:40.048 "nvme_admin": false, 00:19:40.048 "nvme_io": false, 00:19:40.048 "nvme_io_md": false, 00:19:40.048 "write_zeroes": true, 00:19:40.048 "zcopy": true, 00:19:40.048 "get_zone_info": false, 00:19:40.048 "zone_management": false, 00:19:40.048 "zone_append": false, 00:19:40.048 "compare": false, 00:19:40.048 "compare_and_write": false, 00:19:40.048 "abort": true, 00:19:40.048 "seek_hole": false, 00:19:40.048 "seek_data": false, 00:19:40.048 "copy": true, 00:19:40.048 "nvme_iov_md": false 00:19:40.048 }, 00:19:40.048 "memory_domains": [ 00:19:40.048 { 00:19:40.048 "dma_device_id": "system", 00:19:40.048 "dma_device_type": 1 00:19:40.048 }, 00:19:40.048 { 00:19:40.048 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:40.048 "dma_device_type": 2 00:19:40.048 } 00:19:40.048 ], 00:19:40.048 "driver_specific": {} 00:19:40.048 }' 00:19:40.048 08:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:40.306 08:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:40.306 08:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 
-- # [[ 512 == 512 ]] 00:19:40.306 08:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:40.306 08:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:40.306 08:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:40.306 08:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:40.306 08:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:40.306 08:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:40.306 08:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:40.306 08:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:40.565 08:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:40.565 08:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:40.565 [2024-07-23 08:32:53.007015] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:40.565 [2024-07-23 08:32:53.007044] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:40.565 [2024-07-23 08:32:53.007116] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:40.565 [2024-07-23 08:32:53.007179] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:40.565 [2024-07-23 08:32:53.007189] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000037280 name Existed_Raid, state offline 00:19:40.565 08:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1493029 00:19:40.565 08:32:53 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1493029 ']' 00:19:40.565 08:32:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1493029 00:19:40.565 08:32:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:19:40.565 08:32:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:40.565 08:32:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1493029 00:19:40.565 08:32:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:40.565 08:32:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:40.565 08:32:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1493029' 00:19:40.565 killing process with pid 1493029 00:19:40.565 08:32:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1493029 00:19:40.565 [2024-07-23 08:32:53.065187] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:40.565 08:32:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1493029 00:19:41.131 [2024-07-23 08:32:53.394232] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:42.506 08:32:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:19:42.506 00:19:42.506 real 0m26.252s 00:19:42.506 user 0m46.876s 00:19:42.506 sys 0m3.988s 00:19:42.506 08:32:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:42.506 08:32:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:42.506 ************************************ 00:19:42.506 END TEST raid_state_function_test_sb 00:19:42.506 ************************************ 00:19:42.507 
08:32:54 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:42.507 08:32:54 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 4 00:19:42.507 08:32:54 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:19:42.507 08:32:54 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:42.507 08:32:54 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:42.507 ************************************ 00:19:42.507 START TEST raid_superblock_test 00:19:42.507 ************************************ 00:19:42.507 08:32:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 4 00:19:42.507 08:32:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:19:42.507 08:32:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:19:42.507 08:32:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:19:42.507 08:32:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:19:42.507 08:32:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:19:42.507 08:32:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:19:42.507 08:32:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:19:42.507 08:32:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:19:42.507 08:32:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:19:42.507 08:32:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:19:42.507 08:32:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:19:42.507 08:32:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:19:42.507 08:32:54 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:19:42.507 08:32:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:19:42.507 08:32:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:19:42.507 08:32:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:19:42.507 08:32:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1498500 00:19:42.507 08:32:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1498500 /var/tmp/spdk-raid.sock 00:19:42.507 08:32:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:19:42.507 08:32:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1498500 ']' 00:19:42.507 08:32:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:42.507 08:32:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:42.507 08:32:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:42.507 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:42.507 08:32:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:42.507 08:32:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:42.507 [2024-07-23 08:32:54.796142] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:19:42.507 [2024-07-23 08:32:54.796235] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1498500 ] 00:19:42.507 [2024-07-23 08:32:54.926316] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:42.765 [2024-07-23 08:32:55.131344] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:43.024 [2024-07-23 08:32:55.393976] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:43.024 [2024-07-23 08:32:55.394007] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:43.281 08:32:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:43.281 08:32:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:19:43.281 08:32:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:19:43.281 08:32:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:43.281 08:32:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:19:43.281 08:32:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:19:43.281 08:32:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:19:43.282 08:32:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:43.282 08:32:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:43.282 08:32:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:43.282 08:32:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:19:43.282 malloc1 00:19:43.282 08:32:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:19:43.540 [2024-07-23 08:32:55.954322] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:43.540 [2024-07-23 08:32:55.954377] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:43.540 [2024-07-23 08:32:55.954400] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034880 00:19:43.540 [2024-07-23 08:32:55.954413] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:43.540 [2024-07-23 08:32:55.956346] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:43.540 [2024-07-23 08:32:55.956374] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:43.540 pt1 00:19:43.540 08:32:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:43.540 08:32:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:43.540 08:32:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:19:43.540 08:32:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:19:43.540 08:32:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:19:43.540 08:32:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:43.540 08:32:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:43.540 08:32:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:43.540 08:32:55 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:19:43.798 malloc2 00:19:43.798 08:32:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:44.058 [2024-07-23 08:32:56.350169] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:44.058 [2024-07-23 08:32:56.350229] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:44.058 [2024-07-23 08:32:56.350253] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035480 00:19:44.058 [2024-07-23 08:32:56.350264] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:44.058 [2024-07-23 08:32:56.352541] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:44.058 [2024-07-23 08:32:56.352571] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:44.058 pt2 00:19:44.058 08:32:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:44.058 08:32:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:44.058 08:32:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:19:44.058 08:32:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:19:44.058 08:32:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:19:44.058 08:32:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:44.058 08:32:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:44.058 
08:32:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:44.058 08:32:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:19:44.058 malloc3 00:19:44.316 08:32:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:44.316 [2024-07-23 08:32:56.733884] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:44.316 [2024-07-23 08:32:56.733935] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:44.317 [2024-07-23 08:32:56.733977] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036080 00:19:44.317 [2024-07-23 08:32:56.733986] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:44.317 [2024-07-23 08:32:56.735912] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:44.317 [2024-07-23 08:32:56.735938] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:44.317 pt3 00:19:44.317 08:32:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:44.317 08:32:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:44.317 08:32:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:19:44.317 08:32:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:19:44.317 08:32:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:19:44.317 08:32:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 
00:19:44.317 08:32:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:44.317 08:32:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:44.317 08:32:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:19:44.575 malloc4 00:19:44.575 08:32:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:19:44.833 [2024-07-23 08:32:57.108851] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:19:44.833 [2024-07-23 08:32:57.108900] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:44.833 [2024-07-23 08:32:57.108935] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036c80 00:19:44.833 [2024-07-23 08:32:57.108945] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:44.833 [2024-07-23 08:32:57.110919] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:44.833 [2024-07-23 08:32:57.110946] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:19:44.833 pt4 00:19:44.833 08:32:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:44.833 08:32:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:44.833 08:32:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:19:44.833 [2024-07-23 08:32:57.281380] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev pt1 is claimed 00:19:44.833 [2024-07-23 08:32:57.283009] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:44.833 [2024-07-23 08:32:57.283073] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:44.833 [2024-07-23 08:32:57.283114] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:19:44.833 [2024-07-23 08:32:57.283310] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000037280 00:19:44.833 [2024-07-23 08:32:57.283321] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:44.833 [2024-07-23 08:32:57.283586] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bf90 00:19:44.833 [2024-07-23 08:32:57.283796] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000037280 00:19:44.833 [2024-07-23 08:32:57.283808] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000037280 00:19:44.833 [2024-07-23 08:32:57.283963] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:44.833 08:32:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:19:44.833 08:32:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:44.833 08:32:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:44.833 08:32:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:44.833 08:32:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:44.833 08:32:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:44.833 08:32:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:44.833 08:32:57 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:44.833 08:32:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:44.833 08:32:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:44.833 08:32:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:44.833 08:32:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:45.092 08:32:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:45.092 "name": "raid_bdev1", 00:19:45.092 "uuid": "5fd80812-b897-4172-b61a-ce98fa206986", 00:19:45.092 "strip_size_kb": 64, 00:19:45.092 "state": "online", 00:19:45.092 "raid_level": "concat", 00:19:45.092 "superblock": true, 00:19:45.092 "num_base_bdevs": 4, 00:19:45.092 "num_base_bdevs_discovered": 4, 00:19:45.092 "num_base_bdevs_operational": 4, 00:19:45.092 "base_bdevs_list": [ 00:19:45.092 { 00:19:45.092 "name": "pt1", 00:19:45.092 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:45.092 "is_configured": true, 00:19:45.092 "data_offset": 2048, 00:19:45.092 "data_size": 63488 00:19:45.092 }, 00:19:45.092 { 00:19:45.092 "name": "pt2", 00:19:45.092 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:45.092 "is_configured": true, 00:19:45.092 "data_offset": 2048, 00:19:45.092 "data_size": 63488 00:19:45.092 }, 00:19:45.092 { 00:19:45.092 "name": "pt3", 00:19:45.092 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:45.092 "is_configured": true, 00:19:45.092 "data_offset": 2048, 00:19:45.092 "data_size": 63488 00:19:45.092 }, 00:19:45.092 { 00:19:45.092 "name": "pt4", 00:19:45.092 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:45.092 "is_configured": true, 00:19:45.092 "data_offset": 2048, 00:19:45.092 "data_size": 63488 00:19:45.092 } 
00:19:45.092 ] 00:19:45.092 }' 00:19:45.092 08:32:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:45.092 08:32:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:45.659 08:32:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:19:45.659 08:32:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:19:45.659 08:32:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:45.659 08:32:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:45.659 08:32:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:45.659 08:32:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:45.659 08:32:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:45.659 08:32:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:45.659 [2024-07-23 08:32:58.075730] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:45.659 08:32:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:45.659 "name": "raid_bdev1", 00:19:45.659 "aliases": [ 00:19:45.659 "5fd80812-b897-4172-b61a-ce98fa206986" 00:19:45.659 ], 00:19:45.659 "product_name": "Raid Volume", 00:19:45.659 "block_size": 512, 00:19:45.659 "num_blocks": 253952, 00:19:45.659 "uuid": "5fd80812-b897-4172-b61a-ce98fa206986", 00:19:45.659 "assigned_rate_limits": { 00:19:45.659 "rw_ios_per_sec": 0, 00:19:45.659 "rw_mbytes_per_sec": 0, 00:19:45.659 "r_mbytes_per_sec": 0, 00:19:45.659 "w_mbytes_per_sec": 0 00:19:45.659 }, 00:19:45.659 "claimed": false, 00:19:45.659 "zoned": false, 00:19:45.659 "supported_io_types": { 00:19:45.659 
"read": true, 00:19:45.659 "write": true, 00:19:45.659 "unmap": true, 00:19:45.659 "flush": true, 00:19:45.659 "reset": true, 00:19:45.659 "nvme_admin": false, 00:19:45.659 "nvme_io": false, 00:19:45.659 "nvme_io_md": false, 00:19:45.659 "write_zeroes": true, 00:19:45.659 "zcopy": false, 00:19:45.659 "get_zone_info": false, 00:19:45.659 "zone_management": false, 00:19:45.659 "zone_append": false, 00:19:45.659 "compare": false, 00:19:45.659 "compare_and_write": false, 00:19:45.659 "abort": false, 00:19:45.659 "seek_hole": false, 00:19:45.659 "seek_data": false, 00:19:45.659 "copy": false, 00:19:45.659 "nvme_iov_md": false 00:19:45.659 }, 00:19:45.659 "memory_domains": [ 00:19:45.659 { 00:19:45.659 "dma_device_id": "system", 00:19:45.659 "dma_device_type": 1 00:19:45.659 }, 00:19:45.659 { 00:19:45.659 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:45.659 "dma_device_type": 2 00:19:45.659 }, 00:19:45.659 { 00:19:45.659 "dma_device_id": "system", 00:19:45.659 "dma_device_type": 1 00:19:45.659 }, 00:19:45.659 { 00:19:45.659 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:45.659 "dma_device_type": 2 00:19:45.659 }, 00:19:45.659 { 00:19:45.659 "dma_device_id": "system", 00:19:45.659 "dma_device_type": 1 00:19:45.659 }, 00:19:45.659 { 00:19:45.659 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:45.659 "dma_device_type": 2 00:19:45.659 }, 00:19:45.659 { 00:19:45.659 "dma_device_id": "system", 00:19:45.659 "dma_device_type": 1 00:19:45.659 }, 00:19:45.659 { 00:19:45.659 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:45.659 "dma_device_type": 2 00:19:45.659 } 00:19:45.659 ], 00:19:45.659 "driver_specific": { 00:19:45.659 "raid": { 00:19:45.659 "uuid": "5fd80812-b897-4172-b61a-ce98fa206986", 00:19:45.659 "strip_size_kb": 64, 00:19:45.659 "state": "online", 00:19:45.659 "raid_level": "concat", 00:19:45.659 "superblock": true, 00:19:45.659 "num_base_bdevs": 4, 00:19:45.659 "num_base_bdevs_discovered": 4, 00:19:45.659 "num_base_bdevs_operational": 4, 00:19:45.659 
"base_bdevs_list": [ 00:19:45.659 { 00:19:45.659 "name": "pt1", 00:19:45.659 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:45.659 "is_configured": true, 00:19:45.659 "data_offset": 2048, 00:19:45.659 "data_size": 63488 00:19:45.659 }, 00:19:45.659 { 00:19:45.659 "name": "pt2", 00:19:45.659 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:45.659 "is_configured": true, 00:19:45.659 "data_offset": 2048, 00:19:45.659 "data_size": 63488 00:19:45.659 }, 00:19:45.659 { 00:19:45.659 "name": "pt3", 00:19:45.659 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:45.659 "is_configured": true, 00:19:45.659 "data_offset": 2048, 00:19:45.659 "data_size": 63488 00:19:45.659 }, 00:19:45.659 { 00:19:45.659 "name": "pt4", 00:19:45.659 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:45.659 "is_configured": true, 00:19:45.659 "data_offset": 2048, 00:19:45.659 "data_size": 63488 00:19:45.659 } 00:19:45.659 ] 00:19:45.659 } 00:19:45.659 } 00:19:45.659 }' 00:19:45.659 08:32:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:45.659 08:32:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:19:45.659 pt2 00:19:45.659 pt3 00:19:45.659 pt4' 00:19:45.659 08:32:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:45.659 08:32:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:19:45.659 08:32:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:45.917 08:32:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:45.917 "name": "pt1", 00:19:45.917 "aliases": [ 00:19:45.917 "00000000-0000-0000-0000-000000000001" 00:19:45.917 ], 00:19:45.917 "product_name": "passthru", 00:19:45.917 "block_size": 512, 
00:19:45.917 "num_blocks": 65536, 00:19:45.917 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:45.917 "assigned_rate_limits": { 00:19:45.917 "rw_ios_per_sec": 0, 00:19:45.917 "rw_mbytes_per_sec": 0, 00:19:45.917 "r_mbytes_per_sec": 0, 00:19:45.917 "w_mbytes_per_sec": 0 00:19:45.917 }, 00:19:45.917 "claimed": true, 00:19:45.917 "claim_type": "exclusive_write", 00:19:45.917 "zoned": false, 00:19:45.917 "supported_io_types": { 00:19:45.917 "read": true, 00:19:45.917 "write": true, 00:19:45.917 "unmap": true, 00:19:45.917 "flush": true, 00:19:45.917 "reset": true, 00:19:45.917 "nvme_admin": false, 00:19:45.917 "nvme_io": false, 00:19:45.917 "nvme_io_md": false, 00:19:45.917 "write_zeroes": true, 00:19:45.917 "zcopy": true, 00:19:45.917 "get_zone_info": false, 00:19:45.918 "zone_management": false, 00:19:45.918 "zone_append": false, 00:19:45.918 "compare": false, 00:19:45.918 "compare_and_write": false, 00:19:45.918 "abort": true, 00:19:45.918 "seek_hole": false, 00:19:45.918 "seek_data": false, 00:19:45.918 "copy": true, 00:19:45.918 "nvme_iov_md": false 00:19:45.918 }, 00:19:45.918 "memory_domains": [ 00:19:45.918 { 00:19:45.918 "dma_device_id": "system", 00:19:45.918 "dma_device_type": 1 00:19:45.918 }, 00:19:45.918 { 00:19:45.918 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:45.918 "dma_device_type": 2 00:19:45.918 } 00:19:45.918 ], 00:19:45.918 "driver_specific": { 00:19:45.918 "passthru": { 00:19:45.918 "name": "pt1", 00:19:45.918 "base_bdev_name": "malloc1" 00:19:45.918 } 00:19:45.918 } 00:19:45.918 }' 00:19:45.918 08:32:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:45.918 08:32:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:45.918 08:32:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:45.918 08:32:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:45.918 08:32:58 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:46.176 08:32:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:46.176 08:32:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:46.176 08:32:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:46.176 08:32:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:46.176 08:32:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:46.176 08:32:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:46.176 08:32:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:46.176 08:32:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:46.176 08:32:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:46.176 08:32:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:19:46.434 08:32:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:46.434 "name": "pt2", 00:19:46.434 "aliases": [ 00:19:46.434 "00000000-0000-0000-0000-000000000002" 00:19:46.434 ], 00:19:46.434 "product_name": "passthru", 00:19:46.434 "block_size": 512, 00:19:46.434 "num_blocks": 65536, 00:19:46.434 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:46.434 "assigned_rate_limits": { 00:19:46.434 "rw_ios_per_sec": 0, 00:19:46.434 "rw_mbytes_per_sec": 0, 00:19:46.434 "r_mbytes_per_sec": 0, 00:19:46.434 "w_mbytes_per_sec": 0 00:19:46.434 }, 00:19:46.434 "claimed": true, 00:19:46.434 "claim_type": "exclusive_write", 00:19:46.434 "zoned": false, 00:19:46.434 "supported_io_types": { 00:19:46.434 "read": true, 00:19:46.434 "write": true, 00:19:46.434 "unmap": true, 00:19:46.434 "flush": true, 00:19:46.434 
"reset": true, 00:19:46.434 "nvme_admin": false, 00:19:46.434 "nvme_io": false, 00:19:46.434 "nvme_io_md": false, 00:19:46.434 "write_zeroes": true, 00:19:46.434 "zcopy": true, 00:19:46.434 "get_zone_info": false, 00:19:46.434 "zone_management": false, 00:19:46.434 "zone_append": false, 00:19:46.434 "compare": false, 00:19:46.434 "compare_and_write": false, 00:19:46.434 "abort": true, 00:19:46.434 "seek_hole": false, 00:19:46.434 "seek_data": false, 00:19:46.434 "copy": true, 00:19:46.434 "nvme_iov_md": false 00:19:46.434 }, 00:19:46.434 "memory_domains": [ 00:19:46.434 { 00:19:46.434 "dma_device_id": "system", 00:19:46.434 "dma_device_type": 1 00:19:46.434 }, 00:19:46.434 { 00:19:46.434 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:46.434 "dma_device_type": 2 00:19:46.434 } 00:19:46.434 ], 00:19:46.434 "driver_specific": { 00:19:46.434 "passthru": { 00:19:46.434 "name": "pt2", 00:19:46.434 "base_bdev_name": "malloc2" 00:19:46.434 } 00:19:46.434 } 00:19:46.434 }' 00:19:46.435 08:32:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:46.435 08:32:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:46.435 08:32:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:46.435 08:32:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:46.435 08:32:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:46.435 08:32:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:46.435 08:32:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:46.693 08:32:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:46.693 08:32:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:46.693 08:32:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:46.693 08:32:59 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:46.693 08:32:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:46.693 08:32:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:46.693 08:32:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:19:46.693 08:32:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:46.951 08:32:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:46.951 "name": "pt3", 00:19:46.951 "aliases": [ 00:19:46.951 "00000000-0000-0000-0000-000000000003" 00:19:46.951 ], 00:19:46.951 "product_name": "passthru", 00:19:46.951 "block_size": 512, 00:19:46.951 "num_blocks": 65536, 00:19:46.951 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:46.951 "assigned_rate_limits": { 00:19:46.951 "rw_ios_per_sec": 0, 00:19:46.951 "rw_mbytes_per_sec": 0, 00:19:46.951 "r_mbytes_per_sec": 0, 00:19:46.951 "w_mbytes_per_sec": 0 00:19:46.951 }, 00:19:46.951 "claimed": true, 00:19:46.951 "claim_type": "exclusive_write", 00:19:46.951 "zoned": false, 00:19:46.951 "supported_io_types": { 00:19:46.951 "read": true, 00:19:46.951 "write": true, 00:19:46.951 "unmap": true, 00:19:46.951 "flush": true, 00:19:46.951 "reset": true, 00:19:46.951 "nvme_admin": false, 00:19:46.951 "nvme_io": false, 00:19:46.951 "nvme_io_md": false, 00:19:46.951 "write_zeroes": true, 00:19:46.951 "zcopy": true, 00:19:46.951 "get_zone_info": false, 00:19:46.951 "zone_management": false, 00:19:46.951 "zone_append": false, 00:19:46.951 "compare": false, 00:19:46.951 "compare_and_write": false, 00:19:46.951 "abort": true, 00:19:46.951 "seek_hole": false, 00:19:46.951 "seek_data": false, 00:19:46.951 "copy": true, 00:19:46.951 "nvme_iov_md": false 00:19:46.951 }, 00:19:46.951 "memory_domains": [ 
00:19:46.951 { 00:19:46.951 "dma_device_id": "system", 00:19:46.951 "dma_device_type": 1 00:19:46.951 }, 00:19:46.951 { 00:19:46.951 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:46.951 "dma_device_type": 2 00:19:46.952 } 00:19:46.952 ], 00:19:46.952 "driver_specific": { 00:19:46.952 "passthru": { 00:19:46.952 "name": "pt3", 00:19:46.952 "base_bdev_name": "malloc3" 00:19:46.952 } 00:19:46.952 } 00:19:46.952 }' 00:19:46.952 08:32:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:46.952 08:32:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:46.952 08:32:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:46.952 08:32:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:46.952 08:32:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:46.952 08:32:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:46.952 08:32:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:46.952 08:32:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:47.210 08:32:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:47.210 08:32:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:47.210 08:32:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:47.210 08:32:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:47.210 08:32:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:47.210 08:32:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:19:47.210 08:32:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq 
'.[]' 00:19:47.210 08:32:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:47.210 "name": "pt4", 00:19:47.210 "aliases": [ 00:19:47.210 "00000000-0000-0000-0000-000000000004" 00:19:47.210 ], 00:19:47.210 "product_name": "passthru", 00:19:47.210 "block_size": 512, 00:19:47.210 "num_blocks": 65536, 00:19:47.210 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:47.210 "assigned_rate_limits": { 00:19:47.210 "rw_ios_per_sec": 0, 00:19:47.210 "rw_mbytes_per_sec": 0, 00:19:47.210 "r_mbytes_per_sec": 0, 00:19:47.210 "w_mbytes_per_sec": 0 00:19:47.210 }, 00:19:47.210 "claimed": true, 00:19:47.210 "claim_type": "exclusive_write", 00:19:47.210 "zoned": false, 00:19:47.210 "supported_io_types": { 00:19:47.210 "read": true, 00:19:47.210 "write": true, 00:19:47.210 "unmap": true, 00:19:47.210 "flush": true, 00:19:47.210 "reset": true, 00:19:47.210 "nvme_admin": false, 00:19:47.210 "nvme_io": false, 00:19:47.210 "nvme_io_md": false, 00:19:47.210 "write_zeroes": true, 00:19:47.210 "zcopy": true, 00:19:47.210 "get_zone_info": false, 00:19:47.210 "zone_management": false, 00:19:47.210 "zone_append": false, 00:19:47.210 "compare": false, 00:19:47.210 "compare_and_write": false, 00:19:47.210 "abort": true, 00:19:47.210 "seek_hole": false, 00:19:47.210 "seek_data": false, 00:19:47.210 "copy": true, 00:19:47.210 "nvme_iov_md": false 00:19:47.210 }, 00:19:47.210 "memory_domains": [ 00:19:47.210 { 00:19:47.210 "dma_device_id": "system", 00:19:47.210 "dma_device_type": 1 00:19:47.210 }, 00:19:47.210 { 00:19:47.210 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:47.210 "dma_device_type": 2 00:19:47.210 } 00:19:47.210 ], 00:19:47.210 "driver_specific": { 00:19:47.210 "passthru": { 00:19:47.210 "name": "pt4", 00:19:47.210 "base_bdev_name": "malloc4" 00:19:47.210 } 00:19:47.210 } 00:19:47.210 }' 00:19:47.210 08:32:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:47.513 08:32:59 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:47.513 08:32:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:47.513 08:32:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:47.513 08:32:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:47.513 08:32:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:47.513 08:32:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:47.513 08:32:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:47.513 08:32:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:47.513 08:32:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:47.513 08:32:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:47.771 08:33:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:47.771 08:33:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:47.771 08:33:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:19:47.771 [2024-07-23 08:33:00.197357] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:47.771 08:33:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=5fd80812-b897-4172-b61a-ce98fa206986 00:19:47.771 08:33:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 5fd80812-b897-4172-b61a-ce98fa206986 ']' 00:19:47.771 08:33:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:48.029 [2024-07-23 08:33:00.385537] 
bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:48.029 [2024-07-23 08:33:00.385564] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:48.029 [2024-07-23 08:33:00.385651] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:48.029 [2024-07-23 08:33:00.385727] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:48.029 [2024-07-23 08:33:00.385741] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000037280 name raid_bdev1, state offline 00:19:48.029 08:33:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:48.029 08:33:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:19:48.287 08:33:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:19:48.287 08:33:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:19:48.287 08:33:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:48.287 08:33:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:19:48.287 08:33:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:48.287 08:33:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:48.546 08:33:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:48.546 08:33:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:19:48.546 08:33:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:48.546 08:33:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:19:48.805 08:33:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:19:48.805 08:33:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:19:49.064 08:33:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:19:49.064 08:33:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:49.064 08:33:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:19:49.064 08:33:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:49.064 08:33:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:49.064 08:33:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:49.064 08:33:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:49.064 08:33:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t 
"$arg")" in 00:19:49.064 08:33:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:49.064 08:33:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:49.064 08:33:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:49.064 08:33:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:19:49.064 08:33:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:49.064 [2024-07-23 08:33:01.476401] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:19:49.064 [2024-07-23 08:33:01.477992] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:19:49.064 [2024-07-23 08:33:01.478036] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:19:49.064 [2024-07-23 08:33:01.478067] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:19:49.064 [2024-07-23 08:33:01.478111] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:19:49.064 [2024-07-23 08:33:01.478154] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:19:49.064 [2024-07-23 08:33:01.478190] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:19:49.064 [2024-07-23 08:33:01.478209] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock 
of a different raid bdev found on bdev malloc4 00:19:49.064 [2024-07-23 08:33:01.478223] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:49.064 [2024-07-23 08:33:01.478237] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000037880 name raid_bdev1, state configuring 00:19:49.064 request: 00:19:49.064 { 00:19:49.064 "name": "raid_bdev1", 00:19:49.064 "raid_level": "concat", 00:19:49.064 "base_bdevs": [ 00:19:49.064 "malloc1", 00:19:49.064 "malloc2", 00:19:49.064 "malloc3", 00:19:49.064 "malloc4" 00:19:49.064 ], 00:19:49.064 "strip_size_kb": 64, 00:19:49.065 "superblock": false, 00:19:49.065 "method": "bdev_raid_create", 00:19:49.065 "req_id": 1 00:19:49.065 } 00:19:49.065 Got JSON-RPC error response 00:19:49.065 response: 00:19:49.065 { 00:19:49.065 "code": -17, 00:19:49.065 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:19:49.065 } 00:19:49.065 08:33:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:19:49.065 08:33:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:19:49.065 08:33:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:19:49.065 08:33:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:19:49.065 08:33:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:19:49.065 08:33:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:49.325 08:33:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:19:49.325 08:33:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:19:49.325 08:33:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:19:49.325 [2024-07-23 08:33:01.809213] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:49.325 [2024-07-23 08:33:01.809286] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:49.326 [2024-07-23 08:33:01.809305] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000037e80 00:19:49.326 [2024-07-23 08:33:01.809315] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:49.326 [2024-07-23 08:33:01.811308] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:49.326 [2024-07-23 08:33:01.811338] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:49.326 [2024-07-23 08:33:01.811416] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:19:49.326 [2024-07-23 08:33:01.811474] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:19:49.326 pt1 00:19:49.326 08:33:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:19:49.326 08:33:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:49.326 08:33:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:49.326 08:33:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:49.326 08:33:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:49.326 08:33:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:49.326 08:33:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:49.326 08:33:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:49.326 08:33:01 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:49.326 08:33:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:49.326 08:33:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:49.326 08:33:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:49.585 08:33:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:49.585 "name": "raid_bdev1", 00:19:49.585 "uuid": "5fd80812-b897-4172-b61a-ce98fa206986", 00:19:49.585 "strip_size_kb": 64, 00:19:49.585 "state": "configuring", 00:19:49.585 "raid_level": "concat", 00:19:49.585 "superblock": true, 00:19:49.585 "num_base_bdevs": 4, 00:19:49.585 "num_base_bdevs_discovered": 1, 00:19:49.585 "num_base_bdevs_operational": 4, 00:19:49.585 "base_bdevs_list": [ 00:19:49.585 { 00:19:49.585 "name": "pt1", 00:19:49.585 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:49.585 "is_configured": true, 00:19:49.585 "data_offset": 2048, 00:19:49.585 "data_size": 63488 00:19:49.585 }, 00:19:49.585 { 00:19:49.585 "name": null, 00:19:49.585 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:49.585 "is_configured": false, 00:19:49.585 "data_offset": 2048, 00:19:49.585 "data_size": 63488 00:19:49.585 }, 00:19:49.585 { 00:19:49.585 "name": null, 00:19:49.585 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:49.585 "is_configured": false, 00:19:49.585 "data_offset": 2048, 00:19:49.585 "data_size": 63488 00:19:49.585 }, 00:19:49.585 { 00:19:49.585 "name": null, 00:19:49.585 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:49.585 "is_configured": false, 00:19:49.585 "data_offset": 2048, 00:19:49.585 "data_size": 63488 00:19:49.585 } 00:19:49.585 ] 00:19:49.585 }' 00:19:49.585 08:33:01 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:49.585 08:33:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:50.150 08:33:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:19:50.150 08:33:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:50.150 [2024-07-23 08:33:02.611331] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:50.150 [2024-07-23 08:33:02.611384] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:50.150 [2024-07-23 08:33:02.611402] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000038780 00:19:50.150 [2024-07-23 08:33:02.611413] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:50.150 [2024-07-23 08:33:02.611832] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:50.150 [2024-07-23 08:33:02.611853] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:50.150 [2024-07-23 08:33:02.611924] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:19:50.150 [2024-07-23 08:33:02.611949] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:50.150 pt2 00:19:50.150 08:33:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:50.408 [2024-07-23 08:33:02.791822] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:19:50.408 08:33:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:19:50.408 08:33:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # 
local raid_bdev_name=raid_bdev1 00:19:50.408 08:33:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:50.408 08:33:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:50.408 08:33:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:50.408 08:33:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:50.408 08:33:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:50.408 08:33:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:50.408 08:33:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:50.408 08:33:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:50.409 08:33:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:50.409 08:33:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:50.667 08:33:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:50.667 "name": "raid_bdev1", 00:19:50.667 "uuid": "5fd80812-b897-4172-b61a-ce98fa206986", 00:19:50.667 "strip_size_kb": 64, 00:19:50.667 "state": "configuring", 00:19:50.667 "raid_level": "concat", 00:19:50.667 "superblock": true, 00:19:50.667 "num_base_bdevs": 4, 00:19:50.667 "num_base_bdevs_discovered": 1, 00:19:50.667 "num_base_bdevs_operational": 4, 00:19:50.667 "base_bdevs_list": [ 00:19:50.667 { 00:19:50.667 "name": "pt1", 00:19:50.667 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:50.667 "is_configured": true, 00:19:50.667 "data_offset": 2048, 00:19:50.667 "data_size": 63488 00:19:50.667 }, 00:19:50.667 { 00:19:50.667 "name": null, 00:19:50.667 "uuid": 
"00000000-0000-0000-0000-000000000002", 00:19:50.667 "is_configured": false, 00:19:50.667 "data_offset": 2048, 00:19:50.667 "data_size": 63488 00:19:50.667 }, 00:19:50.667 { 00:19:50.667 "name": null, 00:19:50.667 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:50.667 "is_configured": false, 00:19:50.667 "data_offset": 2048, 00:19:50.667 "data_size": 63488 00:19:50.667 }, 00:19:50.667 { 00:19:50.667 "name": null, 00:19:50.667 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:50.667 "is_configured": false, 00:19:50.667 "data_offset": 2048, 00:19:50.667 "data_size": 63488 00:19:50.667 } 00:19:50.667 ] 00:19:50.667 }' 00:19:50.667 08:33:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:50.667 08:33:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:51.241 08:33:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:19:51.241 08:33:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:51.241 08:33:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:51.241 [2024-07-23 08:33:03.646052] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:51.241 [2024-07-23 08:33:03.646114] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:51.241 [2024-07-23 08:33:03.646132] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000038a80 00:19:51.241 [2024-07-23 08:33:03.646142] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:51.241 [2024-07-23 08:33:03.646575] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:51.241 [2024-07-23 08:33:03.646591] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev 
for: pt2 00:19:51.241 [2024-07-23 08:33:03.646678] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:19:51.241 [2024-07-23 08:33:03.646701] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:51.241 pt2 00:19:51.241 08:33:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:19:51.241 08:33:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:51.241 08:33:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:51.498 [2024-07-23 08:33:03.810507] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:51.498 [2024-07-23 08:33:03.810569] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:51.498 [2024-07-23 08:33:03.810620] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000038d80 00:19:51.498 [2024-07-23 08:33:03.810631] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:51.498 [2024-07-23 08:33:03.811144] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:51.498 [2024-07-23 08:33:03.811164] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:51.498 [2024-07-23 08:33:03.811240] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:19:51.498 [2024-07-23 08:33:03.811260] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:51.498 pt3 00:19:51.498 08:33:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:19:51.498 08:33:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:51.498 08:33:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:19:51.498 [2024-07-23 08:33:03.974939] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:19:51.498 [2024-07-23 08:33:03.975001] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:51.498 [2024-07-23 08:33:03.975022] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000039080 00:19:51.498 [2024-07-23 08:33:03.975031] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:51.498 [2024-07-23 08:33:03.975468] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:51.498 [2024-07-23 08:33:03.975485] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:19:51.498 [2024-07-23 08:33:03.975561] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:19:51.498 [2024-07-23 08:33:03.975589] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:19:51.498 [2024-07-23 08:33:03.975768] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000038480 00:19:51.498 [2024-07-23 08:33:03.975778] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:51.498 [2024-07-23 08:33:03.976006] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c060 00:19:51.498 [2024-07-23 08:33:03.976195] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000038480 00:19:51.498 [2024-07-23 08:33:03.976207] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000038480 00:19:51.498 [2024-07-23 08:33:03.976351] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:51.498 pt4 00:19:51.498 08:33:03 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@477 -- # (( i++ )) 00:19:51.498 08:33:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:51.498 08:33:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:19:51.498 08:33:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:51.498 08:33:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:51.498 08:33:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:51.498 08:33:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:51.498 08:33:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:51.498 08:33:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:51.498 08:33:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:51.498 08:33:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:51.498 08:33:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:51.498 08:33:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:51.498 08:33:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:51.757 08:33:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:51.757 "name": "raid_bdev1", 00:19:51.757 "uuid": "5fd80812-b897-4172-b61a-ce98fa206986", 00:19:51.757 "strip_size_kb": 64, 00:19:51.757 "state": "online", 00:19:51.757 "raid_level": "concat", 00:19:51.757 "superblock": true, 00:19:51.757 "num_base_bdevs": 4, 00:19:51.757 "num_base_bdevs_discovered": 4, 00:19:51.757 
"num_base_bdevs_operational": 4, 00:19:51.757 "base_bdevs_list": [ 00:19:51.757 { 00:19:51.757 "name": "pt1", 00:19:51.757 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:51.757 "is_configured": true, 00:19:51.757 "data_offset": 2048, 00:19:51.757 "data_size": 63488 00:19:51.757 }, 00:19:51.757 { 00:19:51.757 "name": "pt2", 00:19:51.757 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:51.757 "is_configured": true, 00:19:51.757 "data_offset": 2048, 00:19:51.757 "data_size": 63488 00:19:51.757 }, 00:19:51.757 { 00:19:51.757 "name": "pt3", 00:19:51.757 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:51.757 "is_configured": true, 00:19:51.757 "data_offset": 2048, 00:19:51.757 "data_size": 63488 00:19:51.757 }, 00:19:51.757 { 00:19:51.757 "name": "pt4", 00:19:51.757 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:51.757 "is_configured": true, 00:19:51.757 "data_offset": 2048, 00:19:51.757 "data_size": 63488 00:19:51.757 } 00:19:51.757 ] 00:19:51.757 }' 00:19:51.757 08:33:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:51.757 08:33:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:52.324 08:33:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:19:52.324 08:33:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:19:52.324 08:33:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:52.324 08:33:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:52.324 08:33:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:52.324 08:33:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:52.324 08:33:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b raid_bdev1 00:19:52.324 08:33:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:52.324 [2024-07-23 08:33:04.841543] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:52.583 08:33:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:52.583 "name": "raid_bdev1", 00:19:52.583 "aliases": [ 00:19:52.583 "5fd80812-b897-4172-b61a-ce98fa206986" 00:19:52.583 ], 00:19:52.583 "product_name": "Raid Volume", 00:19:52.583 "block_size": 512, 00:19:52.583 "num_blocks": 253952, 00:19:52.583 "uuid": "5fd80812-b897-4172-b61a-ce98fa206986", 00:19:52.583 "assigned_rate_limits": { 00:19:52.583 "rw_ios_per_sec": 0, 00:19:52.583 "rw_mbytes_per_sec": 0, 00:19:52.583 "r_mbytes_per_sec": 0, 00:19:52.583 "w_mbytes_per_sec": 0 00:19:52.583 }, 00:19:52.583 "claimed": false, 00:19:52.583 "zoned": false, 00:19:52.583 "supported_io_types": { 00:19:52.583 "read": true, 00:19:52.583 "write": true, 00:19:52.583 "unmap": true, 00:19:52.583 "flush": true, 00:19:52.583 "reset": true, 00:19:52.583 "nvme_admin": false, 00:19:52.583 "nvme_io": false, 00:19:52.583 "nvme_io_md": false, 00:19:52.583 "write_zeroes": true, 00:19:52.583 "zcopy": false, 00:19:52.583 "get_zone_info": false, 00:19:52.583 "zone_management": false, 00:19:52.583 "zone_append": false, 00:19:52.583 "compare": false, 00:19:52.583 "compare_and_write": false, 00:19:52.583 "abort": false, 00:19:52.583 "seek_hole": false, 00:19:52.583 "seek_data": false, 00:19:52.583 "copy": false, 00:19:52.583 "nvme_iov_md": false 00:19:52.583 }, 00:19:52.583 "memory_domains": [ 00:19:52.583 { 00:19:52.583 "dma_device_id": "system", 00:19:52.583 "dma_device_type": 1 00:19:52.583 }, 00:19:52.583 { 00:19:52.583 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:52.583 "dma_device_type": 2 00:19:52.583 }, 00:19:52.583 { 00:19:52.583 "dma_device_id": "system", 00:19:52.583 "dma_device_type": 1 00:19:52.583 }, 00:19:52.583 { 00:19:52.583 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:52.583 "dma_device_type": 2 00:19:52.583 }, 00:19:52.583 { 00:19:52.583 "dma_device_id": "system", 00:19:52.583 "dma_device_type": 1 00:19:52.583 }, 00:19:52.583 { 00:19:52.583 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:52.583 "dma_device_type": 2 00:19:52.583 }, 00:19:52.583 { 00:19:52.583 "dma_device_id": "system", 00:19:52.583 "dma_device_type": 1 00:19:52.583 }, 00:19:52.583 { 00:19:52.583 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:52.583 "dma_device_type": 2 00:19:52.583 } 00:19:52.583 ], 00:19:52.583 "driver_specific": { 00:19:52.583 "raid": { 00:19:52.583 "uuid": "5fd80812-b897-4172-b61a-ce98fa206986", 00:19:52.583 "strip_size_kb": 64, 00:19:52.583 "state": "online", 00:19:52.583 "raid_level": "concat", 00:19:52.583 "superblock": true, 00:19:52.583 "num_base_bdevs": 4, 00:19:52.583 "num_base_bdevs_discovered": 4, 00:19:52.583 "num_base_bdevs_operational": 4, 00:19:52.583 "base_bdevs_list": [ 00:19:52.583 { 00:19:52.583 "name": "pt1", 00:19:52.583 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:52.583 "is_configured": true, 00:19:52.583 "data_offset": 2048, 00:19:52.583 "data_size": 63488 00:19:52.583 }, 00:19:52.583 { 00:19:52.583 "name": "pt2", 00:19:52.583 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:52.583 "is_configured": true, 00:19:52.583 "data_offset": 2048, 00:19:52.583 "data_size": 63488 00:19:52.583 }, 00:19:52.583 { 00:19:52.583 "name": "pt3", 00:19:52.583 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:52.583 "is_configured": true, 00:19:52.583 "data_offset": 2048, 00:19:52.583 "data_size": 63488 00:19:52.583 }, 00:19:52.583 { 00:19:52.583 "name": "pt4", 00:19:52.583 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:52.583 "is_configured": true, 00:19:52.583 "data_offset": 2048, 00:19:52.583 "data_size": 63488 00:19:52.583 } 00:19:52.583 ] 00:19:52.583 } 00:19:52.583 } 00:19:52.583 }' 00:19:52.583 08:33:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- 
# jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:52.583 08:33:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:19:52.583 pt2 00:19:52.583 pt3 00:19:52.583 pt4' 00:19:52.583 08:33:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:52.583 08:33:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:19:52.583 08:33:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:52.583 08:33:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:52.583 "name": "pt1", 00:19:52.583 "aliases": [ 00:19:52.583 "00000000-0000-0000-0000-000000000001" 00:19:52.583 ], 00:19:52.583 "product_name": "passthru", 00:19:52.583 "block_size": 512, 00:19:52.583 "num_blocks": 65536, 00:19:52.583 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:52.583 "assigned_rate_limits": { 00:19:52.583 "rw_ios_per_sec": 0, 00:19:52.583 "rw_mbytes_per_sec": 0, 00:19:52.583 "r_mbytes_per_sec": 0, 00:19:52.583 "w_mbytes_per_sec": 0 00:19:52.583 }, 00:19:52.583 "claimed": true, 00:19:52.583 "claim_type": "exclusive_write", 00:19:52.583 "zoned": false, 00:19:52.583 "supported_io_types": { 00:19:52.583 "read": true, 00:19:52.583 "write": true, 00:19:52.583 "unmap": true, 00:19:52.583 "flush": true, 00:19:52.583 "reset": true, 00:19:52.583 "nvme_admin": false, 00:19:52.583 "nvme_io": false, 00:19:52.583 "nvme_io_md": false, 00:19:52.583 "write_zeroes": true, 00:19:52.583 "zcopy": true, 00:19:52.583 "get_zone_info": false, 00:19:52.583 "zone_management": false, 00:19:52.583 "zone_append": false, 00:19:52.583 "compare": false, 00:19:52.583 "compare_and_write": false, 00:19:52.583 "abort": true, 00:19:52.583 "seek_hole": false, 00:19:52.583 "seek_data": false, 00:19:52.583 "copy": true, 00:19:52.583 
"nvme_iov_md": false 00:19:52.583 }, 00:19:52.584 "memory_domains": [ 00:19:52.584 { 00:19:52.584 "dma_device_id": "system", 00:19:52.584 "dma_device_type": 1 00:19:52.584 }, 00:19:52.584 { 00:19:52.584 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:52.584 "dma_device_type": 2 00:19:52.584 } 00:19:52.584 ], 00:19:52.584 "driver_specific": { 00:19:52.584 "passthru": { 00:19:52.584 "name": "pt1", 00:19:52.584 "base_bdev_name": "malloc1" 00:19:52.584 } 00:19:52.584 } 00:19:52.584 }' 00:19:52.584 08:33:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:52.842 08:33:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:52.842 08:33:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:52.842 08:33:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:52.842 08:33:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:52.842 08:33:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:52.842 08:33:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:52.842 08:33:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:52.842 08:33:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:52.842 08:33:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:52.842 08:33:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:53.101 08:33:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:53.101 08:33:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:53.101 08:33:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:53.101 08:33:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:19:53.101 08:33:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:53.101 "name": "pt2", 00:19:53.101 "aliases": [ 00:19:53.101 "00000000-0000-0000-0000-000000000002" 00:19:53.101 ], 00:19:53.101 "product_name": "passthru", 00:19:53.101 "block_size": 512, 00:19:53.101 "num_blocks": 65536, 00:19:53.101 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:53.101 "assigned_rate_limits": { 00:19:53.101 "rw_ios_per_sec": 0, 00:19:53.101 "rw_mbytes_per_sec": 0, 00:19:53.101 "r_mbytes_per_sec": 0, 00:19:53.101 "w_mbytes_per_sec": 0 00:19:53.101 }, 00:19:53.101 "claimed": true, 00:19:53.101 "claim_type": "exclusive_write", 00:19:53.101 "zoned": false, 00:19:53.101 "supported_io_types": { 00:19:53.101 "read": true, 00:19:53.101 "write": true, 00:19:53.101 "unmap": true, 00:19:53.101 "flush": true, 00:19:53.101 "reset": true, 00:19:53.101 "nvme_admin": false, 00:19:53.101 "nvme_io": false, 00:19:53.101 "nvme_io_md": false, 00:19:53.101 "write_zeroes": true, 00:19:53.101 "zcopy": true, 00:19:53.101 "get_zone_info": false, 00:19:53.101 "zone_management": false, 00:19:53.101 "zone_append": false, 00:19:53.101 "compare": false, 00:19:53.101 "compare_and_write": false, 00:19:53.101 "abort": true, 00:19:53.101 "seek_hole": false, 00:19:53.101 "seek_data": false, 00:19:53.101 "copy": true, 00:19:53.101 "nvme_iov_md": false 00:19:53.101 }, 00:19:53.101 "memory_domains": [ 00:19:53.101 { 00:19:53.101 "dma_device_id": "system", 00:19:53.101 "dma_device_type": 1 00:19:53.101 }, 00:19:53.102 { 00:19:53.102 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:53.102 "dma_device_type": 2 00:19:53.102 } 00:19:53.102 ], 00:19:53.102 "driver_specific": { 00:19:53.102 "passthru": { 00:19:53.102 "name": "pt2", 00:19:53.102 "base_bdev_name": "malloc2" 00:19:53.102 } 00:19:53.102 } 00:19:53.102 }' 00:19:53.102 08:33:05 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:53.102 08:33:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:53.102 08:33:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:53.102 08:33:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:53.360 08:33:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:53.360 08:33:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:53.360 08:33:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:53.360 08:33:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:53.360 08:33:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:53.360 08:33:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:53.360 08:33:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:53.360 08:33:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:53.360 08:33:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:53.360 08:33:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:19:53.360 08:33:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:53.618 08:33:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:53.618 "name": "pt3", 00:19:53.618 "aliases": [ 00:19:53.618 "00000000-0000-0000-0000-000000000003" 00:19:53.618 ], 00:19:53.618 "product_name": "passthru", 00:19:53.618 "block_size": 512, 00:19:53.618 "num_blocks": 65536, 00:19:53.618 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:53.618 "assigned_rate_limits": { 00:19:53.618 "rw_ios_per_sec": 0, 
00:19:53.618 "rw_mbytes_per_sec": 0, 00:19:53.618 "r_mbytes_per_sec": 0, 00:19:53.618 "w_mbytes_per_sec": 0 00:19:53.618 }, 00:19:53.618 "claimed": true, 00:19:53.618 "claim_type": "exclusive_write", 00:19:53.618 "zoned": false, 00:19:53.618 "supported_io_types": { 00:19:53.618 "read": true, 00:19:53.618 "write": true, 00:19:53.618 "unmap": true, 00:19:53.618 "flush": true, 00:19:53.618 "reset": true, 00:19:53.618 "nvme_admin": false, 00:19:53.618 "nvme_io": false, 00:19:53.618 "nvme_io_md": false, 00:19:53.618 "write_zeroes": true, 00:19:53.618 "zcopy": true, 00:19:53.618 "get_zone_info": false, 00:19:53.618 "zone_management": false, 00:19:53.618 "zone_append": false, 00:19:53.618 "compare": false, 00:19:53.618 "compare_and_write": false, 00:19:53.618 "abort": true, 00:19:53.618 "seek_hole": false, 00:19:53.618 "seek_data": false, 00:19:53.618 "copy": true, 00:19:53.618 "nvme_iov_md": false 00:19:53.618 }, 00:19:53.618 "memory_domains": [ 00:19:53.618 { 00:19:53.619 "dma_device_id": "system", 00:19:53.619 "dma_device_type": 1 00:19:53.619 }, 00:19:53.619 { 00:19:53.619 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:53.619 "dma_device_type": 2 00:19:53.619 } 00:19:53.619 ], 00:19:53.619 "driver_specific": { 00:19:53.619 "passthru": { 00:19:53.619 "name": "pt3", 00:19:53.619 "base_bdev_name": "malloc3" 00:19:53.619 } 00:19:53.619 } 00:19:53.619 }' 00:19:53.619 08:33:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:53.619 08:33:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:53.619 08:33:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:53.619 08:33:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:53.877 08:33:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:53.877 08:33:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:53.877 08:33:06 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:53.877 08:33:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:53.877 08:33:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:53.877 08:33:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:53.877 08:33:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:53.877 08:33:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:53.877 08:33:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:53.877 08:33:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:53.877 08:33:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:19:54.136 08:33:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:54.136 "name": "pt4", 00:19:54.136 "aliases": [ 00:19:54.136 "00000000-0000-0000-0000-000000000004" 00:19:54.136 ], 00:19:54.136 "product_name": "passthru", 00:19:54.136 "block_size": 512, 00:19:54.136 "num_blocks": 65536, 00:19:54.136 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:54.136 "assigned_rate_limits": { 00:19:54.136 "rw_ios_per_sec": 0, 00:19:54.136 "rw_mbytes_per_sec": 0, 00:19:54.136 "r_mbytes_per_sec": 0, 00:19:54.136 "w_mbytes_per_sec": 0 00:19:54.136 }, 00:19:54.136 "claimed": true, 00:19:54.136 "claim_type": "exclusive_write", 00:19:54.136 "zoned": false, 00:19:54.136 "supported_io_types": { 00:19:54.136 "read": true, 00:19:54.136 "write": true, 00:19:54.136 "unmap": true, 00:19:54.136 "flush": true, 00:19:54.136 "reset": true, 00:19:54.136 "nvme_admin": false, 00:19:54.136 "nvme_io": false, 00:19:54.136 "nvme_io_md": false, 00:19:54.136 "write_zeroes": true, 00:19:54.136 "zcopy": 
true, 00:19:54.136 "get_zone_info": false, 00:19:54.136 "zone_management": false, 00:19:54.136 "zone_append": false, 00:19:54.136 "compare": false, 00:19:54.136 "compare_and_write": false, 00:19:54.136 "abort": true, 00:19:54.136 "seek_hole": false, 00:19:54.136 "seek_data": false, 00:19:54.136 "copy": true, 00:19:54.136 "nvme_iov_md": false 00:19:54.136 }, 00:19:54.136 "memory_domains": [ 00:19:54.136 { 00:19:54.136 "dma_device_id": "system", 00:19:54.136 "dma_device_type": 1 00:19:54.136 }, 00:19:54.136 { 00:19:54.136 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:54.136 "dma_device_type": 2 00:19:54.136 } 00:19:54.136 ], 00:19:54.136 "driver_specific": { 00:19:54.136 "passthru": { 00:19:54.136 "name": "pt4", 00:19:54.136 "base_bdev_name": "malloc4" 00:19:54.136 } 00:19:54.136 } 00:19:54.136 }' 00:19:54.136 08:33:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:54.136 08:33:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:54.136 08:33:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:54.136 08:33:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:54.136 08:33:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:54.136 08:33:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:54.136 08:33:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:54.395 08:33:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:54.395 08:33:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:54.395 08:33:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:54.395 08:33:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:54.395 08:33:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 
00:19:54.395 08:33:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:54.395 08:33:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:19:54.395 [2024-07-23 08:33:06.903007] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:54.653 08:33:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 5fd80812-b897-4172-b61a-ce98fa206986 '!=' 5fd80812-b897-4172-b61a-ce98fa206986 ']' 00:19:54.653 08:33:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:19:54.653 08:33:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:54.654 08:33:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:54.654 08:33:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1498500 00:19:54.654 08:33:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1498500 ']' 00:19:54.654 08:33:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1498500 00:19:54.654 08:33:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:19:54.654 08:33:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:54.654 08:33:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1498500 00:19:54.654 08:33:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:54.654 08:33:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:54.654 08:33:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1498500' 00:19:54.654 killing process with pid 1498500 00:19:54.654 08:33:06 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1498500 00:19:54.654 [2024-07-23 08:33:06.969252] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:54.654 [2024-07-23 08:33:06.969336] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:54.654 08:33:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1498500 00:19:54.654 [2024-07-23 08:33:06.969405] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:54.654 [2024-07-23 08:33:06.969416] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000038480 name raid_bdev1, state offline 00:19:54.912 [2024-07-23 08:33:07.332759] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:56.288 08:33:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:19:56.288 00:19:56.288 real 0m13.858s 00:19:56.288 user 0m23.822s 00:19:56.288 sys 0m2.047s 00:19:56.288 08:33:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:56.288 08:33:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:56.288 ************************************ 00:19:56.288 END TEST raid_superblock_test 00:19:56.288 ************************************ 00:19:56.288 08:33:08 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:56.288 08:33:08 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 4 read 00:19:56.288 08:33:08 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:56.288 08:33:08 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:56.288 08:33:08 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:56.288 ************************************ 00:19:56.288 START TEST raid_read_error_test 00:19:56.288 ************************************ 00:19:56.288 08:33:08 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 4 read 00:19:56.288 08:33:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:19:56.289 08:33:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:19:56.289 08:33:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:19:56.289 08:33:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:19:56.289 08:33:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:56.289 08:33:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:19:56.289 08:33:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:56.289 08:33:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:56.289 08:33:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:19:56.289 08:33:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:56.289 08:33:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:56.289 08:33:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:19:56.289 08:33:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:56.289 08:33:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:56.289 08:33:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:19:56.289 08:33:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:56.289 08:33:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:56.289 08:33:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:56.289 08:33:08 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:19:56.289 08:33:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:19:56.289 08:33:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:19:56.289 08:33:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:19:56.289 08:33:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:19:56.289 08:33:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:19:56.289 08:33:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:19:56.289 08:33:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:19:56.289 08:33:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:19:56.289 08:33:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:19:56.289 08:33:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.GUDBCpFfp3 00:19:56.289 08:33:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1501486 00:19:56.289 08:33:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1501486 /var/tmp/spdk-raid.sock 00:19:56.289 08:33:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:19:56.289 08:33:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1501486 ']' 00:19:56.289 08:33:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:56.289 08:33:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:56.289 08:33:08 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:56.289 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:56.289 08:33:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:56.289 08:33:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:56.289 [2024-07-23 08:33:08.721370] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:19:56.289 [2024-07-23 08:33:08.721458] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1501486 ] 00:19:56.547 [2024-07-23 08:33:08.846537] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:56.805 [2024-07-23 08:33:09.072253] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:56.805 [2024-07-23 08:33:09.321353] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:56.805 [2024-07-23 08:33:09.321386] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:57.064 08:33:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:57.064 08:33:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:19:57.064 08:33:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:57.064 08:33:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:57.323 BaseBdev1_malloc 00:19:57.323 08:33:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:19:57.581 true 00:19:57.581 08:33:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:19:57.581 [2024-07-23 08:33:10.000671] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:19:57.581 [2024-07-23 08:33:10.000732] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:57.581 [2024-07-23 08:33:10.000754] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034b80 00:19:57.581 [2024-07-23 08:33:10.000766] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:57.581 [2024-07-23 08:33:10.002904] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:57.581 [2024-07-23 08:33:10.002937] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:57.581 BaseBdev1 00:19:57.582 08:33:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:57.582 08:33:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:57.840 BaseBdev2_malloc 00:19:57.840 08:33:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:19:58.098 true 00:19:58.099 08:33:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:19:58.099 [2024-07-23 08:33:10.528270] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:19:58.099 [2024-07-23 08:33:10.528319] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:58.099 [2024-07-23 08:33:10.528338] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035a80 00:19:58.099 [2024-07-23 08:33:10.528351] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:58.099 [2024-07-23 08:33:10.530326] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:58.099 [2024-07-23 08:33:10.530356] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:58.099 BaseBdev2 00:19:58.099 08:33:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:58.099 08:33:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:19:58.357 BaseBdev3_malloc 00:19:58.357 08:33:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:19:58.615 true 00:19:58.615 08:33:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:19:58.615 [2024-07-23 08:33:11.036142] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:19:58.615 [2024-07-23 08:33:11.036195] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:58.615 [2024-07-23 08:33:11.036215] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036980 00:19:58.615 [2024-07-23 08:33:11.036227] vbdev_passthru.c: 
696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:58.615 [2024-07-23 08:33:11.038229] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:58.616 [2024-07-23 08:33:11.038258] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:19:58.616 BaseBdev3 00:19:58.616 08:33:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:58.616 08:33:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:19:58.873 BaseBdev4_malloc 00:19:58.873 08:33:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:19:59.131 true 00:19:59.131 08:33:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:19:59.131 [2024-07-23 08:33:11.608621] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:19:59.131 [2024-07-23 08:33:11.608673] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:59.131 [2024-07-23 08:33:11.608710] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000037880 00:19:59.132 [2024-07-23 08:33:11.608721] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:59.132 [2024-07-23 08:33:11.610627] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:59.132 [2024-07-23 08:33:11.610656] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:19:59.132 BaseBdev4 00:19:59.132 08:33:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:19:59.411 [2024-07-23 08:33:11.777117] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:59.411 [2024-07-23 08:33:11.778699] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:59.411 [2024-07-23 08:33:11.778771] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:59.411 [2024-07-23 08:33:11.778832] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:59.411 [2024-07-23 08:33:11.779061] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000037e80 00:19:59.411 [2024-07-23 08:33:11.779075] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:59.411 [2024-07-23 08:33:11.779321] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bf90 00:19:59.411 [2024-07-23 08:33:11.779532] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000037e80 00:19:59.411 [2024-07-23 08:33:11.779543] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000037e80 00:19:59.411 [2024-07-23 08:33:11.779716] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:59.411 08:33:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:19:59.411 08:33:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:59.411 08:33:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:59.411 08:33:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:59.411 08:33:11 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:59.411 08:33:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:59.411 08:33:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:59.411 08:33:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:59.411 08:33:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:59.411 08:33:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:59.411 08:33:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:59.411 08:33:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:59.680 08:33:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:59.681 "name": "raid_bdev1", 00:19:59.681 "uuid": "bfd819ef-047e-45d3-8f64-8d250bfe7dd6", 00:19:59.681 "strip_size_kb": 64, 00:19:59.681 "state": "online", 00:19:59.681 "raid_level": "concat", 00:19:59.681 "superblock": true, 00:19:59.681 "num_base_bdevs": 4, 00:19:59.681 "num_base_bdevs_discovered": 4, 00:19:59.681 "num_base_bdevs_operational": 4, 00:19:59.681 "base_bdevs_list": [ 00:19:59.681 { 00:19:59.681 "name": "BaseBdev1", 00:19:59.681 "uuid": "9c77225a-5b10-5c2b-ac94-71ff408d7cc4", 00:19:59.681 "is_configured": true, 00:19:59.681 "data_offset": 2048, 00:19:59.681 "data_size": 63488 00:19:59.681 }, 00:19:59.681 { 00:19:59.681 "name": "BaseBdev2", 00:19:59.681 "uuid": "b8e0b353-d07b-508f-9583-2c9fad664a97", 00:19:59.681 "is_configured": true, 00:19:59.681 "data_offset": 2048, 00:19:59.681 "data_size": 63488 00:19:59.681 }, 00:19:59.681 { 00:19:59.681 "name": "BaseBdev3", 00:19:59.681 "uuid": "3777dab9-9273-55ec-acdb-b2499487abe8", 00:19:59.681 "is_configured": true, 
00:19:59.681 "data_offset": 2048, 00:19:59.681 "data_size": 63488 00:19:59.681 }, 00:19:59.681 { 00:19:59.681 "name": "BaseBdev4", 00:19:59.681 "uuid": "488147ce-f009-5a56-84fd-df3a3554eb03", 00:19:59.681 "is_configured": true, 00:19:59.681 "data_offset": 2048, 00:19:59.681 "data_size": 63488 00:19:59.681 } 00:19:59.681 ] 00:19:59.681 }' 00:19:59.681 08:33:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:59.681 08:33:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:59.944 08:33:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:19:59.944 08:33:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:20:00.202 [2024-07-23 08:33:12.492437] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c130 00:20:01.192 08:33:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:20:01.192 08:33:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:20:01.192 08:33:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:20:01.192 08:33:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:20:01.192 08:33:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:01.192 08:33:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:01.192 08:33:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:01.192 08:33:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 
00:20:01.192 08:33:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:01.192 08:33:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:01.192 08:33:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:01.192 08:33:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:01.192 08:33:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:01.192 08:33:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:01.192 08:33:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:01.192 08:33:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:01.451 08:33:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:01.451 "name": "raid_bdev1", 00:20:01.451 "uuid": "bfd819ef-047e-45d3-8f64-8d250bfe7dd6", 00:20:01.451 "strip_size_kb": 64, 00:20:01.451 "state": "online", 00:20:01.451 "raid_level": "concat", 00:20:01.451 "superblock": true, 00:20:01.451 "num_base_bdevs": 4, 00:20:01.451 "num_base_bdevs_discovered": 4, 00:20:01.451 "num_base_bdevs_operational": 4, 00:20:01.451 "base_bdevs_list": [ 00:20:01.451 { 00:20:01.451 "name": "BaseBdev1", 00:20:01.451 "uuid": "9c77225a-5b10-5c2b-ac94-71ff408d7cc4", 00:20:01.451 "is_configured": true, 00:20:01.451 "data_offset": 2048, 00:20:01.451 "data_size": 63488 00:20:01.451 }, 00:20:01.451 { 00:20:01.451 "name": "BaseBdev2", 00:20:01.451 "uuid": "b8e0b353-d07b-508f-9583-2c9fad664a97", 00:20:01.451 "is_configured": true, 00:20:01.451 "data_offset": 2048, 00:20:01.451 "data_size": 63488 00:20:01.451 }, 00:20:01.451 { 00:20:01.451 "name": "BaseBdev3", 00:20:01.451 "uuid": 
"3777dab9-9273-55ec-acdb-b2499487abe8", 00:20:01.451 "is_configured": true, 00:20:01.451 "data_offset": 2048, 00:20:01.451 "data_size": 63488 00:20:01.451 }, 00:20:01.451 { 00:20:01.451 "name": "BaseBdev4", 00:20:01.451 "uuid": "488147ce-f009-5a56-84fd-df3a3554eb03", 00:20:01.451 "is_configured": true, 00:20:01.451 "data_offset": 2048, 00:20:01.451 "data_size": 63488 00:20:01.451 } 00:20:01.451 ] 00:20:01.451 }' 00:20:01.451 08:33:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:01.451 08:33:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:01.709 08:33:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:01.968 [2024-07-23 08:33:14.376984] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:01.968 [2024-07-23 08:33:14.377016] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:01.968 [2024-07-23 08:33:14.379441] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:01.968 [2024-07-23 08:33:14.379489] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:01.968 [2024-07-23 08:33:14.379532] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:01.968 [2024-07-23 08:33:14.379546] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000037e80 name raid_bdev1, state offline 00:20:01.968 0 00:20:01.968 08:33:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1501486 00:20:01.968 08:33:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1501486 ']' 00:20:01.968 08:33:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1501486 00:20:01.968 08:33:14 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@953 -- # uname 00:20:01.968 08:33:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:01.968 08:33:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1501486 00:20:01.968 08:33:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:01.968 08:33:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:01.968 08:33:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1501486' 00:20:01.968 killing process with pid 1501486 00:20:01.968 08:33:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1501486 00:20:01.968 [2024-07-23 08:33:14.437686] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:01.968 08:33:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1501486 00:20:02.227 [2024-07-23 08:33:14.706596] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:03.603 08:33:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:20:03.603 08:33:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.GUDBCpFfp3 00:20:03.603 08:33:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:20:03.603 08:33:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.53 00:20:03.603 08:33:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:20:03.603 08:33:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:03.603 08:33:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:03.603 08:33:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.53 != \0\.\0\0 ]] 00:20:03.603 00:20:03.603 real 0m7.386s 00:20:03.603 user 0m10.578s 00:20:03.603 
sys 0m0.976s 00:20:03.603 08:33:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:03.603 08:33:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:03.603 ************************************ 00:20:03.603 END TEST raid_read_error_test 00:20:03.603 ************************************ 00:20:03.603 08:33:16 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:03.603 08:33:16 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 4 write 00:20:03.603 08:33:16 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:03.603 08:33:16 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:03.603 08:33:16 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:03.603 ************************************ 00:20:03.603 START TEST raid_write_error_test 00:20:03.603 ************************************ 00:20:03.603 08:33:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 4 write 00:20:03.603 08:33:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:20:03.603 08:33:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:20:03.603 08:33:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:20:03.603 08:33:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:20:03.603 08:33:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:03.603 08:33:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:20:03.603 08:33:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:03.603 08:33:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:03.603 08:33:16 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:20:03.603 08:33:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:03.604 08:33:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:03.604 08:33:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:20:03.604 08:33:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:03.604 08:33:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:03.604 08:33:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:20:03.604 08:33:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:03.604 08:33:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:03.604 08:33:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:03.604 08:33:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:20:03.604 08:33:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:20:03.604 08:33:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:20:03.604 08:33:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:20:03.604 08:33:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:20:03.604 08:33:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:20:03.604 08:33:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:20:03.604 08:33:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:20:03.604 08:33:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:20:03.604 08:33:16 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:20:03.604 08:33:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.D4l3gN8Vxi 00:20:03.604 08:33:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1502930 00:20:03.604 08:33:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1502930 /var/tmp/spdk-raid.sock 00:20:03.604 08:33:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:20:03.604 08:33:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1502930 ']' 00:20:03.604 08:33:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:03.604 08:33:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:03.604 08:33:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:03.604 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:03.604 08:33:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:03.604 08:33:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:03.863 [2024-07-23 08:33:16.168803] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:20:03.863 [2024-07-23 08:33:16.168892] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1502930 ] 00:20:03.863 [2024-07-23 08:33:16.292344] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:04.122 [2024-07-23 08:33:16.506203] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:04.380 [2024-07-23 08:33:16.800527] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:04.380 [2024-07-23 08:33:16.800557] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:04.639 08:33:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:04.639 08:33:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:20:04.639 08:33:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:04.639 08:33:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:04.639 BaseBdev1_malloc 00:20:04.898 08:33:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:20:04.898 true 00:20:04.898 08:33:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:20:05.158 [2024-07-23 08:33:17.497025] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:20:05.158 [2024-07-23 08:33:17.497077] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:20:05.158 [2024-07-23 08:33:17.497097] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034b80 00:20:05.158 [2024-07-23 08:33:17.497108] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:05.158 [2024-07-23 08:33:17.499068] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:05.158 [2024-07-23 08:33:17.499099] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:05.158 BaseBdev1 00:20:05.158 08:33:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:05.158 08:33:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:05.417 BaseBdev2_malloc 00:20:05.417 08:33:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:20:05.417 true 00:20:05.417 08:33:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:20:05.676 [2024-07-23 08:33:18.041919] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:20:05.676 [2024-07-23 08:33:18.041974] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:05.676 [2024-07-23 08:33:18.041994] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035a80 00:20:05.676 [2024-07-23 08:33:18.042008] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:05.676 [2024-07-23 08:33:18.044024] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:05.676 [2024-07-23 
08:33:18.044055] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:05.676 BaseBdev2 00:20:05.676 08:33:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:05.676 08:33:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:05.935 BaseBdev3_malloc 00:20:05.935 08:33:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:20:05.935 true 00:20:05.935 08:33:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:20:06.193 [2024-07-23 08:33:18.593572] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:20:06.193 [2024-07-23 08:33:18.593651] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:06.193 [2024-07-23 08:33:18.593674] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036980 00:20:06.193 [2024-07-23 08:33:18.593685] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:06.193 [2024-07-23 08:33:18.595683] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:06.193 [2024-07-23 08:33:18.595710] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:06.193 BaseBdev3 00:20:06.193 08:33:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:06.193 08:33:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:20:06.451 BaseBdev4_malloc 00:20:06.451 08:33:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:20:06.710 true 00:20:06.710 08:33:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:20:06.710 [2024-07-23 08:33:19.146879] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:20:06.710 [2024-07-23 08:33:19.146932] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:06.710 [2024-07-23 08:33:19.146970] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000037880 00:20:06.710 [2024-07-23 08:33:19.146981] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:06.710 [2024-07-23 08:33:19.148896] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:06.710 [2024-07-23 08:33:19.148925] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:20:06.710 BaseBdev4 00:20:06.711 08:33:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:20:06.970 [2024-07-23 08:33:19.311352] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:06.970 [2024-07-23 08:33:19.312960] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:06.970 [2024-07-23 08:33:19.313035] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:06.970 [2024-07-23 
08:33:19.313101] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:06.970 [2024-07-23 08:33:19.313331] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000037e80 00:20:06.970 [2024-07-23 08:33:19.313344] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:06.970 [2024-07-23 08:33:19.313590] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bf90 00:20:06.970 [2024-07-23 08:33:19.313796] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000037e80 00:20:06.970 [2024-07-23 08:33:19.313807] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000037e80 00:20:06.970 [2024-07-23 08:33:19.313968] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:06.970 08:33:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:06.970 08:33:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:06.970 08:33:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:06.970 08:33:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:06.970 08:33:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:06.970 08:33:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:06.970 08:33:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:06.970 08:33:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:06.970 08:33:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:06.970 08:33:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:20:06.970 08:33:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:06.970 08:33:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:07.229 08:33:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:07.229 "name": "raid_bdev1", 00:20:07.229 "uuid": "52df605c-ae9b-4cc3-8e79-d3dc3d2c30c4", 00:20:07.229 "strip_size_kb": 64, 00:20:07.229 "state": "online", 00:20:07.229 "raid_level": "concat", 00:20:07.229 "superblock": true, 00:20:07.229 "num_base_bdevs": 4, 00:20:07.229 "num_base_bdevs_discovered": 4, 00:20:07.229 "num_base_bdevs_operational": 4, 00:20:07.229 "base_bdevs_list": [ 00:20:07.229 { 00:20:07.229 "name": "BaseBdev1", 00:20:07.229 "uuid": "80a4a171-ec77-51d6-bb8a-d9c91565822f", 00:20:07.229 "is_configured": true, 00:20:07.229 "data_offset": 2048, 00:20:07.229 "data_size": 63488 00:20:07.229 }, 00:20:07.229 { 00:20:07.229 "name": "BaseBdev2", 00:20:07.229 "uuid": "e45651ce-971d-53f7-bbc2-d2c7aed203b6", 00:20:07.229 "is_configured": true, 00:20:07.229 "data_offset": 2048, 00:20:07.229 "data_size": 63488 00:20:07.229 }, 00:20:07.229 { 00:20:07.229 "name": "BaseBdev3", 00:20:07.229 "uuid": "83c26f30-de15-52ff-837f-420d72b4af6c", 00:20:07.229 "is_configured": true, 00:20:07.229 "data_offset": 2048, 00:20:07.229 "data_size": 63488 00:20:07.229 }, 00:20:07.229 { 00:20:07.229 "name": "BaseBdev4", 00:20:07.229 "uuid": "d5993f73-6524-5446-bdb5-84db35a53acc", 00:20:07.229 "is_configured": true, 00:20:07.229 "data_offset": 2048, 00:20:07.229 "data_size": 63488 00:20:07.229 } 00:20:07.229 ] 00:20:07.229 }' 00:20:07.229 08:33:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:07.229 08:33:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:07.487 08:33:19 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:20:07.487 08:33:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:07.746 [2024-07-23 08:33:20.062760] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c130 00:20:08.684 08:33:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:20:08.684 08:33:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:20:08.684 08:33:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:20:08.684 08:33:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:20:08.684 08:33:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:08.684 08:33:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:08.684 08:33:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:08.684 08:33:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:08.684 08:33:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:08.684 08:33:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:08.684 08:33:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:08.684 08:33:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:08.684 08:33:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:20:08.684 08:33:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:08.684 08:33:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:08.684 08:33:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:08.943 08:33:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:08.943 "name": "raid_bdev1", 00:20:08.943 "uuid": "52df605c-ae9b-4cc3-8e79-d3dc3d2c30c4", 00:20:08.943 "strip_size_kb": 64, 00:20:08.943 "state": "online", 00:20:08.943 "raid_level": "concat", 00:20:08.943 "superblock": true, 00:20:08.943 "num_base_bdevs": 4, 00:20:08.943 "num_base_bdevs_discovered": 4, 00:20:08.943 "num_base_bdevs_operational": 4, 00:20:08.943 "base_bdevs_list": [ 00:20:08.943 { 00:20:08.943 "name": "BaseBdev1", 00:20:08.943 "uuid": "80a4a171-ec77-51d6-bb8a-d9c91565822f", 00:20:08.943 "is_configured": true, 00:20:08.943 "data_offset": 2048, 00:20:08.943 "data_size": 63488 00:20:08.943 }, 00:20:08.943 { 00:20:08.943 "name": "BaseBdev2", 00:20:08.943 "uuid": "e45651ce-971d-53f7-bbc2-d2c7aed203b6", 00:20:08.943 "is_configured": true, 00:20:08.943 "data_offset": 2048, 00:20:08.943 "data_size": 63488 00:20:08.943 }, 00:20:08.943 { 00:20:08.943 "name": "BaseBdev3", 00:20:08.943 "uuid": "83c26f30-de15-52ff-837f-420d72b4af6c", 00:20:08.943 "is_configured": true, 00:20:08.943 "data_offset": 2048, 00:20:08.943 "data_size": 63488 00:20:08.943 }, 00:20:08.943 { 00:20:08.943 "name": "BaseBdev4", 00:20:08.943 "uuid": "d5993f73-6524-5446-bdb5-84db35a53acc", 00:20:08.943 "is_configured": true, 00:20:08.943 "data_offset": 2048, 00:20:08.943 "data_size": 63488 00:20:08.943 } 00:20:08.943 ] 00:20:08.943 }' 00:20:08.943 08:33:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:08.943 08:33:21 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:09.511 08:33:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:09.511 [2024-07-23 08:33:21.983022] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:09.511 [2024-07-23 08:33:21.983056] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:09.511 [2024-07-23 08:33:21.985470] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:09.511 [2024-07-23 08:33:21.985521] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:09.511 [2024-07-23 08:33:21.985565] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:09.511 [2024-07-23 08:33:21.985579] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000037e80 name raid_bdev1, state offline 00:20:09.511 0 00:20:09.511 08:33:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1502930 00:20:09.511 08:33:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1502930 ']' 00:20:09.511 08:33:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1502930 00:20:09.511 08:33:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:20:09.511 08:33:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:09.511 08:33:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1502930 00:20:09.770 08:33:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:09.770 08:33:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:09.770 08:33:22 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1502930' 00:20:09.770 killing process with pid 1502930 00:20:09.770 08:33:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1502930 00:20:09.770 [2024-07-23 08:33:22.046987] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:09.770 08:33:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1502930 00:20:10.029 [2024-07-23 08:33:22.331514] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:11.407 08:33:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:20:11.407 08:33:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.D4l3gN8Vxi 00:20:11.407 08:33:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:20:11.407 08:33:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:20:11.407 08:33:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:20:11.407 08:33:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:11.407 08:33:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:11.407 08:33:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:20:11.407 00:20:11.407 real 0m7.604s 00:20:11.407 user 0m10.915s 00:20:11.407 sys 0m1.043s 00:20:11.407 08:33:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:11.407 08:33:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:11.407 ************************************ 00:20:11.407 END TEST raid_write_error_test 00:20:11.407 ************************************ 00:20:11.407 08:33:23 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:11.407 08:33:23 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in 
raid0 concat raid1 00:20:11.407 08:33:23 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 4 false 00:20:11.407 08:33:23 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:11.407 08:33:23 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:11.407 08:33:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:11.407 ************************************ 00:20:11.407 START TEST raid_state_function_test 00:20:11.407 ************************************ 00:20:11.407 08:33:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 false 00:20:11.407 08:33:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:20:11.407 08:33:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:20:11.407 08:33:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:20:11.407 08:33:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:20:11.407 08:33:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:20:11.407 08:33:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:11.407 08:33:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:20:11.407 08:33:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:11.407 08:33:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:11.407 08:33:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:20:11.407 08:33:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:11.407 08:33:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:11.407 08:33:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:20:11.407 08:33:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:11.407 08:33:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:11.407 08:33:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:20:11.407 08:33:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:11.407 08:33:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:11.407 08:33:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:11.407 08:33:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:20:11.407 08:33:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:20:11.407 08:33:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:20:11.407 08:33:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:20:11.407 08:33:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:20:11.407 08:33:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:20:11.407 08:33:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:20:11.407 08:33:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:20:11.407 08:33:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:20:11.407 08:33:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1504427 00:20:11.407 08:33:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1504427' 00:20:11.407 Process raid pid: 
1504427 00:20:11.408 08:33:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:20:11.408 08:33:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1504427 /var/tmp/spdk-raid.sock 00:20:11.408 08:33:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1504427 ']' 00:20:11.408 08:33:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:11.408 08:33:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:11.408 08:33:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:11.408 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:11.408 08:33:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:11.408 08:33:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:11.408 [2024-07-23 08:33:23.835921] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:20:11.408 [2024-07-23 08:33:23.836007] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:20:11.667 [2024-07-23 08:33:23.961960] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:11.667 [2024-07-23 08:33:24.167797] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:11.926 [2024-07-23 08:33:24.421500] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:11.926 [2024-07-23 08:33:24.421531] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:12.185 08:33:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:12.185 08:33:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:20:12.185 08:33:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:12.444 [2024-07-23 08:33:24.755385] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:12.444 [2024-07-23 08:33:24.755429] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:12.444 [2024-07-23 08:33:24.755439] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:12.444 [2024-07-23 08:33:24.755453] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:12.444 [2024-07-23 08:33:24.755460] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:12.444 [2024-07-23 08:33:24.755469] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:12.444 
[2024-07-23 08:33:24.755475] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:12.444 [2024-07-23 08:33:24.755484] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:12.444 08:33:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:12.444 08:33:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:12.444 08:33:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:12.444 08:33:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:12.444 08:33:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:12.444 08:33:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:12.444 08:33:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:12.444 08:33:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:12.444 08:33:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:12.444 08:33:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:12.444 08:33:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:12.444 08:33:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:12.444 08:33:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:12.444 "name": "Existed_Raid", 00:20:12.444 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:12.444 "strip_size_kb": 0, 00:20:12.444 "state": 
"configuring", 00:20:12.444 "raid_level": "raid1", 00:20:12.444 "superblock": false, 00:20:12.444 "num_base_bdevs": 4, 00:20:12.444 "num_base_bdevs_discovered": 0, 00:20:12.444 "num_base_bdevs_operational": 4, 00:20:12.444 "base_bdevs_list": [ 00:20:12.444 { 00:20:12.444 "name": "BaseBdev1", 00:20:12.444 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:12.444 "is_configured": false, 00:20:12.444 "data_offset": 0, 00:20:12.444 "data_size": 0 00:20:12.444 }, 00:20:12.444 { 00:20:12.444 "name": "BaseBdev2", 00:20:12.444 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:12.444 "is_configured": false, 00:20:12.444 "data_offset": 0, 00:20:12.444 "data_size": 0 00:20:12.444 }, 00:20:12.444 { 00:20:12.444 "name": "BaseBdev3", 00:20:12.444 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:12.444 "is_configured": false, 00:20:12.444 "data_offset": 0, 00:20:12.444 "data_size": 0 00:20:12.444 }, 00:20:12.444 { 00:20:12.444 "name": "BaseBdev4", 00:20:12.444 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:12.444 "is_configured": false, 00:20:12.444 "data_offset": 0, 00:20:12.444 "data_size": 0 00:20:12.444 } 00:20:12.444 ] 00:20:12.444 }' 00:20:12.444 08:33:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:12.444 08:33:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:13.011 08:33:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:13.269 [2024-07-23 08:33:25.565405] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:13.269 [2024-07-23 08:33:25.565440] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034580 name Existed_Raid, state configuring 00:20:13.269 08:33:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:13.269 [2024-07-23 08:33:25.741883] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:13.269 [2024-07-23 08:33:25.741922] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:13.269 [2024-07-23 08:33:25.741933] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:13.269 [2024-07-23 08:33:25.741958] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:13.269 [2024-07-23 08:33:25.741965] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:13.269 [2024-07-23 08:33:25.741974] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:13.269 [2024-07-23 08:33:25.741981] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:13.269 [2024-07-23 08:33:25.741990] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:13.269 08:33:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:13.527 [2024-07-23 08:33:25.947903] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:13.527 BaseBdev1 00:20:13.527 08:33:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:20:13.527 08:33:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:20:13.527 08:33:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:13.527 08:33:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:13.527 08:33:25 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:13.527 08:33:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:13.527 08:33:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:13.785 08:33:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:14.044 [ 00:20:14.044 { 00:20:14.044 "name": "BaseBdev1", 00:20:14.044 "aliases": [ 00:20:14.044 "dc1bf698-d1b6-4fae-828b-f518e3859f34" 00:20:14.044 ], 00:20:14.044 "product_name": "Malloc disk", 00:20:14.044 "block_size": 512, 00:20:14.044 "num_blocks": 65536, 00:20:14.044 "uuid": "dc1bf698-d1b6-4fae-828b-f518e3859f34", 00:20:14.044 "assigned_rate_limits": { 00:20:14.044 "rw_ios_per_sec": 0, 00:20:14.044 "rw_mbytes_per_sec": 0, 00:20:14.044 "r_mbytes_per_sec": 0, 00:20:14.044 "w_mbytes_per_sec": 0 00:20:14.044 }, 00:20:14.044 "claimed": true, 00:20:14.044 "claim_type": "exclusive_write", 00:20:14.044 "zoned": false, 00:20:14.044 "supported_io_types": { 00:20:14.044 "read": true, 00:20:14.044 "write": true, 00:20:14.044 "unmap": true, 00:20:14.044 "flush": true, 00:20:14.044 "reset": true, 00:20:14.044 "nvme_admin": false, 00:20:14.044 "nvme_io": false, 00:20:14.044 "nvme_io_md": false, 00:20:14.044 "write_zeroes": true, 00:20:14.044 "zcopy": true, 00:20:14.044 "get_zone_info": false, 00:20:14.044 "zone_management": false, 00:20:14.044 "zone_append": false, 00:20:14.044 "compare": false, 00:20:14.044 "compare_and_write": false, 00:20:14.044 "abort": true, 00:20:14.044 "seek_hole": false, 00:20:14.044 "seek_data": false, 00:20:14.044 "copy": true, 00:20:14.044 "nvme_iov_md": false 00:20:14.044 }, 00:20:14.044 "memory_domains": [ 00:20:14.044 { 
00:20:14.044 "dma_device_id": "system", 00:20:14.044 "dma_device_type": 1 00:20:14.044 }, 00:20:14.044 { 00:20:14.044 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:14.044 "dma_device_type": 2 00:20:14.044 } 00:20:14.044 ], 00:20:14.044 "driver_specific": {} 00:20:14.044 } 00:20:14.044 ] 00:20:14.044 08:33:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:14.044 08:33:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:14.044 08:33:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:14.044 08:33:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:14.044 08:33:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:14.044 08:33:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:14.044 08:33:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:14.044 08:33:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:14.044 08:33:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:14.044 08:33:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:14.044 08:33:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:14.044 08:33:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:14.044 08:33:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:14.044 08:33:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:20:14.044 "name": "Existed_Raid", 00:20:14.044 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:14.044 "strip_size_kb": 0, 00:20:14.044 "state": "configuring", 00:20:14.044 "raid_level": "raid1", 00:20:14.044 "superblock": false, 00:20:14.044 "num_base_bdevs": 4, 00:20:14.044 "num_base_bdevs_discovered": 1, 00:20:14.044 "num_base_bdevs_operational": 4, 00:20:14.044 "base_bdevs_list": [ 00:20:14.044 { 00:20:14.044 "name": "BaseBdev1", 00:20:14.044 "uuid": "dc1bf698-d1b6-4fae-828b-f518e3859f34", 00:20:14.044 "is_configured": true, 00:20:14.044 "data_offset": 0, 00:20:14.044 "data_size": 65536 00:20:14.044 }, 00:20:14.044 { 00:20:14.044 "name": "BaseBdev2", 00:20:14.044 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:14.044 "is_configured": false, 00:20:14.044 "data_offset": 0, 00:20:14.044 "data_size": 0 00:20:14.044 }, 00:20:14.044 { 00:20:14.044 "name": "BaseBdev3", 00:20:14.044 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:14.044 "is_configured": false, 00:20:14.044 "data_offset": 0, 00:20:14.044 "data_size": 0 00:20:14.044 }, 00:20:14.044 { 00:20:14.044 "name": "BaseBdev4", 00:20:14.044 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:14.044 "is_configured": false, 00:20:14.044 "data_offset": 0, 00:20:14.044 "data_size": 0 00:20:14.044 } 00:20:14.044 ] 00:20:14.044 }' 00:20:14.044 08:33:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:14.044 08:33:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:14.682 08:33:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:14.682 [2024-07-23 08:33:27.151128] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:14.682 [2024-07-23 08:33:27.151178] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034880 name Existed_Raid, state 
configuring 00:20:14.682 08:33:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:14.941 [2024-07-23 08:33:27.319590] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:14.941 [2024-07-23 08:33:27.321168] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:14.941 [2024-07-23 08:33:27.321202] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:14.941 [2024-07-23 08:33:27.321211] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:14.941 [2024-07-23 08:33:27.321221] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:14.941 [2024-07-23 08:33:27.321228] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:14.941 [2024-07-23 08:33:27.321239] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:14.941 08:33:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:20:14.941 08:33:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:14.941 08:33:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:14.941 08:33:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:14.941 08:33:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:14.941 08:33:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:14.941 08:33:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:20:14.941 08:33:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:14.941 08:33:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:14.941 08:33:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:14.941 08:33:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:14.941 08:33:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:14.941 08:33:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:14.941 08:33:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:15.199 08:33:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:15.199 "name": "Existed_Raid", 00:20:15.199 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:15.199 "strip_size_kb": 0, 00:20:15.199 "state": "configuring", 00:20:15.199 "raid_level": "raid1", 00:20:15.199 "superblock": false, 00:20:15.199 "num_base_bdevs": 4, 00:20:15.199 "num_base_bdevs_discovered": 1, 00:20:15.199 "num_base_bdevs_operational": 4, 00:20:15.199 "base_bdevs_list": [ 00:20:15.199 { 00:20:15.199 "name": "BaseBdev1", 00:20:15.199 "uuid": "dc1bf698-d1b6-4fae-828b-f518e3859f34", 00:20:15.199 "is_configured": true, 00:20:15.199 "data_offset": 0, 00:20:15.199 "data_size": 65536 00:20:15.199 }, 00:20:15.199 { 00:20:15.199 "name": "BaseBdev2", 00:20:15.199 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:15.199 "is_configured": false, 00:20:15.199 "data_offset": 0, 00:20:15.199 "data_size": 0 00:20:15.199 }, 00:20:15.199 { 00:20:15.200 "name": "BaseBdev3", 00:20:15.200 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:15.200 "is_configured": false, 00:20:15.200 
"data_offset": 0, 00:20:15.200 "data_size": 0 00:20:15.200 }, 00:20:15.200 { 00:20:15.200 "name": "BaseBdev4", 00:20:15.200 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:15.200 "is_configured": false, 00:20:15.200 "data_offset": 0, 00:20:15.200 "data_size": 0 00:20:15.200 } 00:20:15.200 ] 00:20:15.200 }' 00:20:15.200 08:33:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:15.200 08:33:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:15.765 08:33:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:15.765 [2024-07-23 08:33:28.202372] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:15.765 BaseBdev2 00:20:15.765 08:33:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:20:15.765 08:33:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:15.765 08:33:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:15.765 08:33:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:15.765 08:33:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:15.765 08:33:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:15.765 08:33:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:16.023 08:33:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:16.023 [ 
00:20:16.023 { 00:20:16.023 "name": "BaseBdev2", 00:20:16.023 "aliases": [ 00:20:16.023 "6059bd82-adac-43f1-8450-346d157d42db" 00:20:16.023 ], 00:20:16.023 "product_name": "Malloc disk", 00:20:16.023 "block_size": 512, 00:20:16.023 "num_blocks": 65536, 00:20:16.023 "uuid": "6059bd82-adac-43f1-8450-346d157d42db", 00:20:16.023 "assigned_rate_limits": { 00:20:16.023 "rw_ios_per_sec": 0, 00:20:16.023 "rw_mbytes_per_sec": 0, 00:20:16.023 "r_mbytes_per_sec": 0, 00:20:16.023 "w_mbytes_per_sec": 0 00:20:16.023 }, 00:20:16.023 "claimed": true, 00:20:16.023 "claim_type": "exclusive_write", 00:20:16.023 "zoned": false, 00:20:16.023 "supported_io_types": { 00:20:16.023 "read": true, 00:20:16.023 "write": true, 00:20:16.023 "unmap": true, 00:20:16.023 "flush": true, 00:20:16.023 "reset": true, 00:20:16.023 "nvme_admin": false, 00:20:16.023 "nvme_io": false, 00:20:16.023 "nvme_io_md": false, 00:20:16.023 "write_zeroes": true, 00:20:16.023 "zcopy": true, 00:20:16.023 "get_zone_info": false, 00:20:16.023 "zone_management": false, 00:20:16.023 "zone_append": false, 00:20:16.023 "compare": false, 00:20:16.023 "compare_and_write": false, 00:20:16.023 "abort": true, 00:20:16.023 "seek_hole": false, 00:20:16.023 "seek_data": false, 00:20:16.023 "copy": true, 00:20:16.023 "nvme_iov_md": false 00:20:16.023 }, 00:20:16.023 "memory_domains": [ 00:20:16.023 { 00:20:16.023 "dma_device_id": "system", 00:20:16.023 "dma_device_type": 1 00:20:16.023 }, 00:20:16.023 { 00:20:16.023 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:16.023 "dma_device_type": 2 00:20:16.023 } 00:20:16.023 ], 00:20:16.023 "driver_specific": {} 00:20:16.023 } 00:20:16.023 ] 00:20:16.023 08:33:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:16.281 08:33:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:16.281 08:33:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:16.281 08:33:28 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:16.281 08:33:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:16.281 08:33:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:16.281 08:33:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:16.281 08:33:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:16.281 08:33:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:16.281 08:33:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:16.281 08:33:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:16.281 08:33:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:16.281 08:33:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:16.281 08:33:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:16.281 08:33:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:16.281 08:33:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:16.281 "name": "Existed_Raid", 00:20:16.282 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:16.282 "strip_size_kb": 0, 00:20:16.282 "state": "configuring", 00:20:16.282 "raid_level": "raid1", 00:20:16.282 "superblock": false, 00:20:16.282 "num_base_bdevs": 4, 00:20:16.282 "num_base_bdevs_discovered": 2, 00:20:16.282 "num_base_bdevs_operational": 4, 00:20:16.282 "base_bdevs_list": [ 00:20:16.282 { 00:20:16.282 
"name": "BaseBdev1", 00:20:16.282 "uuid": "dc1bf698-d1b6-4fae-828b-f518e3859f34", 00:20:16.282 "is_configured": true, 00:20:16.282 "data_offset": 0, 00:20:16.282 "data_size": 65536 00:20:16.282 }, 00:20:16.282 { 00:20:16.282 "name": "BaseBdev2", 00:20:16.282 "uuid": "6059bd82-adac-43f1-8450-346d157d42db", 00:20:16.282 "is_configured": true, 00:20:16.282 "data_offset": 0, 00:20:16.282 "data_size": 65536 00:20:16.282 }, 00:20:16.282 { 00:20:16.282 "name": "BaseBdev3", 00:20:16.282 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:16.282 "is_configured": false, 00:20:16.282 "data_offset": 0, 00:20:16.282 "data_size": 0 00:20:16.282 }, 00:20:16.282 { 00:20:16.282 "name": "BaseBdev4", 00:20:16.282 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:16.282 "is_configured": false, 00:20:16.282 "data_offset": 0, 00:20:16.282 "data_size": 0 00:20:16.282 } 00:20:16.282 ] 00:20:16.282 }' 00:20:16.282 08:33:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:16.282 08:33:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:16.848 08:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:17.105 [2024-07-23 08:33:29.386405] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:17.105 BaseBdev3 00:20:17.105 08:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:20:17.105 08:33:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:17.105 08:33:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:17.105 08:33:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:17.105 08:33:29 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:17.105 08:33:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:17.105 08:33:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:17.105 08:33:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:17.363 [ 00:20:17.363 { 00:20:17.363 "name": "BaseBdev3", 00:20:17.363 "aliases": [ 00:20:17.363 "d281fd1a-79c3-4f73-848f-04190f249d24" 00:20:17.363 ], 00:20:17.363 "product_name": "Malloc disk", 00:20:17.363 "block_size": 512, 00:20:17.363 "num_blocks": 65536, 00:20:17.363 "uuid": "d281fd1a-79c3-4f73-848f-04190f249d24", 00:20:17.363 "assigned_rate_limits": { 00:20:17.363 "rw_ios_per_sec": 0, 00:20:17.363 "rw_mbytes_per_sec": 0, 00:20:17.363 "r_mbytes_per_sec": 0, 00:20:17.363 "w_mbytes_per_sec": 0 00:20:17.363 }, 00:20:17.363 "claimed": true, 00:20:17.363 "claim_type": "exclusive_write", 00:20:17.363 "zoned": false, 00:20:17.363 "supported_io_types": { 00:20:17.363 "read": true, 00:20:17.363 "write": true, 00:20:17.363 "unmap": true, 00:20:17.363 "flush": true, 00:20:17.363 "reset": true, 00:20:17.363 "nvme_admin": false, 00:20:17.363 "nvme_io": false, 00:20:17.363 "nvme_io_md": false, 00:20:17.363 "write_zeroes": true, 00:20:17.363 "zcopy": true, 00:20:17.363 "get_zone_info": false, 00:20:17.363 "zone_management": false, 00:20:17.363 "zone_append": false, 00:20:17.363 "compare": false, 00:20:17.363 "compare_and_write": false, 00:20:17.363 "abort": true, 00:20:17.363 "seek_hole": false, 00:20:17.363 "seek_data": false, 00:20:17.363 "copy": true, 00:20:17.363 "nvme_iov_md": false 00:20:17.363 }, 00:20:17.363 "memory_domains": [ 00:20:17.363 { 00:20:17.363 "dma_device_id": "system", 
00:20:17.363 "dma_device_type": 1 00:20:17.363 }, 00:20:17.363 { 00:20:17.363 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:17.363 "dma_device_type": 2 00:20:17.363 } 00:20:17.363 ], 00:20:17.363 "driver_specific": {} 00:20:17.363 } 00:20:17.363 ] 00:20:17.363 08:33:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:17.363 08:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:17.363 08:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:17.363 08:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:17.363 08:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:17.363 08:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:17.363 08:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:17.364 08:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:17.364 08:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:17.364 08:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:17.364 08:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:17.364 08:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:17.364 08:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:17.364 08:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:17.364 08:33:29 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:17.622 08:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:17.622 "name": "Existed_Raid", 00:20:17.622 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:17.622 "strip_size_kb": 0, 00:20:17.622 "state": "configuring", 00:20:17.622 "raid_level": "raid1", 00:20:17.622 "superblock": false, 00:20:17.622 "num_base_bdevs": 4, 00:20:17.622 "num_base_bdevs_discovered": 3, 00:20:17.622 "num_base_bdevs_operational": 4, 00:20:17.622 "base_bdevs_list": [ 00:20:17.622 { 00:20:17.622 "name": "BaseBdev1", 00:20:17.622 "uuid": "dc1bf698-d1b6-4fae-828b-f518e3859f34", 00:20:17.622 "is_configured": true, 00:20:17.622 "data_offset": 0, 00:20:17.622 "data_size": 65536 00:20:17.622 }, 00:20:17.622 { 00:20:17.622 "name": "BaseBdev2", 00:20:17.622 "uuid": "6059bd82-adac-43f1-8450-346d157d42db", 00:20:17.622 "is_configured": true, 00:20:17.622 "data_offset": 0, 00:20:17.622 "data_size": 65536 00:20:17.622 }, 00:20:17.622 { 00:20:17.622 "name": "BaseBdev3", 00:20:17.622 "uuid": "d281fd1a-79c3-4f73-848f-04190f249d24", 00:20:17.622 "is_configured": true, 00:20:17.622 "data_offset": 0, 00:20:17.622 "data_size": 65536 00:20:17.622 }, 00:20:17.622 { 00:20:17.622 "name": "BaseBdev4", 00:20:17.622 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:17.622 "is_configured": false, 00:20:17.622 "data_offset": 0, 00:20:17.622 "data_size": 0 00:20:17.622 } 00:20:17.622 ] 00:20:17.622 }' 00:20:17.622 08:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:17.622 08:33:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:17.880 08:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:18.138 [2024-07-23 08:33:30.560813] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:18.138 [2024-07-23 08:33:30.560860] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000035180 00:20:18.138 [2024-07-23 08:33:30.560871] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:20:18.138 [2024-07-23 08:33:30.561141] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bf90 00:20:18.138 [2024-07-23 08:33:30.561363] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000035180 00:20:18.138 [2024-07-23 08:33:30.561377] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000035180 00:20:18.138 [2024-07-23 08:33:30.561677] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:18.138 BaseBdev4 00:20:18.138 08:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:20:18.138 08:33:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:20:18.138 08:33:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:18.138 08:33:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:18.138 08:33:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:18.138 08:33:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:18.138 08:33:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:18.397 08:33:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:18.655 [ 
00:20:18.655 { 00:20:18.655 "name": "BaseBdev4", 00:20:18.655 "aliases": [ 00:20:18.655 "43a0e686-8006-4255-bb41-14549afc9175" 00:20:18.655 ], 00:20:18.655 "product_name": "Malloc disk", 00:20:18.655 "block_size": 512, 00:20:18.655 "num_blocks": 65536, 00:20:18.655 "uuid": "43a0e686-8006-4255-bb41-14549afc9175", 00:20:18.655 "assigned_rate_limits": { 00:20:18.655 "rw_ios_per_sec": 0, 00:20:18.655 "rw_mbytes_per_sec": 0, 00:20:18.655 "r_mbytes_per_sec": 0, 00:20:18.655 "w_mbytes_per_sec": 0 00:20:18.655 }, 00:20:18.655 "claimed": true, 00:20:18.655 "claim_type": "exclusive_write", 00:20:18.655 "zoned": false, 00:20:18.655 "supported_io_types": { 00:20:18.655 "read": true, 00:20:18.655 "write": true, 00:20:18.655 "unmap": true, 00:20:18.655 "flush": true, 00:20:18.655 "reset": true, 00:20:18.655 "nvme_admin": false, 00:20:18.655 "nvme_io": false, 00:20:18.655 "nvme_io_md": false, 00:20:18.655 "write_zeroes": true, 00:20:18.655 "zcopy": true, 00:20:18.655 "get_zone_info": false, 00:20:18.655 "zone_management": false, 00:20:18.655 "zone_append": false, 00:20:18.655 "compare": false, 00:20:18.655 "compare_and_write": false, 00:20:18.655 "abort": true, 00:20:18.655 "seek_hole": false, 00:20:18.655 "seek_data": false, 00:20:18.655 "copy": true, 00:20:18.655 "nvme_iov_md": false 00:20:18.655 }, 00:20:18.655 "memory_domains": [ 00:20:18.655 { 00:20:18.655 "dma_device_id": "system", 00:20:18.655 "dma_device_type": 1 00:20:18.655 }, 00:20:18.655 { 00:20:18.655 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:18.655 "dma_device_type": 2 00:20:18.655 } 00:20:18.655 ], 00:20:18.655 "driver_specific": {} 00:20:18.655 } 00:20:18.655 ] 00:20:18.655 08:33:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:18.655 08:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:18.655 08:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:18.655 08:33:30 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:20:18.655 08:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:18.655 08:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:18.655 08:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:18.655 08:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:18.655 08:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:18.655 08:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:18.655 08:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:18.655 08:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:18.655 08:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:18.655 08:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:18.655 08:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:18.655 08:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:18.655 "name": "Existed_Raid", 00:20:18.655 "uuid": "4dfea6b0-3b30-49c5-890f-6cd1dbe27257", 00:20:18.655 "strip_size_kb": 0, 00:20:18.655 "state": "online", 00:20:18.655 "raid_level": "raid1", 00:20:18.655 "superblock": false, 00:20:18.655 "num_base_bdevs": 4, 00:20:18.655 "num_base_bdevs_discovered": 4, 00:20:18.655 "num_base_bdevs_operational": 4, 00:20:18.655 "base_bdevs_list": [ 00:20:18.655 { 00:20:18.655 "name": 
"BaseBdev1", 00:20:18.655 "uuid": "dc1bf698-d1b6-4fae-828b-f518e3859f34", 00:20:18.655 "is_configured": true, 00:20:18.655 "data_offset": 0, 00:20:18.655 "data_size": 65536 00:20:18.655 }, 00:20:18.655 { 00:20:18.655 "name": "BaseBdev2", 00:20:18.655 "uuid": "6059bd82-adac-43f1-8450-346d157d42db", 00:20:18.655 "is_configured": true, 00:20:18.655 "data_offset": 0, 00:20:18.655 "data_size": 65536 00:20:18.655 }, 00:20:18.655 { 00:20:18.655 "name": "BaseBdev3", 00:20:18.655 "uuid": "d281fd1a-79c3-4f73-848f-04190f249d24", 00:20:18.655 "is_configured": true, 00:20:18.655 "data_offset": 0, 00:20:18.655 "data_size": 65536 00:20:18.655 }, 00:20:18.655 { 00:20:18.655 "name": "BaseBdev4", 00:20:18.655 "uuid": "43a0e686-8006-4255-bb41-14549afc9175", 00:20:18.655 "is_configured": true, 00:20:18.655 "data_offset": 0, 00:20:18.655 "data_size": 65536 00:20:18.655 } 00:20:18.655 ] 00:20:18.655 }' 00:20:18.655 08:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:18.655 08:33:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:19.221 08:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:20:19.221 08:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:19.221 08:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:19.221 08:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:19.221 08:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:19.221 08:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:19.221 08:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:19.221 08:33:31 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:19.480 [2024-07-23 08:33:31.760358] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:19.480 08:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:19.480 "name": "Existed_Raid", 00:20:19.480 "aliases": [ 00:20:19.480 "4dfea6b0-3b30-49c5-890f-6cd1dbe27257" 00:20:19.480 ], 00:20:19.480 "product_name": "Raid Volume", 00:20:19.480 "block_size": 512, 00:20:19.480 "num_blocks": 65536, 00:20:19.480 "uuid": "4dfea6b0-3b30-49c5-890f-6cd1dbe27257", 00:20:19.480 "assigned_rate_limits": { 00:20:19.480 "rw_ios_per_sec": 0, 00:20:19.480 "rw_mbytes_per_sec": 0, 00:20:19.480 "r_mbytes_per_sec": 0, 00:20:19.480 "w_mbytes_per_sec": 0 00:20:19.480 }, 00:20:19.480 "claimed": false, 00:20:19.480 "zoned": false, 00:20:19.480 "supported_io_types": { 00:20:19.480 "read": true, 00:20:19.480 "write": true, 00:20:19.480 "unmap": false, 00:20:19.480 "flush": false, 00:20:19.480 "reset": true, 00:20:19.480 "nvme_admin": false, 00:20:19.480 "nvme_io": false, 00:20:19.480 "nvme_io_md": false, 00:20:19.480 "write_zeroes": true, 00:20:19.480 "zcopy": false, 00:20:19.480 "get_zone_info": false, 00:20:19.480 "zone_management": false, 00:20:19.480 "zone_append": false, 00:20:19.480 "compare": false, 00:20:19.480 "compare_and_write": false, 00:20:19.480 "abort": false, 00:20:19.480 "seek_hole": false, 00:20:19.480 "seek_data": false, 00:20:19.480 "copy": false, 00:20:19.480 "nvme_iov_md": false 00:20:19.480 }, 00:20:19.480 "memory_domains": [ 00:20:19.480 { 00:20:19.480 "dma_device_id": "system", 00:20:19.480 "dma_device_type": 1 00:20:19.480 }, 00:20:19.480 { 00:20:19.480 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:19.480 "dma_device_type": 2 00:20:19.480 }, 00:20:19.480 { 00:20:19.480 "dma_device_id": "system", 00:20:19.480 "dma_device_type": 1 00:20:19.480 }, 00:20:19.480 { 00:20:19.480 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:20:19.480 "dma_device_type": 2 00:20:19.480 }, 00:20:19.480 { 00:20:19.480 "dma_device_id": "system", 00:20:19.480 "dma_device_type": 1 00:20:19.480 }, 00:20:19.480 { 00:20:19.480 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:19.480 "dma_device_type": 2 00:20:19.480 }, 00:20:19.480 { 00:20:19.480 "dma_device_id": "system", 00:20:19.480 "dma_device_type": 1 00:20:19.480 }, 00:20:19.480 { 00:20:19.480 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:19.480 "dma_device_type": 2 00:20:19.480 } 00:20:19.480 ], 00:20:19.480 "driver_specific": { 00:20:19.480 "raid": { 00:20:19.480 "uuid": "4dfea6b0-3b30-49c5-890f-6cd1dbe27257", 00:20:19.480 "strip_size_kb": 0, 00:20:19.480 "state": "online", 00:20:19.480 "raid_level": "raid1", 00:20:19.480 "superblock": false, 00:20:19.480 "num_base_bdevs": 4, 00:20:19.480 "num_base_bdevs_discovered": 4, 00:20:19.480 "num_base_bdevs_operational": 4, 00:20:19.480 "base_bdevs_list": [ 00:20:19.480 { 00:20:19.480 "name": "BaseBdev1", 00:20:19.480 "uuid": "dc1bf698-d1b6-4fae-828b-f518e3859f34", 00:20:19.480 "is_configured": true, 00:20:19.480 "data_offset": 0, 00:20:19.480 "data_size": 65536 00:20:19.480 }, 00:20:19.480 { 00:20:19.480 "name": "BaseBdev2", 00:20:19.480 "uuid": "6059bd82-adac-43f1-8450-346d157d42db", 00:20:19.480 "is_configured": true, 00:20:19.480 "data_offset": 0, 00:20:19.480 "data_size": 65536 00:20:19.480 }, 00:20:19.480 { 00:20:19.480 "name": "BaseBdev3", 00:20:19.480 "uuid": "d281fd1a-79c3-4f73-848f-04190f249d24", 00:20:19.480 "is_configured": true, 00:20:19.480 "data_offset": 0, 00:20:19.480 "data_size": 65536 00:20:19.480 }, 00:20:19.480 { 00:20:19.480 "name": "BaseBdev4", 00:20:19.480 "uuid": "43a0e686-8006-4255-bb41-14549afc9175", 00:20:19.480 "is_configured": true, 00:20:19.480 "data_offset": 0, 00:20:19.480 "data_size": 65536 00:20:19.480 } 00:20:19.480 ] 00:20:19.480 } 00:20:19.480 } 00:20:19.480 }' 00:20:19.480 08:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:19.480 08:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:20:19.480 BaseBdev2 00:20:19.480 BaseBdev3 00:20:19.480 BaseBdev4' 00:20:19.480 08:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:19.480 08:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:20:19.481 08:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:19.739 08:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:19.739 "name": "BaseBdev1", 00:20:19.739 "aliases": [ 00:20:19.739 "dc1bf698-d1b6-4fae-828b-f518e3859f34" 00:20:19.739 ], 00:20:19.739 "product_name": "Malloc disk", 00:20:19.739 "block_size": 512, 00:20:19.739 "num_blocks": 65536, 00:20:19.739 "uuid": "dc1bf698-d1b6-4fae-828b-f518e3859f34", 00:20:19.739 "assigned_rate_limits": { 00:20:19.739 "rw_ios_per_sec": 0, 00:20:19.739 "rw_mbytes_per_sec": 0, 00:20:19.739 "r_mbytes_per_sec": 0, 00:20:19.739 "w_mbytes_per_sec": 0 00:20:19.739 }, 00:20:19.739 "claimed": true, 00:20:19.739 "claim_type": "exclusive_write", 00:20:19.739 "zoned": false, 00:20:19.739 "supported_io_types": { 00:20:19.739 "read": true, 00:20:19.739 "write": true, 00:20:19.739 "unmap": true, 00:20:19.739 "flush": true, 00:20:19.739 "reset": true, 00:20:19.739 "nvme_admin": false, 00:20:19.739 "nvme_io": false, 00:20:19.739 "nvme_io_md": false, 00:20:19.739 "write_zeroes": true, 00:20:19.739 "zcopy": true, 00:20:19.739 "get_zone_info": false, 00:20:19.739 "zone_management": false, 00:20:19.739 "zone_append": false, 00:20:19.739 "compare": false, 00:20:19.739 "compare_and_write": false, 00:20:19.739 "abort": true, 00:20:19.739 "seek_hole": false, 00:20:19.739 "seek_data": 
false, 00:20:19.739 "copy": true, 00:20:19.739 "nvme_iov_md": false 00:20:19.739 }, 00:20:19.739 "memory_domains": [ 00:20:19.739 { 00:20:19.739 "dma_device_id": "system", 00:20:19.739 "dma_device_type": 1 00:20:19.739 }, 00:20:19.739 { 00:20:19.739 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:19.739 "dma_device_type": 2 00:20:19.739 } 00:20:19.739 ], 00:20:19.739 "driver_specific": {} 00:20:19.739 }' 00:20:19.739 08:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:19.739 08:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:19.739 08:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:19.739 08:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:19.739 08:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:19.739 08:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:19.739 08:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:19.739 08:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:19.739 08:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:19.739 08:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:19.997 08:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:19.997 08:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:19.997 08:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:19.997 08:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:19.997 08:33:32 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:19.997 08:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:19.997 "name": "BaseBdev2", 00:20:19.997 "aliases": [ 00:20:19.997 "6059bd82-adac-43f1-8450-346d157d42db" 00:20:19.997 ], 00:20:19.997 "product_name": "Malloc disk", 00:20:19.997 "block_size": 512, 00:20:19.997 "num_blocks": 65536, 00:20:19.997 "uuid": "6059bd82-adac-43f1-8450-346d157d42db", 00:20:19.997 "assigned_rate_limits": { 00:20:19.997 "rw_ios_per_sec": 0, 00:20:19.997 "rw_mbytes_per_sec": 0, 00:20:19.997 "r_mbytes_per_sec": 0, 00:20:19.997 "w_mbytes_per_sec": 0 00:20:19.997 }, 00:20:19.997 "claimed": true, 00:20:19.997 "claim_type": "exclusive_write", 00:20:19.997 "zoned": false, 00:20:19.997 "supported_io_types": { 00:20:19.997 "read": true, 00:20:19.997 "write": true, 00:20:19.997 "unmap": true, 00:20:19.997 "flush": true, 00:20:19.997 "reset": true, 00:20:19.997 "nvme_admin": false, 00:20:19.997 "nvme_io": false, 00:20:19.997 "nvme_io_md": false, 00:20:19.997 "write_zeroes": true, 00:20:19.997 "zcopy": true, 00:20:19.997 "get_zone_info": false, 00:20:19.997 "zone_management": false, 00:20:19.997 "zone_append": false, 00:20:19.997 "compare": false, 00:20:19.997 "compare_and_write": false, 00:20:19.997 "abort": true, 00:20:19.997 "seek_hole": false, 00:20:19.997 "seek_data": false, 00:20:19.997 "copy": true, 00:20:19.997 "nvme_iov_md": false 00:20:19.997 }, 00:20:19.997 "memory_domains": [ 00:20:19.997 { 00:20:19.997 "dma_device_id": "system", 00:20:19.997 "dma_device_type": 1 00:20:19.997 }, 00:20:19.997 { 00:20:19.997 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:19.997 "dma_device_type": 2 00:20:19.997 } 00:20:19.997 ], 00:20:19.997 "driver_specific": {} 00:20:19.997 }' 00:20:19.997 08:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:19.997 08:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:20:20.255 08:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:20.255 08:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:20.255 08:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:20.255 08:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:20.255 08:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:20.255 08:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:20.255 08:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:20.255 08:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:20.255 08:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:20.255 08:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:20.255 08:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:20.255 08:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:20.255 08:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:20.513 08:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:20.513 "name": "BaseBdev3", 00:20:20.513 "aliases": [ 00:20:20.513 "d281fd1a-79c3-4f73-848f-04190f249d24" 00:20:20.513 ], 00:20:20.513 "product_name": "Malloc disk", 00:20:20.513 "block_size": 512, 00:20:20.513 "num_blocks": 65536, 00:20:20.513 "uuid": "d281fd1a-79c3-4f73-848f-04190f249d24", 00:20:20.513 "assigned_rate_limits": { 00:20:20.513 "rw_ios_per_sec": 0, 00:20:20.513 "rw_mbytes_per_sec": 0, 00:20:20.513 "r_mbytes_per_sec": 0, 
00:20:20.513 "w_mbytes_per_sec": 0 00:20:20.513 }, 00:20:20.513 "claimed": true, 00:20:20.513 "claim_type": "exclusive_write", 00:20:20.513 "zoned": false, 00:20:20.513 "supported_io_types": { 00:20:20.513 "read": true, 00:20:20.513 "write": true, 00:20:20.513 "unmap": true, 00:20:20.513 "flush": true, 00:20:20.513 "reset": true, 00:20:20.513 "nvme_admin": false, 00:20:20.513 "nvme_io": false, 00:20:20.513 "nvme_io_md": false, 00:20:20.513 "write_zeroes": true, 00:20:20.513 "zcopy": true, 00:20:20.513 "get_zone_info": false, 00:20:20.513 "zone_management": false, 00:20:20.513 "zone_append": false, 00:20:20.513 "compare": false, 00:20:20.513 "compare_and_write": false, 00:20:20.513 "abort": true, 00:20:20.513 "seek_hole": false, 00:20:20.513 "seek_data": false, 00:20:20.513 "copy": true, 00:20:20.513 "nvme_iov_md": false 00:20:20.513 }, 00:20:20.513 "memory_domains": [ 00:20:20.513 { 00:20:20.513 "dma_device_id": "system", 00:20:20.513 "dma_device_type": 1 00:20:20.513 }, 00:20:20.513 { 00:20:20.513 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:20.513 "dma_device_type": 2 00:20:20.513 } 00:20:20.513 ], 00:20:20.513 "driver_specific": {} 00:20:20.513 }' 00:20:20.513 08:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:20.513 08:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:20.513 08:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:20.513 08:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:20.771 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:20.771 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:20.771 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:20.771 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:20:20.771 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:20.771 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:20.771 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:20.771 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:20.771 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:20.771 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:20.771 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:21.029 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:21.029 "name": "BaseBdev4", 00:20:21.029 "aliases": [ 00:20:21.029 "43a0e686-8006-4255-bb41-14549afc9175" 00:20:21.029 ], 00:20:21.029 "product_name": "Malloc disk", 00:20:21.029 "block_size": 512, 00:20:21.029 "num_blocks": 65536, 00:20:21.029 "uuid": "43a0e686-8006-4255-bb41-14549afc9175", 00:20:21.029 "assigned_rate_limits": { 00:20:21.029 "rw_ios_per_sec": 0, 00:20:21.029 "rw_mbytes_per_sec": 0, 00:20:21.029 "r_mbytes_per_sec": 0, 00:20:21.029 "w_mbytes_per_sec": 0 00:20:21.029 }, 00:20:21.029 "claimed": true, 00:20:21.029 "claim_type": "exclusive_write", 00:20:21.029 "zoned": false, 00:20:21.029 "supported_io_types": { 00:20:21.029 "read": true, 00:20:21.029 "write": true, 00:20:21.029 "unmap": true, 00:20:21.029 "flush": true, 00:20:21.029 "reset": true, 00:20:21.029 "nvme_admin": false, 00:20:21.029 "nvme_io": false, 00:20:21.029 "nvme_io_md": false, 00:20:21.029 "write_zeroes": true, 00:20:21.029 "zcopy": true, 00:20:21.029 "get_zone_info": false, 00:20:21.029 "zone_management": false, 00:20:21.029 "zone_append": false, 00:20:21.029 
"compare": false, 00:20:21.029 "compare_and_write": false, 00:20:21.029 "abort": true, 00:20:21.029 "seek_hole": false, 00:20:21.029 "seek_data": false, 00:20:21.029 "copy": true, 00:20:21.029 "nvme_iov_md": false 00:20:21.029 }, 00:20:21.029 "memory_domains": [ 00:20:21.029 { 00:20:21.029 "dma_device_id": "system", 00:20:21.029 "dma_device_type": 1 00:20:21.029 }, 00:20:21.029 { 00:20:21.029 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:21.029 "dma_device_type": 2 00:20:21.029 } 00:20:21.029 ], 00:20:21.029 "driver_specific": {} 00:20:21.029 }' 00:20:21.029 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:21.029 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:21.029 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:21.029 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:21.029 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:21.287 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:21.287 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:21.287 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:21.287 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:21.287 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:21.287 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:21.287 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:21.287 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 
00:20:21.545 [2024-07-23 08:33:33.869694] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:21.545 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:20:21.545 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:20:21.545 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:21.545 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:20:21.545 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:20:21.545 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:20:21.545 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:21.545 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:21.545 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:21.545 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:21.545 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:21.545 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:21.545 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:21.545 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:21.545 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:21.545 08:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:21.545 08:33:33 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:21.803 08:33:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:21.803 "name": "Existed_Raid", 00:20:21.803 "uuid": "4dfea6b0-3b30-49c5-890f-6cd1dbe27257", 00:20:21.803 "strip_size_kb": 0, 00:20:21.803 "state": "online", 00:20:21.803 "raid_level": "raid1", 00:20:21.803 "superblock": false, 00:20:21.803 "num_base_bdevs": 4, 00:20:21.803 "num_base_bdevs_discovered": 3, 00:20:21.803 "num_base_bdevs_operational": 3, 00:20:21.803 "base_bdevs_list": [ 00:20:21.803 { 00:20:21.803 "name": null, 00:20:21.803 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:21.803 "is_configured": false, 00:20:21.803 "data_offset": 0, 00:20:21.803 "data_size": 65536 00:20:21.803 }, 00:20:21.803 { 00:20:21.803 "name": "BaseBdev2", 00:20:21.803 "uuid": "6059bd82-adac-43f1-8450-346d157d42db", 00:20:21.803 "is_configured": true, 00:20:21.803 "data_offset": 0, 00:20:21.803 "data_size": 65536 00:20:21.803 }, 00:20:21.803 { 00:20:21.803 "name": "BaseBdev3", 00:20:21.803 "uuid": "d281fd1a-79c3-4f73-848f-04190f249d24", 00:20:21.803 "is_configured": true, 00:20:21.803 "data_offset": 0, 00:20:21.803 "data_size": 65536 00:20:21.803 }, 00:20:21.803 { 00:20:21.803 "name": "BaseBdev4", 00:20:21.803 "uuid": "43a0e686-8006-4255-bb41-14549afc9175", 00:20:21.803 "is_configured": true, 00:20:21.803 "data_offset": 0, 00:20:21.803 "data_size": 65536 00:20:21.803 } 00:20:21.803 ] 00:20:21.803 }' 00:20:21.803 08:33:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:21.803 08:33:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:22.061 08:33:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:20:22.061 08:33:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:22.061 08:33:34 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:22.061 08:33:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:22.319 08:33:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:22.319 08:33:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:22.319 08:33:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:20:22.577 [2024-07-23 08:33:34.901726] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:22.577 08:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:22.577 08:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:22.577 08:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:22.577 08:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:22.836 08:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:22.836 08:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:22.836 08:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:20:22.836 [2024-07-23 08:33:35.342808] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:23.094 08:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 
00:20:23.094 08:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:23.094 08:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:23.094 08:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:23.352 08:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:23.352 08:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:23.352 08:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:20:23.352 [2024-07-23 08:33:35.780332] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:20:23.352 [2024-07-23 08:33:35.780429] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:23.611 [2024-07-23 08:33:35.876743] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:23.611 [2024-07-23 08:33:35.876793] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:23.611 [2024-07-23 08:33:35.876805] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000035180 name Existed_Raid, state offline 00:20:23.611 08:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:23.611 08:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:23.611 08:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:23.611 08:33:35 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:20:23.611 08:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:20:23.611 08:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:20:23.611 08:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:20:23.611 08:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:20:23.611 08:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:23.611 08:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:23.869 BaseBdev2 00:20:23.869 08:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:20:23.870 08:33:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:23.870 08:33:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:23.870 08:33:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:23.870 08:33:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:23.870 08:33:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:23.870 08:33:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:24.129 08:33:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:24.129 [ 00:20:24.129 { 00:20:24.129 "name": "BaseBdev2", 00:20:24.129 "aliases": 
[ 00:20:24.129 "41c7140a-e76b-4a48-b482-1e45d356a737" 00:20:24.129 ], 00:20:24.129 "product_name": "Malloc disk", 00:20:24.129 "block_size": 512, 00:20:24.129 "num_blocks": 65536, 00:20:24.129 "uuid": "41c7140a-e76b-4a48-b482-1e45d356a737", 00:20:24.129 "assigned_rate_limits": { 00:20:24.129 "rw_ios_per_sec": 0, 00:20:24.129 "rw_mbytes_per_sec": 0, 00:20:24.129 "r_mbytes_per_sec": 0, 00:20:24.129 "w_mbytes_per_sec": 0 00:20:24.129 }, 00:20:24.129 "claimed": false, 00:20:24.129 "zoned": false, 00:20:24.129 "supported_io_types": { 00:20:24.129 "read": true, 00:20:24.129 "write": true, 00:20:24.129 "unmap": true, 00:20:24.129 "flush": true, 00:20:24.129 "reset": true, 00:20:24.129 "nvme_admin": false, 00:20:24.129 "nvme_io": false, 00:20:24.129 "nvme_io_md": false, 00:20:24.129 "write_zeroes": true, 00:20:24.129 "zcopy": true, 00:20:24.129 "get_zone_info": false, 00:20:24.129 "zone_management": false, 00:20:24.129 "zone_append": false, 00:20:24.129 "compare": false, 00:20:24.129 "compare_and_write": false, 00:20:24.129 "abort": true, 00:20:24.129 "seek_hole": false, 00:20:24.129 "seek_data": false, 00:20:24.129 "copy": true, 00:20:24.129 "nvme_iov_md": false 00:20:24.129 }, 00:20:24.129 "memory_domains": [ 00:20:24.129 { 00:20:24.129 "dma_device_id": "system", 00:20:24.129 "dma_device_type": 1 00:20:24.129 }, 00:20:24.129 { 00:20:24.129 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:24.129 "dma_device_type": 2 00:20:24.129 } 00:20:24.129 ], 00:20:24.129 "driver_specific": {} 00:20:24.129 } 00:20:24.129 ] 00:20:24.129 08:33:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:24.129 08:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:24.129 08:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:24.129 08:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:24.388 BaseBdev3 00:20:24.388 08:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:20:24.388 08:33:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:24.388 08:33:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:24.388 08:33:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:24.388 08:33:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:24.388 08:33:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:24.388 08:33:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:24.646 08:33:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:24.646 [ 00:20:24.646 { 00:20:24.646 "name": "BaseBdev3", 00:20:24.646 "aliases": [ 00:20:24.646 "e9e592ae-b14e-4d17-8c62-cc746d74d485" 00:20:24.646 ], 00:20:24.646 "product_name": "Malloc disk", 00:20:24.646 "block_size": 512, 00:20:24.646 "num_blocks": 65536, 00:20:24.646 "uuid": "e9e592ae-b14e-4d17-8c62-cc746d74d485", 00:20:24.646 "assigned_rate_limits": { 00:20:24.646 "rw_ios_per_sec": 0, 00:20:24.646 "rw_mbytes_per_sec": 0, 00:20:24.646 "r_mbytes_per_sec": 0, 00:20:24.646 "w_mbytes_per_sec": 0 00:20:24.646 }, 00:20:24.646 "claimed": false, 00:20:24.646 "zoned": false, 00:20:24.646 "supported_io_types": { 00:20:24.646 "read": true, 00:20:24.646 "write": true, 00:20:24.646 "unmap": true, 00:20:24.646 "flush": true, 00:20:24.646 "reset": true, 00:20:24.646 "nvme_admin": false, 00:20:24.646 
"nvme_io": false, 00:20:24.646 "nvme_io_md": false, 00:20:24.646 "write_zeroes": true, 00:20:24.646 "zcopy": true, 00:20:24.646 "get_zone_info": false, 00:20:24.646 "zone_management": false, 00:20:24.646 "zone_append": false, 00:20:24.646 "compare": false, 00:20:24.646 "compare_and_write": false, 00:20:24.646 "abort": true, 00:20:24.646 "seek_hole": false, 00:20:24.646 "seek_data": false, 00:20:24.646 "copy": true, 00:20:24.646 "nvme_iov_md": false 00:20:24.646 }, 00:20:24.646 "memory_domains": [ 00:20:24.646 { 00:20:24.646 "dma_device_id": "system", 00:20:24.646 "dma_device_type": 1 00:20:24.646 }, 00:20:24.646 { 00:20:24.646 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:24.646 "dma_device_type": 2 00:20:24.646 } 00:20:24.646 ], 00:20:24.646 "driver_specific": {} 00:20:24.646 } 00:20:24.646 ] 00:20:24.646 08:33:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:24.646 08:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:24.646 08:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:24.647 08:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:24.905 BaseBdev4 00:20:24.905 08:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:20:24.905 08:33:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:20:24.905 08:33:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:24.905 08:33:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:24.905 08:33:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:24.905 08:33:37 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:24.905 08:33:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:25.163 08:33:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:25.422 [ 00:20:25.422 { 00:20:25.422 "name": "BaseBdev4", 00:20:25.422 "aliases": [ 00:20:25.422 "bbc18772-11f8-4173-a86f-a7df0819714e" 00:20:25.422 ], 00:20:25.422 "product_name": "Malloc disk", 00:20:25.422 "block_size": 512, 00:20:25.422 "num_blocks": 65536, 00:20:25.422 "uuid": "bbc18772-11f8-4173-a86f-a7df0819714e", 00:20:25.422 "assigned_rate_limits": { 00:20:25.422 "rw_ios_per_sec": 0, 00:20:25.422 "rw_mbytes_per_sec": 0, 00:20:25.422 "r_mbytes_per_sec": 0, 00:20:25.422 "w_mbytes_per_sec": 0 00:20:25.422 }, 00:20:25.422 "claimed": false, 00:20:25.422 "zoned": false, 00:20:25.422 "supported_io_types": { 00:20:25.422 "read": true, 00:20:25.422 "write": true, 00:20:25.422 "unmap": true, 00:20:25.422 "flush": true, 00:20:25.422 "reset": true, 00:20:25.422 "nvme_admin": false, 00:20:25.422 "nvme_io": false, 00:20:25.422 "nvme_io_md": false, 00:20:25.422 "write_zeroes": true, 00:20:25.422 "zcopy": true, 00:20:25.422 "get_zone_info": false, 00:20:25.422 "zone_management": false, 00:20:25.422 "zone_append": false, 00:20:25.422 "compare": false, 00:20:25.422 "compare_and_write": false, 00:20:25.422 "abort": true, 00:20:25.422 "seek_hole": false, 00:20:25.422 "seek_data": false, 00:20:25.422 "copy": true, 00:20:25.422 "nvme_iov_md": false 00:20:25.422 }, 00:20:25.422 "memory_domains": [ 00:20:25.422 { 00:20:25.422 "dma_device_id": "system", 00:20:25.422 "dma_device_type": 1 00:20:25.422 }, 00:20:25.422 { 00:20:25.422 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:25.422 "dma_device_type": 
2 00:20:25.422 } 00:20:25.422 ], 00:20:25.422 "driver_specific": {} 00:20:25.422 } 00:20:25.422 ] 00:20:25.422 08:33:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:25.422 08:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:25.422 08:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:25.422 08:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:25.422 [2024-07-23 08:33:37.838987] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:25.422 [2024-07-23 08:33:37.839027] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:25.422 [2024-07-23 08:33:37.839049] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:25.422 [2024-07-23 08:33:37.840655] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:25.422 [2024-07-23 08:33:37.840699] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:25.422 08:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:25.422 08:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:25.422 08:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:25.422 08:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:25.422 08:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:25.422 08:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- 
# local num_base_bdevs_operational=4 00:20:25.422 08:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:25.422 08:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:25.422 08:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:25.422 08:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:25.422 08:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:25.422 08:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:25.680 08:33:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:25.680 "name": "Existed_Raid", 00:20:25.680 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:25.680 "strip_size_kb": 0, 00:20:25.680 "state": "configuring", 00:20:25.680 "raid_level": "raid1", 00:20:25.680 "superblock": false, 00:20:25.680 "num_base_bdevs": 4, 00:20:25.680 "num_base_bdevs_discovered": 3, 00:20:25.680 "num_base_bdevs_operational": 4, 00:20:25.680 "base_bdevs_list": [ 00:20:25.680 { 00:20:25.680 "name": "BaseBdev1", 00:20:25.680 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:25.680 "is_configured": false, 00:20:25.680 "data_offset": 0, 00:20:25.680 "data_size": 0 00:20:25.681 }, 00:20:25.681 { 00:20:25.681 "name": "BaseBdev2", 00:20:25.681 "uuid": "41c7140a-e76b-4a48-b482-1e45d356a737", 00:20:25.681 "is_configured": true, 00:20:25.681 "data_offset": 0, 00:20:25.681 "data_size": 65536 00:20:25.681 }, 00:20:25.681 { 00:20:25.681 "name": "BaseBdev3", 00:20:25.681 "uuid": "e9e592ae-b14e-4d17-8c62-cc746d74d485", 00:20:25.681 "is_configured": true, 00:20:25.681 "data_offset": 0, 00:20:25.681 "data_size": 65536 00:20:25.681 }, 00:20:25.681 { 
00:20:25.681 "name": "BaseBdev4", 00:20:25.681 "uuid": "bbc18772-11f8-4173-a86f-a7df0819714e", 00:20:25.681 "is_configured": true, 00:20:25.681 "data_offset": 0, 00:20:25.681 "data_size": 65536 00:20:25.681 } 00:20:25.681 ] 00:20:25.681 }' 00:20:25.681 08:33:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:25.681 08:33:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:26.248 08:33:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:20:26.248 [2024-07-23 08:33:38.653137] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:26.248 08:33:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:26.248 08:33:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:26.248 08:33:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:26.248 08:33:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:26.248 08:33:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:26.248 08:33:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:26.248 08:33:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:26.248 08:33:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:26.248 08:33:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:26.248 08:33:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:26.248 08:33:38 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:26.248 08:33:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:26.506 08:33:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:26.506 "name": "Existed_Raid", 00:20:26.506 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:26.506 "strip_size_kb": 0, 00:20:26.506 "state": "configuring", 00:20:26.506 "raid_level": "raid1", 00:20:26.506 "superblock": false, 00:20:26.506 "num_base_bdevs": 4, 00:20:26.506 "num_base_bdevs_discovered": 2, 00:20:26.506 "num_base_bdevs_operational": 4, 00:20:26.506 "base_bdevs_list": [ 00:20:26.506 { 00:20:26.506 "name": "BaseBdev1", 00:20:26.506 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:26.506 "is_configured": false, 00:20:26.506 "data_offset": 0, 00:20:26.506 "data_size": 0 00:20:26.506 }, 00:20:26.506 { 00:20:26.506 "name": null, 00:20:26.506 "uuid": "41c7140a-e76b-4a48-b482-1e45d356a737", 00:20:26.506 "is_configured": false, 00:20:26.506 "data_offset": 0, 00:20:26.506 "data_size": 65536 00:20:26.506 }, 00:20:26.506 { 00:20:26.506 "name": "BaseBdev3", 00:20:26.506 "uuid": "e9e592ae-b14e-4d17-8c62-cc746d74d485", 00:20:26.506 "is_configured": true, 00:20:26.506 "data_offset": 0, 00:20:26.506 "data_size": 65536 00:20:26.506 }, 00:20:26.506 { 00:20:26.506 "name": "BaseBdev4", 00:20:26.506 "uuid": "bbc18772-11f8-4173-a86f-a7df0819714e", 00:20:26.506 "is_configured": true, 00:20:26.506 "data_offset": 0, 00:20:26.506 "data_size": 65536 00:20:26.506 } 00:20:26.506 ] 00:20:26.506 }' 00:20:26.506 08:33:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:26.506 08:33:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:27.110 08:33:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:27.110 08:33:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:27.110 08:33:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:20:27.110 08:33:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:27.368 [2024-07-23 08:33:39.687279] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:27.368 BaseBdev1 00:20:27.368 08:33:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:20:27.368 08:33:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:20:27.368 08:33:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:27.368 08:33:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:27.368 08:33:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:27.368 08:33:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:27.368 08:33:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:27.368 08:33:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:27.626 [ 00:20:27.626 { 00:20:27.626 "name": "BaseBdev1", 00:20:27.626 "aliases": [ 00:20:27.626 "02046ca7-c474-4b4c-bd90-ec46402b4812" 00:20:27.626 ], 00:20:27.626 
"product_name": "Malloc disk", 00:20:27.626 "block_size": 512, 00:20:27.626 "num_blocks": 65536, 00:20:27.626 "uuid": "02046ca7-c474-4b4c-bd90-ec46402b4812", 00:20:27.626 "assigned_rate_limits": { 00:20:27.626 "rw_ios_per_sec": 0, 00:20:27.626 "rw_mbytes_per_sec": 0, 00:20:27.626 "r_mbytes_per_sec": 0, 00:20:27.626 "w_mbytes_per_sec": 0 00:20:27.626 }, 00:20:27.626 "claimed": true, 00:20:27.626 "claim_type": "exclusive_write", 00:20:27.626 "zoned": false, 00:20:27.626 "supported_io_types": { 00:20:27.626 "read": true, 00:20:27.626 "write": true, 00:20:27.626 "unmap": true, 00:20:27.626 "flush": true, 00:20:27.626 "reset": true, 00:20:27.626 "nvme_admin": false, 00:20:27.626 "nvme_io": false, 00:20:27.626 "nvme_io_md": false, 00:20:27.626 "write_zeroes": true, 00:20:27.626 "zcopy": true, 00:20:27.626 "get_zone_info": false, 00:20:27.627 "zone_management": false, 00:20:27.627 "zone_append": false, 00:20:27.627 "compare": false, 00:20:27.627 "compare_and_write": false, 00:20:27.627 "abort": true, 00:20:27.627 "seek_hole": false, 00:20:27.627 "seek_data": false, 00:20:27.627 "copy": true, 00:20:27.627 "nvme_iov_md": false 00:20:27.627 }, 00:20:27.627 "memory_domains": [ 00:20:27.627 { 00:20:27.627 "dma_device_id": "system", 00:20:27.627 "dma_device_type": 1 00:20:27.627 }, 00:20:27.627 { 00:20:27.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:27.627 "dma_device_type": 2 00:20:27.627 } 00:20:27.627 ], 00:20:27.627 "driver_specific": {} 00:20:27.627 } 00:20:27.627 ] 00:20:27.627 08:33:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:27.627 08:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:27.627 08:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:27.627 08:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:27.627 
08:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:27.627 08:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:27.627 08:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:27.627 08:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:27.627 08:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:27.627 08:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:27.627 08:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:27.627 08:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:27.627 08:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:27.885 08:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:27.885 "name": "Existed_Raid", 00:20:27.885 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:27.885 "strip_size_kb": 0, 00:20:27.885 "state": "configuring", 00:20:27.885 "raid_level": "raid1", 00:20:27.885 "superblock": false, 00:20:27.885 "num_base_bdevs": 4, 00:20:27.885 "num_base_bdevs_discovered": 3, 00:20:27.885 "num_base_bdevs_operational": 4, 00:20:27.885 "base_bdevs_list": [ 00:20:27.885 { 00:20:27.885 "name": "BaseBdev1", 00:20:27.885 "uuid": "02046ca7-c474-4b4c-bd90-ec46402b4812", 00:20:27.885 "is_configured": true, 00:20:27.885 "data_offset": 0, 00:20:27.885 "data_size": 65536 00:20:27.885 }, 00:20:27.885 { 00:20:27.885 "name": null, 00:20:27.885 "uuid": "41c7140a-e76b-4a48-b482-1e45d356a737", 00:20:27.885 "is_configured": false, 00:20:27.885 "data_offset": 0, 
00:20:27.885 "data_size": 65536 00:20:27.885 }, 00:20:27.885 { 00:20:27.885 "name": "BaseBdev3", 00:20:27.885 "uuid": "e9e592ae-b14e-4d17-8c62-cc746d74d485", 00:20:27.885 "is_configured": true, 00:20:27.885 "data_offset": 0, 00:20:27.885 "data_size": 65536 00:20:27.885 }, 00:20:27.885 { 00:20:27.885 "name": "BaseBdev4", 00:20:27.885 "uuid": "bbc18772-11f8-4173-a86f-a7df0819714e", 00:20:27.885 "is_configured": true, 00:20:27.885 "data_offset": 0, 00:20:27.885 "data_size": 65536 00:20:27.885 } 00:20:27.885 ] 00:20:27.885 }' 00:20:27.885 08:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:27.885 08:33:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:28.452 08:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:28.452 08:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:28.452 08:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:20:28.452 08:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:20:28.710 [2024-07-23 08:33:41.038901] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:28.710 08:33:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:28.710 08:33:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:28.710 08:33:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:28.710 08:33:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
00:20:28.710 08:33:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:28.710 08:33:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:28.710 08:33:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:28.710 08:33:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:28.710 08:33:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:28.710 08:33:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:28.710 08:33:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:28.710 08:33:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:28.710 08:33:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:28.710 "name": "Existed_Raid", 00:20:28.710 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:28.710 "strip_size_kb": 0, 00:20:28.710 "state": "configuring", 00:20:28.710 "raid_level": "raid1", 00:20:28.710 "superblock": false, 00:20:28.710 "num_base_bdevs": 4, 00:20:28.710 "num_base_bdevs_discovered": 2, 00:20:28.710 "num_base_bdevs_operational": 4, 00:20:28.710 "base_bdevs_list": [ 00:20:28.710 { 00:20:28.710 "name": "BaseBdev1", 00:20:28.710 "uuid": "02046ca7-c474-4b4c-bd90-ec46402b4812", 00:20:28.710 "is_configured": true, 00:20:28.710 "data_offset": 0, 00:20:28.710 "data_size": 65536 00:20:28.710 }, 00:20:28.710 { 00:20:28.711 "name": null, 00:20:28.711 "uuid": "41c7140a-e76b-4a48-b482-1e45d356a737", 00:20:28.711 "is_configured": false, 00:20:28.711 "data_offset": 0, 00:20:28.711 "data_size": 65536 00:20:28.711 }, 00:20:28.711 { 00:20:28.711 "name": null, 00:20:28.711 
"uuid": "e9e592ae-b14e-4d17-8c62-cc746d74d485", 00:20:28.711 "is_configured": false, 00:20:28.711 "data_offset": 0, 00:20:28.711 "data_size": 65536 00:20:28.711 }, 00:20:28.711 { 00:20:28.711 "name": "BaseBdev4", 00:20:28.711 "uuid": "bbc18772-11f8-4173-a86f-a7df0819714e", 00:20:28.711 "is_configured": true, 00:20:28.711 "data_offset": 0, 00:20:28.711 "data_size": 65536 00:20:28.711 } 00:20:28.711 ] 00:20:28.711 }' 00:20:28.711 08:33:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:28.711 08:33:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:29.277 08:33:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:29.277 08:33:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:29.535 08:33:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:20:29.535 08:33:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:20:29.535 [2024-07-23 08:33:42.049601] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:29.794 08:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:29.794 08:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:29.794 08:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:29.794 08:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:29.794 08:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # 
local strip_size=0 00:20:29.794 08:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:29.794 08:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:29.794 08:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:29.794 08:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:29.794 08:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:29.794 08:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:29.794 08:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:29.794 08:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:29.794 "name": "Existed_Raid", 00:20:29.794 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:29.794 "strip_size_kb": 0, 00:20:29.794 "state": "configuring", 00:20:29.794 "raid_level": "raid1", 00:20:29.794 "superblock": false, 00:20:29.794 "num_base_bdevs": 4, 00:20:29.794 "num_base_bdevs_discovered": 3, 00:20:29.794 "num_base_bdevs_operational": 4, 00:20:29.794 "base_bdevs_list": [ 00:20:29.794 { 00:20:29.794 "name": "BaseBdev1", 00:20:29.794 "uuid": "02046ca7-c474-4b4c-bd90-ec46402b4812", 00:20:29.794 "is_configured": true, 00:20:29.794 "data_offset": 0, 00:20:29.794 "data_size": 65536 00:20:29.794 }, 00:20:29.794 { 00:20:29.794 "name": null, 00:20:29.794 "uuid": "41c7140a-e76b-4a48-b482-1e45d356a737", 00:20:29.794 "is_configured": false, 00:20:29.794 "data_offset": 0, 00:20:29.794 "data_size": 65536 00:20:29.794 }, 00:20:29.794 { 00:20:29.794 "name": "BaseBdev3", 00:20:29.794 "uuid": "e9e592ae-b14e-4d17-8c62-cc746d74d485", 00:20:29.794 "is_configured": true, 
00:20:29.794 "data_offset": 0, 00:20:29.794 "data_size": 65536 00:20:29.794 }, 00:20:29.794 { 00:20:29.794 "name": "BaseBdev4", 00:20:29.794 "uuid": "bbc18772-11f8-4173-a86f-a7df0819714e", 00:20:29.794 "is_configured": true, 00:20:29.794 "data_offset": 0, 00:20:29.794 "data_size": 65536 00:20:29.794 } 00:20:29.794 ] 00:20:29.794 }' 00:20:29.794 08:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:29.794 08:33:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:30.360 08:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:30.360 08:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:30.619 08:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:20:30.619 08:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:30.619 [2024-07-23 08:33:43.044228] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:30.883 08:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:30.883 08:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:30.883 08:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:30.883 08:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:30.883 08:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:30.883 08:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=4 00:20:30.883 08:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:30.883 08:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:30.883 08:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:30.884 08:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:30.884 08:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:30.884 08:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:30.884 08:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:30.884 "name": "Existed_Raid", 00:20:30.884 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:30.884 "strip_size_kb": 0, 00:20:30.884 "state": "configuring", 00:20:30.884 "raid_level": "raid1", 00:20:30.884 "superblock": false, 00:20:30.884 "num_base_bdevs": 4, 00:20:30.884 "num_base_bdevs_discovered": 2, 00:20:30.884 "num_base_bdevs_operational": 4, 00:20:30.884 "base_bdevs_list": [ 00:20:30.884 { 00:20:30.884 "name": null, 00:20:30.884 "uuid": "02046ca7-c474-4b4c-bd90-ec46402b4812", 00:20:30.884 "is_configured": false, 00:20:30.884 "data_offset": 0, 00:20:30.884 "data_size": 65536 00:20:30.884 }, 00:20:30.884 { 00:20:30.884 "name": null, 00:20:30.884 "uuid": "41c7140a-e76b-4a48-b482-1e45d356a737", 00:20:30.884 "is_configured": false, 00:20:30.884 "data_offset": 0, 00:20:30.884 "data_size": 65536 00:20:30.884 }, 00:20:30.884 { 00:20:30.884 "name": "BaseBdev3", 00:20:30.884 "uuid": "e9e592ae-b14e-4d17-8c62-cc746d74d485", 00:20:30.884 "is_configured": true, 00:20:30.884 "data_offset": 0, 00:20:30.884 "data_size": 65536 00:20:30.884 }, 00:20:30.884 { 00:20:30.884 "name": 
"BaseBdev4", 00:20:30.884 "uuid": "bbc18772-11f8-4173-a86f-a7df0819714e", 00:20:30.884 "is_configured": true, 00:20:30.884 "data_offset": 0, 00:20:30.884 "data_size": 65536 00:20:30.884 } 00:20:30.884 ] 00:20:30.884 }' 00:20:30.884 08:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:30.884 08:33:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:31.452 08:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:31.452 08:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:31.711 08:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:20:31.711 08:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:20:31.711 [2024-07-23 08:33:44.148233] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:31.711 08:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:31.711 08:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:31.711 08:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:31.711 08:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:31.711 08:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:31.711 08:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:31.711 08:33:44 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:31.711 08:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:31.711 08:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:31.711 08:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:31.711 08:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:31.711 08:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:31.969 08:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:31.969 "name": "Existed_Raid", 00:20:31.969 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:31.969 "strip_size_kb": 0, 00:20:31.969 "state": "configuring", 00:20:31.969 "raid_level": "raid1", 00:20:31.969 "superblock": false, 00:20:31.969 "num_base_bdevs": 4, 00:20:31.969 "num_base_bdevs_discovered": 3, 00:20:31.969 "num_base_bdevs_operational": 4, 00:20:31.969 "base_bdevs_list": [ 00:20:31.970 { 00:20:31.970 "name": null, 00:20:31.970 "uuid": "02046ca7-c474-4b4c-bd90-ec46402b4812", 00:20:31.970 "is_configured": false, 00:20:31.970 "data_offset": 0, 00:20:31.970 "data_size": 65536 00:20:31.970 }, 00:20:31.970 { 00:20:31.970 "name": "BaseBdev2", 00:20:31.970 "uuid": "41c7140a-e76b-4a48-b482-1e45d356a737", 00:20:31.970 "is_configured": true, 00:20:31.970 "data_offset": 0, 00:20:31.970 "data_size": 65536 00:20:31.970 }, 00:20:31.970 { 00:20:31.970 "name": "BaseBdev3", 00:20:31.970 "uuid": "e9e592ae-b14e-4d17-8c62-cc746d74d485", 00:20:31.970 "is_configured": true, 00:20:31.970 "data_offset": 0, 00:20:31.970 "data_size": 65536 00:20:31.970 }, 00:20:31.970 { 00:20:31.970 "name": "BaseBdev4", 00:20:31.970 "uuid": "bbc18772-11f8-4173-a86f-a7df0819714e", 00:20:31.970 
"is_configured": true, 00:20:31.970 "data_offset": 0, 00:20:31.970 "data_size": 65536 00:20:31.970 } 00:20:31.970 ] 00:20:31.970 }' 00:20:31.970 08:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:31.970 08:33:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:32.536 08:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:32.536 08:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:32.536 08:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:20:32.536 08:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:32.536 08:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:20:32.795 08:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 02046ca7-c474-4b4c-bd90-ec46402b4812 00:20:33.054 [2024-07-23 08:33:45.377685] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:20:33.054 [2024-07-23 08:33:45.377726] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000037280 00:20:33.054 [2024-07-23 08:33:45.377737] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:20:33.054 [2024-07-23 08:33:45.377964] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c3a0 00:20:33.054 [2024-07-23 08:33:45.378135] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000037280 
00:20:33.054 [2024-07-23 08:33:45.378144] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000037280 00:20:33.054 [2024-07-23 08:33:45.378399] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:33.054 NewBaseBdev 00:20:33.054 08:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:20:33.054 08:33:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:20:33.054 08:33:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:33.054 08:33:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:33.054 08:33:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:33.054 08:33:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:33.054 08:33:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:33.312 08:33:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:20:33.312 [ 00:20:33.312 { 00:20:33.312 "name": "NewBaseBdev", 00:20:33.312 "aliases": [ 00:20:33.312 "02046ca7-c474-4b4c-bd90-ec46402b4812" 00:20:33.312 ], 00:20:33.312 "product_name": "Malloc disk", 00:20:33.312 "block_size": 512, 00:20:33.312 "num_blocks": 65536, 00:20:33.312 "uuid": "02046ca7-c474-4b4c-bd90-ec46402b4812", 00:20:33.312 "assigned_rate_limits": { 00:20:33.312 "rw_ios_per_sec": 0, 00:20:33.312 "rw_mbytes_per_sec": 0, 00:20:33.312 "r_mbytes_per_sec": 0, 00:20:33.312 "w_mbytes_per_sec": 0 00:20:33.312 }, 00:20:33.312 "claimed": true, 00:20:33.312 "claim_type": 
"exclusive_write", 00:20:33.312 "zoned": false, 00:20:33.312 "supported_io_types": { 00:20:33.312 "read": true, 00:20:33.312 "write": true, 00:20:33.312 "unmap": true, 00:20:33.312 "flush": true, 00:20:33.312 "reset": true, 00:20:33.312 "nvme_admin": false, 00:20:33.312 "nvme_io": false, 00:20:33.312 "nvme_io_md": false, 00:20:33.312 "write_zeroes": true, 00:20:33.312 "zcopy": true, 00:20:33.312 "get_zone_info": false, 00:20:33.312 "zone_management": false, 00:20:33.312 "zone_append": false, 00:20:33.312 "compare": false, 00:20:33.312 "compare_and_write": false, 00:20:33.312 "abort": true, 00:20:33.312 "seek_hole": false, 00:20:33.312 "seek_data": false, 00:20:33.312 "copy": true, 00:20:33.312 "nvme_iov_md": false 00:20:33.312 }, 00:20:33.312 "memory_domains": [ 00:20:33.313 { 00:20:33.313 "dma_device_id": "system", 00:20:33.313 "dma_device_type": 1 00:20:33.313 }, 00:20:33.313 { 00:20:33.313 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:33.313 "dma_device_type": 2 00:20:33.313 } 00:20:33.313 ], 00:20:33.313 "driver_specific": {} 00:20:33.313 } 00:20:33.313 ] 00:20:33.313 08:33:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:33.313 08:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:20:33.313 08:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:33.313 08:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:33.313 08:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:33.313 08:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:33.313 08:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:33.313 08:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:20:33.313 08:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:33.313 08:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:33.313 08:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:33.313 08:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:33.313 08:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:33.572 08:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:33.572 "name": "Existed_Raid", 00:20:33.572 "uuid": "e3f0a21e-317f-4abd-8d0c-d75e93f3b2f0", 00:20:33.572 "strip_size_kb": 0, 00:20:33.572 "state": "online", 00:20:33.572 "raid_level": "raid1", 00:20:33.572 "superblock": false, 00:20:33.572 "num_base_bdevs": 4, 00:20:33.572 "num_base_bdevs_discovered": 4, 00:20:33.572 "num_base_bdevs_operational": 4, 00:20:33.572 "base_bdevs_list": [ 00:20:33.572 { 00:20:33.572 "name": "NewBaseBdev", 00:20:33.572 "uuid": "02046ca7-c474-4b4c-bd90-ec46402b4812", 00:20:33.572 "is_configured": true, 00:20:33.572 "data_offset": 0, 00:20:33.572 "data_size": 65536 00:20:33.572 }, 00:20:33.572 { 00:20:33.572 "name": "BaseBdev2", 00:20:33.572 "uuid": "41c7140a-e76b-4a48-b482-1e45d356a737", 00:20:33.572 "is_configured": true, 00:20:33.572 "data_offset": 0, 00:20:33.572 "data_size": 65536 00:20:33.572 }, 00:20:33.572 { 00:20:33.572 "name": "BaseBdev3", 00:20:33.572 "uuid": "e9e592ae-b14e-4d17-8c62-cc746d74d485", 00:20:33.572 "is_configured": true, 00:20:33.572 "data_offset": 0, 00:20:33.572 "data_size": 65536 00:20:33.572 }, 00:20:33.572 { 00:20:33.572 "name": "BaseBdev4", 00:20:33.572 "uuid": "bbc18772-11f8-4173-a86f-a7df0819714e", 00:20:33.572 "is_configured": true, 
00:20:33.572 "data_offset": 0, 00:20:33.572 "data_size": 65536 00:20:33.572 } 00:20:33.572 ] 00:20:33.572 }' 00:20:33.572 08:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:33.572 08:33:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:34.138 08:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:20:34.138 08:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:34.138 08:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:34.139 08:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:34.139 08:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:34.139 08:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:34.139 08:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:34.139 08:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:34.139 [2024-07-23 08:33:46.549102] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:34.139 08:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:34.139 "name": "Existed_Raid", 00:20:34.139 "aliases": [ 00:20:34.139 "e3f0a21e-317f-4abd-8d0c-d75e93f3b2f0" 00:20:34.139 ], 00:20:34.139 "product_name": "Raid Volume", 00:20:34.139 "block_size": 512, 00:20:34.139 "num_blocks": 65536, 00:20:34.139 "uuid": "e3f0a21e-317f-4abd-8d0c-d75e93f3b2f0", 00:20:34.139 "assigned_rate_limits": { 00:20:34.139 "rw_ios_per_sec": 0, 00:20:34.139 "rw_mbytes_per_sec": 0, 00:20:34.139 "r_mbytes_per_sec": 0, 00:20:34.139 "w_mbytes_per_sec": 0 
00:20:34.139 }, 00:20:34.139 "claimed": false, 00:20:34.139 "zoned": false, 00:20:34.139 "supported_io_types": { 00:20:34.139 "read": true, 00:20:34.139 "write": true, 00:20:34.139 "unmap": false, 00:20:34.139 "flush": false, 00:20:34.139 "reset": true, 00:20:34.139 "nvme_admin": false, 00:20:34.139 "nvme_io": false, 00:20:34.139 "nvme_io_md": false, 00:20:34.139 "write_zeroes": true, 00:20:34.139 "zcopy": false, 00:20:34.139 "get_zone_info": false, 00:20:34.139 "zone_management": false, 00:20:34.139 "zone_append": false, 00:20:34.139 "compare": false, 00:20:34.139 "compare_and_write": false, 00:20:34.139 "abort": false, 00:20:34.139 "seek_hole": false, 00:20:34.139 "seek_data": false, 00:20:34.139 "copy": false, 00:20:34.139 "nvme_iov_md": false 00:20:34.139 }, 00:20:34.139 "memory_domains": [ 00:20:34.139 { 00:20:34.139 "dma_device_id": "system", 00:20:34.139 "dma_device_type": 1 00:20:34.139 }, 00:20:34.139 { 00:20:34.139 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:34.139 "dma_device_type": 2 00:20:34.139 }, 00:20:34.139 { 00:20:34.139 "dma_device_id": "system", 00:20:34.139 "dma_device_type": 1 00:20:34.139 }, 00:20:34.139 { 00:20:34.139 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:34.139 "dma_device_type": 2 00:20:34.139 }, 00:20:34.139 { 00:20:34.139 "dma_device_id": "system", 00:20:34.139 "dma_device_type": 1 00:20:34.139 }, 00:20:34.139 { 00:20:34.139 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:34.139 "dma_device_type": 2 00:20:34.139 }, 00:20:34.139 { 00:20:34.139 "dma_device_id": "system", 00:20:34.139 "dma_device_type": 1 00:20:34.139 }, 00:20:34.139 { 00:20:34.139 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:34.139 "dma_device_type": 2 00:20:34.139 } 00:20:34.139 ], 00:20:34.139 "driver_specific": { 00:20:34.139 "raid": { 00:20:34.139 "uuid": "e3f0a21e-317f-4abd-8d0c-d75e93f3b2f0", 00:20:34.139 "strip_size_kb": 0, 00:20:34.139 "state": "online", 00:20:34.139 "raid_level": "raid1", 00:20:34.139 "superblock": false, 00:20:34.139 
"num_base_bdevs": 4, 00:20:34.139 "num_base_bdevs_discovered": 4, 00:20:34.139 "num_base_bdevs_operational": 4, 00:20:34.139 "base_bdevs_list": [ 00:20:34.139 { 00:20:34.139 "name": "NewBaseBdev", 00:20:34.139 "uuid": "02046ca7-c474-4b4c-bd90-ec46402b4812", 00:20:34.139 "is_configured": true, 00:20:34.139 "data_offset": 0, 00:20:34.139 "data_size": 65536 00:20:34.139 }, 00:20:34.139 { 00:20:34.139 "name": "BaseBdev2", 00:20:34.139 "uuid": "41c7140a-e76b-4a48-b482-1e45d356a737", 00:20:34.139 "is_configured": true, 00:20:34.139 "data_offset": 0, 00:20:34.139 "data_size": 65536 00:20:34.139 }, 00:20:34.139 { 00:20:34.139 "name": "BaseBdev3", 00:20:34.139 "uuid": "e9e592ae-b14e-4d17-8c62-cc746d74d485", 00:20:34.139 "is_configured": true, 00:20:34.139 "data_offset": 0, 00:20:34.139 "data_size": 65536 00:20:34.139 }, 00:20:34.139 { 00:20:34.139 "name": "BaseBdev4", 00:20:34.139 "uuid": "bbc18772-11f8-4173-a86f-a7df0819714e", 00:20:34.139 "is_configured": true, 00:20:34.139 "data_offset": 0, 00:20:34.139 "data_size": 65536 00:20:34.139 } 00:20:34.139 ] 00:20:34.139 } 00:20:34.139 } 00:20:34.139 }' 00:20:34.139 08:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:34.139 08:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:20:34.139 BaseBdev2 00:20:34.139 BaseBdev3 00:20:34.139 BaseBdev4' 00:20:34.139 08:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:34.139 08:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:20:34.139 08:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:34.397 08:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 
00:20:34.397 "name": "NewBaseBdev", 00:20:34.397 "aliases": [ 00:20:34.397 "02046ca7-c474-4b4c-bd90-ec46402b4812" 00:20:34.397 ], 00:20:34.397 "product_name": "Malloc disk", 00:20:34.397 "block_size": 512, 00:20:34.397 "num_blocks": 65536, 00:20:34.397 "uuid": "02046ca7-c474-4b4c-bd90-ec46402b4812", 00:20:34.397 "assigned_rate_limits": { 00:20:34.397 "rw_ios_per_sec": 0, 00:20:34.397 "rw_mbytes_per_sec": 0, 00:20:34.397 "r_mbytes_per_sec": 0, 00:20:34.397 "w_mbytes_per_sec": 0 00:20:34.397 }, 00:20:34.397 "claimed": true, 00:20:34.397 "claim_type": "exclusive_write", 00:20:34.397 "zoned": false, 00:20:34.397 "supported_io_types": { 00:20:34.397 "read": true, 00:20:34.397 "write": true, 00:20:34.397 "unmap": true, 00:20:34.397 "flush": true, 00:20:34.397 "reset": true, 00:20:34.397 "nvme_admin": false, 00:20:34.397 "nvme_io": false, 00:20:34.397 "nvme_io_md": false, 00:20:34.397 "write_zeroes": true, 00:20:34.397 "zcopy": true, 00:20:34.397 "get_zone_info": false, 00:20:34.397 "zone_management": false, 00:20:34.397 "zone_append": false, 00:20:34.397 "compare": false, 00:20:34.397 "compare_and_write": false, 00:20:34.397 "abort": true, 00:20:34.397 "seek_hole": false, 00:20:34.397 "seek_data": false, 00:20:34.397 "copy": true, 00:20:34.397 "nvme_iov_md": false 00:20:34.397 }, 00:20:34.397 "memory_domains": [ 00:20:34.397 { 00:20:34.397 "dma_device_id": "system", 00:20:34.397 "dma_device_type": 1 00:20:34.397 }, 00:20:34.397 { 00:20:34.397 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:34.397 "dma_device_type": 2 00:20:34.397 } 00:20:34.397 ], 00:20:34.397 "driver_specific": {} 00:20:34.397 }' 00:20:34.397 08:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:34.397 08:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:34.397 08:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:34.397 08:33:46 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:34.397 08:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:34.656 08:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:34.656 08:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:34.656 08:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:34.656 08:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:34.656 08:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:34.656 08:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:34.656 08:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:34.656 08:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:34.656 08:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:34.656 08:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:34.914 08:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:34.914 "name": "BaseBdev2", 00:20:34.914 "aliases": [ 00:20:34.914 "41c7140a-e76b-4a48-b482-1e45d356a737" 00:20:34.914 ], 00:20:34.914 "product_name": "Malloc disk", 00:20:34.914 "block_size": 512, 00:20:34.914 "num_blocks": 65536, 00:20:34.914 "uuid": "41c7140a-e76b-4a48-b482-1e45d356a737", 00:20:34.914 "assigned_rate_limits": { 00:20:34.914 "rw_ios_per_sec": 0, 00:20:34.914 "rw_mbytes_per_sec": 0, 00:20:34.914 "r_mbytes_per_sec": 0, 00:20:34.914 "w_mbytes_per_sec": 0 00:20:34.914 }, 00:20:34.914 "claimed": true, 00:20:34.914 "claim_type": "exclusive_write", 00:20:34.914 "zoned": false, 
00:20:34.914 "supported_io_types": { 00:20:34.914 "read": true, 00:20:34.914 "write": true, 00:20:34.914 "unmap": true, 00:20:34.914 "flush": true, 00:20:34.914 "reset": true, 00:20:34.914 "nvme_admin": false, 00:20:34.914 "nvme_io": false, 00:20:34.914 "nvme_io_md": false, 00:20:34.914 "write_zeroes": true, 00:20:34.914 "zcopy": true, 00:20:34.914 "get_zone_info": false, 00:20:34.914 "zone_management": false, 00:20:34.914 "zone_append": false, 00:20:34.914 "compare": false, 00:20:34.914 "compare_and_write": false, 00:20:34.914 "abort": true, 00:20:34.914 "seek_hole": false, 00:20:34.914 "seek_data": false, 00:20:34.914 "copy": true, 00:20:34.914 "nvme_iov_md": false 00:20:34.914 }, 00:20:34.914 "memory_domains": [ 00:20:34.914 { 00:20:34.914 "dma_device_id": "system", 00:20:34.914 "dma_device_type": 1 00:20:34.914 }, 00:20:34.914 { 00:20:34.914 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:34.914 "dma_device_type": 2 00:20:34.914 } 00:20:34.914 ], 00:20:34.914 "driver_specific": {} 00:20:34.914 }' 00:20:34.914 08:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:34.914 08:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:34.914 08:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:34.914 08:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:34.914 08:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:35.172 08:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:35.172 08:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:35.172 08:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:35.172 08:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:35.172 08:33:47 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:35.172 08:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:35.172 08:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:35.172 08:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:35.172 08:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:35.172 08:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:35.430 08:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:35.430 "name": "BaseBdev3", 00:20:35.430 "aliases": [ 00:20:35.430 "e9e592ae-b14e-4d17-8c62-cc746d74d485" 00:20:35.430 ], 00:20:35.430 "product_name": "Malloc disk", 00:20:35.430 "block_size": 512, 00:20:35.430 "num_blocks": 65536, 00:20:35.430 "uuid": "e9e592ae-b14e-4d17-8c62-cc746d74d485", 00:20:35.430 "assigned_rate_limits": { 00:20:35.430 "rw_ios_per_sec": 0, 00:20:35.430 "rw_mbytes_per_sec": 0, 00:20:35.430 "r_mbytes_per_sec": 0, 00:20:35.430 "w_mbytes_per_sec": 0 00:20:35.430 }, 00:20:35.430 "claimed": true, 00:20:35.430 "claim_type": "exclusive_write", 00:20:35.430 "zoned": false, 00:20:35.430 "supported_io_types": { 00:20:35.430 "read": true, 00:20:35.430 "write": true, 00:20:35.430 "unmap": true, 00:20:35.430 "flush": true, 00:20:35.430 "reset": true, 00:20:35.430 "nvme_admin": false, 00:20:35.430 "nvme_io": false, 00:20:35.430 "nvme_io_md": false, 00:20:35.430 "write_zeroes": true, 00:20:35.430 "zcopy": true, 00:20:35.430 "get_zone_info": false, 00:20:35.430 "zone_management": false, 00:20:35.430 "zone_append": false, 00:20:35.430 "compare": false, 00:20:35.430 "compare_and_write": false, 00:20:35.430 "abort": true, 00:20:35.430 "seek_hole": false, 
00:20:35.431 "seek_data": false, 00:20:35.431 "copy": true, 00:20:35.431 "nvme_iov_md": false 00:20:35.431 }, 00:20:35.431 "memory_domains": [ 00:20:35.431 { 00:20:35.431 "dma_device_id": "system", 00:20:35.431 "dma_device_type": 1 00:20:35.431 }, 00:20:35.431 { 00:20:35.431 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:35.431 "dma_device_type": 2 00:20:35.431 } 00:20:35.431 ], 00:20:35.431 "driver_specific": {} 00:20:35.431 }' 00:20:35.431 08:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:35.431 08:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:35.431 08:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:35.431 08:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:35.431 08:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:35.689 08:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:35.689 08:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:35.689 08:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:35.689 08:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:35.689 08:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:35.689 08:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:35.689 08:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:35.689 08:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:35.689 08:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 
00:20:35.689 08:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:35.947 08:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:35.947 "name": "BaseBdev4", 00:20:35.947 "aliases": [ 00:20:35.947 "bbc18772-11f8-4173-a86f-a7df0819714e" 00:20:35.947 ], 00:20:35.947 "product_name": "Malloc disk", 00:20:35.947 "block_size": 512, 00:20:35.947 "num_blocks": 65536, 00:20:35.947 "uuid": "bbc18772-11f8-4173-a86f-a7df0819714e", 00:20:35.947 "assigned_rate_limits": { 00:20:35.947 "rw_ios_per_sec": 0, 00:20:35.947 "rw_mbytes_per_sec": 0, 00:20:35.947 "r_mbytes_per_sec": 0, 00:20:35.947 "w_mbytes_per_sec": 0 00:20:35.947 }, 00:20:35.947 "claimed": true, 00:20:35.947 "claim_type": "exclusive_write", 00:20:35.947 "zoned": false, 00:20:35.947 "supported_io_types": { 00:20:35.947 "read": true, 00:20:35.947 "write": true, 00:20:35.947 "unmap": true, 00:20:35.947 "flush": true, 00:20:35.947 "reset": true, 00:20:35.947 "nvme_admin": false, 00:20:35.947 "nvme_io": false, 00:20:35.947 "nvme_io_md": false, 00:20:35.947 "write_zeroes": true, 00:20:35.947 "zcopy": true, 00:20:35.947 "get_zone_info": false, 00:20:35.947 "zone_management": false, 00:20:35.947 "zone_append": false, 00:20:35.947 "compare": false, 00:20:35.947 "compare_and_write": false, 00:20:35.947 "abort": true, 00:20:35.947 "seek_hole": false, 00:20:35.947 "seek_data": false, 00:20:35.947 "copy": true, 00:20:35.947 "nvme_iov_md": false 00:20:35.947 }, 00:20:35.947 "memory_domains": [ 00:20:35.947 { 00:20:35.947 "dma_device_id": "system", 00:20:35.947 "dma_device_type": 1 00:20:35.947 }, 00:20:35.947 { 00:20:35.947 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:35.947 "dma_device_type": 2 00:20:35.947 } 00:20:35.947 ], 00:20:35.947 "driver_specific": {} 00:20:35.947 }' 00:20:35.947 08:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:35.947 08:33:48 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:35.947 08:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:35.948 08:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:35.948 08:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:35.948 08:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:35.948 08:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:35.948 08:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:36.206 08:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:36.206 08:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:36.207 08:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:36.207 08:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:36.207 08:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:36.207 [2024-07-23 08:33:48.706653] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:36.207 [2024-07-23 08:33:48.706679] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:36.207 [2024-07-23 08:33:48.706751] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:36.207 [2024-07-23 08:33:48.707023] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:36.207 [2024-07-23 08:33:48.707037] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000037280 name Existed_Raid, state offline 00:20:36.207 08:33:48 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1504427 00:20:36.207 08:33:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1504427 ']' 00:20:36.207 08:33:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1504427 00:20:36.207 08:33:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:20:36.207 08:33:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:36.466 08:33:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1504427 00:20:36.466 08:33:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:36.466 08:33:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:36.466 08:33:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1504427' 00:20:36.466 killing process with pid 1504427 00:20:36.466 08:33:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1504427 00:20:36.466 [2024-07-23 08:33:48.761791] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:36.466 08:33:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1504427 00:20:36.725 [2024-07-23 08:33:49.082089] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:20:38.102 00:20:38.102 real 0m26.597s 00:20:38.102 user 0m47.610s 00:20:38.102 sys 0m3.940s 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:38.102 ************************************ 00:20:38.102 END TEST 
raid_state_function_test 00:20:38.102 ************************************ 00:20:38.102 08:33:50 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:38.102 08:33:50 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 4 true 00:20:38.102 08:33:50 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:38.102 08:33:50 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:38.102 08:33:50 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:38.102 ************************************ 00:20:38.102 START TEST raid_state_function_test_sb 00:20:38.102 ************************************ 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 true 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:20:38.102 08:33:50 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1509992 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1509992' 00:20:38.102 Process raid pid: 1509992 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1509992 /var/tmp/spdk-raid.sock 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1509992 ']' 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:38.102 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:38.102 08:33:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:38.102 [2024-07-23 08:33:50.494188] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:20:38.102 [2024-07-23 08:33:50.494277] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:20:38.102 [2024-07-23 08:33:50.620071] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:38.361 [2024-07-23 08:33:50.837480] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:38.620 [2024-07-23 08:33:51.112619] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:38.620 [2024-07-23 08:33:51.112647] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:38.879 08:33:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:38.879 08:33:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:20:38.879 08:33:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:39.147 [2024-07-23 08:33:51.424819] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:39.147 [2024-07-23 08:33:51.424860] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:39.147 [2024-07-23 08:33:51.424870] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:39.147 [2024-07-23 08:33:51.424880] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:39.147 [2024-07-23 08:33:51.424887] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:39.147 [2024-07-23 08:33:51.424896] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:20:39.147 [2024-07-23 08:33:51.424902] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:39.147 [2024-07-23 08:33:51.424911] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:39.147 08:33:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:39.147 08:33:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:39.147 08:33:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:39.147 08:33:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:39.147 08:33:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:39.147 08:33:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:39.147 08:33:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:39.147 08:33:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:39.147 08:33:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:39.147 08:33:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:39.147 08:33:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:39.147 08:33:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:39.147 08:33:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:39.147 "name": "Existed_Raid", 00:20:39.147 "uuid": "6fbd0f4e-49e0-4147-b541-7f427c42b9c8", 
00:20:39.147 "strip_size_kb": 0, 00:20:39.147 "state": "configuring", 00:20:39.147 "raid_level": "raid1", 00:20:39.147 "superblock": true, 00:20:39.147 "num_base_bdevs": 4, 00:20:39.147 "num_base_bdevs_discovered": 0, 00:20:39.147 "num_base_bdevs_operational": 4, 00:20:39.147 "base_bdevs_list": [ 00:20:39.147 { 00:20:39.147 "name": "BaseBdev1", 00:20:39.147 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:39.147 "is_configured": false, 00:20:39.147 "data_offset": 0, 00:20:39.147 "data_size": 0 00:20:39.147 }, 00:20:39.147 { 00:20:39.147 "name": "BaseBdev2", 00:20:39.147 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:39.147 "is_configured": false, 00:20:39.147 "data_offset": 0, 00:20:39.147 "data_size": 0 00:20:39.147 }, 00:20:39.147 { 00:20:39.147 "name": "BaseBdev3", 00:20:39.147 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:39.147 "is_configured": false, 00:20:39.147 "data_offset": 0, 00:20:39.147 "data_size": 0 00:20:39.147 }, 00:20:39.147 { 00:20:39.147 "name": "BaseBdev4", 00:20:39.147 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:39.147 "is_configured": false, 00:20:39.147 "data_offset": 0, 00:20:39.147 "data_size": 0 00:20:39.147 } 00:20:39.147 ] 00:20:39.147 }' 00:20:39.147 08:33:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:39.147 08:33:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:39.719 08:33:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:39.977 [2024-07-23 08:33:52.246865] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:39.977 [2024-07-23 08:33:52.246900] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034580 name Existed_Raid, state configuring 00:20:39.977 08:33:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:39.977 [2024-07-23 08:33:52.423363] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:39.977 [2024-07-23 08:33:52.423407] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:39.977 [2024-07-23 08:33:52.423415] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:39.977 [2024-07-23 08:33:52.423424] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:39.977 [2024-07-23 08:33:52.423430] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:39.977 [2024-07-23 08:33:52.423439] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:39.977 [2024-07-23 08:33:52.423445] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:39.977 [2024-07-23 08:33:52.423453] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:39.977 08:33:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:40.236 [2024-07-23 08:33:52.657703] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:40.236 BaseBdev1 00:20:40.236 08:33:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:20:40.236 08:33:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:20:40.236 08:33:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:40.236 08:33:52 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:40.236 08:33:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:40.236 08:33:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:40.236 08:33:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:40.540 08:33:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:40.540 [ 00:20:40.540 { 00:20:40.540 "name": "BaseBdev1", 00:20:40.540 "aliases": [ 00:20:40.540 "ced64e8f-2431-412a-9e6c-cc73f31959b6" 00:20:40.540 ], 00:20:40.540 "product_name": "Malloc disk", 00:20:40.540 "block_size": 512, 00:20:40.540 "num_blocks": 65536, 00:20:40.540 "uuid": "ced64e8f-2431-412a-9e6c-cc73f31959b6", 00:20:40.540 "assigned_rate_limits": { 00:20:40.540 "rw_ios_per_sec": 0, 00:20:40.540 "rw_mbytes_per_sec": 0, 00:20:40.540 "r_mbytes_per_sec": 0, 00:20:40.540 "w_mbytes_per_sec": 0 00:20:40.540 }, 00:20:40.540 "claimed": true, 00:20:40.540 "claim_type": "exclusive_write", 00:20:40.540 "zoned": false, 00:20:40.540 "supported_io_types": { 00:20:40.540 "read": true, 00:20:40.540 "write": true, 00:20:40.540 "unmap": true, 00:20:40.540 "flush": true, 00:20:40.540 "reset": true, 00:20:40.540 "nvme_admin": false, 00:20:40.540 "nvme_io": false, 00:20:40.540 "nvme_io_md": false, 00:20:40.540 "write_zeroes": true, 00:20:40.540 "zcopy": true, 00:20:40.540 "get_zone_info": false, 00:20:40.540 "zone_management": false, 00:20:40.540 "zone_append": false, 00:20:40.540 "compare": false, 00:20:40.540 "compare_and_write": false, 00:20:40.540 "abort": true, 00:20:40.540 "seek_hole": false, 00:20:40.540 "seek_data": false, 
00:20:40.540 "copy": true, 00:20:40.540 "nvme_iov_md": false 00:20:40.540 }, 00:20:40.540 "memory_domains": [ 00:20:40.540 { 00:20:40.540 "dma_device_id": "system", 00:20:40.540 "dma_device_type": 1 00:20:40.540 }, 00:20:40.540 { 00:20:40.540 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:40.540 "dma_device_type": 2 00:20:40.540 } 00:20:40.540 ], 00:20:40.540 "driver_specific": {} 00:20:40.540 } 00:20:40.540 ] 00:20:40.540 08:33:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:40.540 08:33:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:40.540 08:33:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:40.540 08:33:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:40.540 08:33:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:40.540 08:33:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:40.540 08:33:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:40.540 08:33:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:40.540 08:33:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:40.540 08:33:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:40.540 08:33:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:40.540 08:33:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:40.540 08:33:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:40.817 08:33:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:40.817 "name": "Existed_Raid", 00:20:40.817 "uuid": "cc59bc05-0b5e-4bc1-b663-9ae76b158d7d", 00:20:40.817 "strip_size_kb": 0, 00:20:40.817 "state": "configuring", 00:20:40.817 "raid_level": "raid1", 00:20:40.817 "superblock": true, 00:20:40.817 "num_base_bdevs": 4, 00:20:40.817 "num_base_bdevs_discovered": 1, 00:20:40.817 "num_base_bdevs_operational": 4, 00:20:40.817 "base_bdevs_list": [ 00:20:40.817 { 00:20:40.817 "name": "BaseBdev1", 00:20:40.817 "uuid": "ced64e8f-2431-412a-9e6c-cc73f31959b6", 00:20:40.817 "is_configured": true, 00:20:40.817 "data_offset": 2048, 00:20:40.817 "data_size": 63488 00:20:40.817 }, 00:20:40.817 { 00:20:40.817 "name": "BaseBdev2", 00:20:40.817 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:40.817 "is_configured": false, 00:20:40.817 "data_offset": 0, 00:20:40.817 "data_size": 0 00:20:40.817 }, 00:20:40.817 { 00:20:40.817 "name": "BaseBdev3", 00:20:40.817 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:40.817 "is_configured": false, 00:20:40.817 "data_offset": 0, 00:20:40.817 "data_size": 0 00:20:40.817 }, 00:20:40.817 { 00:20:40.817 "name": "BaseBdev4", 00:20:40.817 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:40.817 "is_configured": false, 00:20:40.817 "data_offset": 0, 00:20:40.817 "data_size": 0 00:20:40.817 } 00:20:40.817 ] 00:20:40.817 }' 00:20:40.817 08:33:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:40.817 08:33:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:41.385 08:33:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:41.385 [2024-07-23 08:33:53.836856] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: 
Existed_Raid 00:20:41.385 [2024-07-23 08:33:53.836911] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034880 name Existed_Raid, state configuring 00:20:41.385 08:33:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:41.643 [2024-07-23 08:33:54.025382] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:41.643 [2024-07-23 08:33:54.026979] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:41.643 [2024-07-23 08:33:54.027016] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:41.643 [2024-07-23 08:33:54.027026] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:41.643 [2024-07-23 08:33:54.027052] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:41.643 [2024-07-23 08:33:54.027059] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:41.643 [2024-07-23 08:33:54.027071] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:41.643 08:33:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:20:41.643 08:33:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:41.643 08:33:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:41.643 08:33:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:41.644 08:33:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:41.644 08:33:54 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:41.644 08:33:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:41.644 08:33:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:41.644 08:33:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:41.644 08:33:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:41.644 08:33:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:41.644 08:33:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:41.644 08:33:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:41.644 08:33:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:41.902 08:33:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:41.902 "name": "Existed_Raid", 00:20:41.902 "uuid": "b4de9b3e-9d12-402b-8a24-817bc9887ac4", 00:20:41.902 "strip_size_kb": 0, 00:20:41.902 "state": "configuring", 00:20:41.902 "raid_level": "raid1", 00:20:41.902 "superblock": true, 00:20:41.902 "num_base_bdevs": 4, 00:20:41.902 "num_base_bdevs_discovered": 1, 00:20:41.902 "num_base_bdevs_operational": 4, 00:20:41.902 "base_bdevs_list": [ 00:20:41.902 { 00:20:41.902 "name": "BaseBdev1", 00:20:41.902 "uuid": "ced64e8f-2431-412a-9e6c-cc73f31959b6", 00:20:41.902 "is_configured": true, 00:20:41.902 "data_offset": 2048, 00:20:41.902 "data_size": 63488 00:20:41.902 }, 00:20:41.902 { 00:20:41.903 "name": "BaseBdev2", 00:20:41.903 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:41.903 "is_configured": false, 
00:20:41.903 "data_offset": 0, 00:20:41.903 "data_size": 0 00:20:41.903 }, 00:20:41.903 { 00:20:41.903 "name": "BaseBdev3", 00:20:41.903 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:41.903 "is_configured": false, 00:20:41.903 "data_offset": 0, 00:20:41.903 "data_size": 0 00:20:41.903 }, 00:20:41.903 { 00:20:41.903 "name": "BaseBdev4", 00:20:41.903 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:41.903 "is_configured": false, 00:20:41.903 "data_offset": 0, 00:20:41.903 "data_size": 0 00:20:41.903 } 00:20:41.903 ] 00:20:41.903 }' 00:20:41.903 08:33:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:41.903 08:33:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:42.471 08:33:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:42.471 [2024-07-23 08:33:54.893695] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:42.471 BaseBdev2 00:20:42.471 08:33:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:20:42.471 08:33:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:42.471 08:33:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:42.471 08:33:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:42.471 08:33:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:42.471 08:33:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:42.471 08:33:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_wait_for_examine 00:20:42.730 08:33:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:42.730 [ 00:20:42.730 { 00:20:42.730 "name": "BaseBdev2", 00:20:42.730 "aliases": [ 00:20:42.730 "1a1bf42a-1864-4bd4-a58c-3c9f6c3eee6c" 00:20:42.730 ], 00:20:42.730 "product_name": "Malloc disk", 00:20:42.730 "block_size": 512, 00:20:42.730 "num_blocks": 65536, 00:20:42.730 "uuid": "1a1bf42a-1864-4bd4-a58c-3c9f6c3eee6c", 00:20:42.730 "assigned_rate_limits": { 00:20:42.730 "rw_ios_per_sec": 0, 00:20:42.730 "rw_mbytes_per_sec": 0, 00:20:42.730 "r_mbytes_per_sec": 0, 00:20:42.730 "w_mbytes_per_sec": 0 00:20:42.730 }, 00:20:42.730 "claimed": true, 00:20:42.730 "claim_type": "exclusive_write", 00:20:42.730 "zoned": false, 00:20:42.730 "supported_io_types": { 00:20:42.730 "read": true, 00:20:42.730 "write": true, 00:20:42.730 "unmap": true, 00:20:42.730 "flush": true, 00:20:42.730 "reset": true, 00:20:42.730 "nvme_admin": false, 00:20:42.730 "nvme_io": false, 00:20:42.730 "nvme_io_md": false, 00:20:42.730 "write_zeroes": true, 00:20:42.730 "zcopy": true, 00:20:42.730 "get_zone_info": false, 00:20:42.730 "zone_management": false, 00:20:42.730 "zone_append": false, 00:20:42.730 "compare": false, 00:20:42.730 "compare_and_write": false, 00:20:42.730 "abort": true, 00:20:42.730 "seek_hole": false, 00:20:42.730 "seek_data": false, 00:20:42.730 "copy": true, 00:20:42.730 "nvme_iov_md": false 00:20:42.730 }, 00:20:42.730 "memory_domains": [ 00:20:42.730 { 00:20:42.730 "dma_device_id": "system", 00:20:42.730 "dma_device_type": 1 00:20:42.730 }, 00:20:42.730 { 00:20:42.730 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:42.730 "dma_device_type": 2 00:20:42.730 } 00:20:42.730 ], 00:20:42.730 "driver_specific": {} 00:20:42.730 } 00:20:42.730 ] 00:20:42.989 08:33:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 
-- # return 0 00:20:42.989 08:33:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:42.989 08:33:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:42.989 08:33:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:42.989 08:33:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:42.989 08:33:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:42.989 08:33:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:42.989 08:33:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:42.989 08:33:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:42.989 08:33:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:42.989 08:33:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:42.989 08:33:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:42.989 08:33:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:42.989 08:33:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:42.989 08:33:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:42.989 08:33:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:42.989 "name": "Existed_Raid", 00:20:42.989 "uuid": "b4de9b3e-9d12-402b-8a24-817bc9887ac4", 00:20:42.989 "strip_size_kb": 0, 
00:20:42.989 "state": "configuring", 00:20:42.989 "raid_level": "raid1", 00:20:42.989 "superblock": true, 00:20:42.989 "num_base_bdevs": 4, 00:20:42.989 "num_base_bdevs_discovered": 2, 00:20:42.989 "num_base_bdevs_operational": 4, 00:20:42.989 "base_bdevs_list": [ 00:20:42.989 { 00:20:42.989 "name": "BaseBdev1", 00:20:42.989 "uuid": "ced64e8f-2431-412a-9e6c-cc73f31959b6", 00:20:42.989 "is_configured": true, 00:20:42.989 "data_offset": 2048, 00:20:42.989 "data_size": 63488 00:20:42.989 }, 00:20:42.989 { 00:20:42.989 "name": "BaseBdev2", 00:20:42.989 "uuid": "1a1bf42a-1864-4bd4-a58c-3c9f6c3eee6c", 00:20:42.989 "is_configured": true, 00:20:42.989 "data_offset": 2048, 00:20:42.989 "data_size": 63488 00:20:42.989 }, 00:20:42.989 { 00:20:42.989 "name": "BaseBdev3", 00:20:42.989 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:42.989 "is_configured": false, 00:20:42.989 "data_offset": 0, 00:20:42.989 "data_size": 0 00:20:42.989 }, 00:20:42.989 { 00:20:42.989 "name": "BaseBdev4", 00:20:42.989 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:42.989 "is_configured": false, 00:20:42.989 "data_offset": 0, 00:20:42.989 "data_size": 0 00:20:42.989 } 00:20:42.989 ] 00:20:42.989 }' 00:20:42.989 08:33:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:42.989 08:33:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:43.556 08:33:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:43.814 [2024-07-23 08:33:56.152717] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:43.814 BaseBdev3 00:20:43.814 08:33:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:20:43.814 08:33:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local 
bdev_name=BaseBdev3 00:20:43.814 08:33:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:43.814 08:33:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:43.814 08:33:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:43.814 08:33:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:43.814 08:33:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:44.073 08:33:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:44.073 [ 00:20:44.073 { 00:20:44.073 "name": "BaseBdev3", 00:20:44.073 "aliases": [ 00:20:44.073 "5f43f4b4-66c7-4bb5-af99-5e0b0f9176c5" 00:20:44.073 ], 00:20:44.073 "product_name": "Malloc disk", 00:20:44.073 "block_size": 512, 00:20:44.073 "num_blocks": 65536, 00:20:44.073 "uuid": "5f43f4b4-66c7-4bb5-af99-5e0b0f9176c5", 00:20:44.073 "assigned_rate_limits": { 00:20:44.073 "rw_ios_per_sec": 0, 00:20:44.073 "rw_mbytes_per_sec": 0, 00:20:44.073 "r_mbytes_per_sec": 0, 00:20:44.073 "w_mbytes_per_sec": 0 00:20:44.073 }, 00:20:44.073 "claimed": true, 00:20:44.073 "claim_type": "exclusive_write", 00:20:44.073 "zoned": false, 00:20:44.073 "supported_io_types": { 00:20:44.073 "read": true, 00:20:44.073 "write": true, 00:20:44.073 "unmap": true, 00:20:44.073 "flush": true, 00:20:44.073 "reset": true, 00:20:44.073 "nvme_admin": false, 00:20:44.073 "nvme_io": false, 00:20:44.073 "nvme_io_md": false, 00:20:44.073 "write_zeroes": true, 00:20:44.073 "zcopy": true, 00:20:44.073 "get_zone_info": false, 00:20:44.073 "zone_management": false, 00:20:44.073 "zone_append": false, 00:20:44.073 
"compare": false, 00:20:44.073 "compare_and_write": false, 00:20:44.073 "abort": true, 00:20:44.073 "seek_hole": false, 00:20:44.073 "seek_data": false, 00:20:44.073 "copy": true, 00:20:44.073 "nvme_iov_md": false 00:20:44.073 }, 00:20:44.073 "memory_domains": [ 00:20:44.073 { 00:20:44.073 "dma_device_id": "system", 00:20:44.073 "dma_device_type": 1 00:20:44.073 }, 00:20:44.073 { 00:20:44.073 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:44.073 "dma_device_type": 2 00:20:44.073 } 00:20:44.073 ], 00:20:44.073 "driver_specific": {} 00:20:44.073 } 00:20:44.073 ] 00:20:44.073 08:33:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:44.073 08:33:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:44.073 08:33:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:44.073 08:33:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:44.073 08:33:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:44.073 08:33:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:44.073 08:33:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:44.073 08:33:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:44.073 08:33:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:44.073 08:33:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:44.073 08:33:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:44.073 08:33:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:44.073 08:33:56 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:44.074 08:33:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:44.074 08:33:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:44.332 08:33:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:44.332 "name": "Existed_Raid", 00:20:44.332 "uuid": "b4de9b3e-9d12-402b-8a24-817bc9887ac4", 00:20:44.332 "strip_size_kb": 0, 00:20:44.332 "state": "configuring", 00:20:44.332 "raid_level": "raid1", 00:20:44.332 "superblock": true, 00:20:44.332 "num_base_bdevs": 4, 00:20:44.332 "num_base_bdevs_discovered": 3, 00:20:44.332 "num_base_bdevs_operational": 4, 00:20:44.332 "base_bdevs_list": [ 00:20:44.332 { 00:20:44.332 "name": "BaseBdev1", 00:20:44.332 "uuid": "ced64e8f-2431-412a-9e6c-cc73f31959b6", 00:20:44.332 "is_configured": true, 00:20:44.332 "data_offset": 2048, 00:20:44.332 "data_size": 63488 00:20:44.332 }, 00:20:44.332 { 00:20:44.332 "name": "BaseBdev2", 00:20:44.332 "uuid": "1a1bf42a-1864-4bd4-a58c-3c9f6c3eee6c", 00:20:44.332 "is_configured": true, 00:20:44.332 "data_offset": 2048, 00:20:44.332 "data_size": 63488 00:20:44.332 }, 00:20:44.332 { 00:20:44.332 "name": "BaseBdev3", 00:20:44.333 "uuid": "5f43f4b4-66c7-4bb5-af99-5e0b0f9176c5", 00:20:44.333 "is_configured": true, 00:20:44.333 "data_offset": 2048, 00:20:44.333 "data_size": 63488 00:20:44.333 }, 00:20:44.333 { 00:20:44.333 "name": "BaseBdev4", 00:20:44.333 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:44.333 "is_configured": false, 00:20:44.333 "data_offset": 0, 00:20:44.333 "data_size": 0 00:20:44.333 } 00:20:44.333 ] 00:20:44.333 }' 00:20:44.333 08:33:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:44.333 08:33:56 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:44.900 08:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:44.900 [2024-07-23 08:33:57.354091] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:44.900 [2024-07-23 08:33:57.354328] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000035180 00:20:44.900 [2024-07-23 08:33:57.354344] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:44.900 [2024-07-23 08:33:57.354596] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bf90 00:20:44.900 [2024-07-23 08:33:57.354788] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000035180 00:20:44.900 [2024-07-23 08:33:57.354800] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000035180 00:20:44.900 BaseBdev4 00:20:44.900 [2024-07-23 08:33:57.354967] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:44.900 08:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:20:44.900 08:33:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:20:44.900 08:33:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:44.900 08:33:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:44.900 08:33:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:44.900 08:33:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:44.900 08:33:57 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:45.161 08:33:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:45.420 [ 00:20:45.420 { 00:20:45.420 "name": "BaseBdev4", 00:20:45.420 "aliases": [ 00:20:45.420 "8685dcfd-5765-4ab5-a903-95b3219339f9" 00:20:45.420 ], 00:20:45.420 "product_name": "Malloc disk", 00:20:45.420 "block_size": 512, 00:20:45.420 "num_blocks": 65536, 00:20:45.420 "uuid": "8685dcfd-5765-4ab5-a903-95b3219339f9", 00:20:45.420 "assigned_rate_limits": { 00:20:45.420 "rw_ios_per_sec": 0, 00:20:45.420 "rw_mbytes_per_sec": 0, 00:20:45.420 "r_mbytes_per_sec": 0, 00:20:45.420 "w_mbytes_per_sec": 0 00:20:45.420 }, 00:20:45.420 "claimed": true, 00:20:45.420 "claim_type": "exclusive_write", 00:20:45.420 "zoned": false, 00:20:45.420 "supported_io_types": { 00:20:45.420 "read": true, 00:20:45.420 "write": true, 00:20:45.420 "unmap": true, 00:20:45.420 "flush": true, 00:20:45.420 "reset": true, 00:20:45.420 "nvme_admin": false, 00:20:45.420 "nvme_io": false, 00:20:45.420 "nvme_io_md": false, 00:20:45.420 "write_zeroes": true, 00:20:45.420 "zcopy": true, 00:20:45.420 "get_zone_info": false, 00:20:45.420 "zone_management": false, 00:20:45.420 "zone_append": false, 00:20:45.420 "compare": false, 00:20:45.420 "compare_and_write": false, 00:20:45.420 "abort": true, 00:20:45.420 "seek_hole": false, 00:20:45.420 "seek_data": false, 00:20:45.420 "copy": true, 00:20:45.420 "nvme_iov_md": false 00:20:45.420 }, 00:20:45.420 "memory_domains": [ 00:20:45.420 { 00:20:45.420 "dma_device_id": "system", 00:20:45.420 "dma_device_type": 1 00:20:45.420 }, 00:20:45.421 { 00:20:45.421 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:45.421 "dma_device_type": 2 00:20:45.421 } 00:20:45.421 ], 00:20:45.421 "driver_specific": 
{} 00:20:45.421 } 00:20:45.421 ] 00:20:45.421 08:33:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:45.421 08:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:45.421 08:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:45.421 08:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:20:45.421 08:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:45.421 08:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:45.421 08:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:45.421 08:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:45.421 08:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:45.421 08:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:45.421 08:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:45.421 08:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:45.421 08:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:45.421 08:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:45.421 08:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:45.421 08:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:45.421 "name": 
"Existed_Raid", 00:20:45.421 "uuid": "b4de9b3e-9d12-402b-8a24-817bc9887ac4", 00:20:45.421 "strip_size_kb": 0, 00:20:45.421 "state": "online", 00:20:45.421 "raid_level": "raid1", 00:20:45.421 "superblock": true, 00:20:45.421 "num_base_bdevs": 4, 00:20:45.421 "num_base_bdevs_discovered": 4, 00:20:45.421 "num_base_bdevs_operational": 4, 00:20:45.421 "base_bdevs_list": [ 00:20:45.421 { 00:20:45.421 "name": "BaseBdev1", 00:20:45.421 "uuid": "ced64e8f-2431-412a-9e6c-cc73f31959b6", 00:20:45.421 "is_configured": true, 00:20:45.421 "data_offset": 2048, 00:20:45.421 "data_size": 63488 00:20:45.421 }, 00:20:45.421 { 00:20:45.421 "name": "BaseBdev2", 00:20:45.421 "uuid": "1a1bf42a-1864-4bd4-a58c-3c9f6c3eee6c", 00:20:45.421 "is_configured": true, 00:20:45.421 "data_offset": 2048, 00:20:45.421 "data_size": 63488 00:20:45.421 }, 00:20:45.421 { 00:20:45.421 "name": "BaseBdev3", 00:20:45.421 "uuid": "5f43f4b4-66c7-4bb5-af99-5e0b0f9176c5", 00:20:45.421 "is_configured": true, 00:20:45.421 "data_offset": 2048, 00:20:45.421 "data_size": 63488 00:20:45.421 }, 00:20:45.421 { 00:20:45.421 "name": "BaseBdev4", 00:20:45.421 "uuid": "8685dcfd-5765-4ab5-a903-95b3219339f9", 00:20:45.421 "is_configured": true, 00:20:45.421 "data_offset": 2048, 00:20:45.421 "data_size": 63488 00:20:45.421 } 00:20:45.421 ] 00:20:45.421 }' 00:20:45.421 08:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:45.421 08:33:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:45.988 08:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:20:45.988 08:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:45.988 08:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:45.988 08:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 
00:20:45.988 08:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:45.988 08:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:20:45.988 08:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:45.988 08:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:45.988 [2024-07-23 08:33:58.505519] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:46.248 08:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:46.248 "name": "Existed_Raid", 00:20:46.248 "aliases": [ 00:20:46.248 "b4de9b3e-9d12-402b-8a24-817bc9887ac4" 00:20:46.248 ], 00:20:46.248 "product_name": "Raid Volume", 00:20:46.248 "block_size": 512, 00:20:46.248 "num_blocks": 63488, 00:20:46.248 "uuid": "b4de9b3e-9d12-402b-8a24-817bc9887ac4", 00:20:46.248 "assigned_rate_limits": { 00:20:46.248 "rw_ios_per_sec": 0, 00:20:46.248 "rw_mbytes_per_sec": 0, 00:20:46.248 "r_mbytes_per_sec": 0, 00:20:46.248 "w_mbytes_per_sec": 0 00:20:46.248 }, 00:20:46.248 "claimed": false, 00:20:46.248 "zoned": false, 00:20:46.248 "supported_io_types": { 00:20:46.248 "read": true, 00:20:46.248 "write": true, 00:20:46.248 "unmap": false, 00:20:46.248 "flush": false, 00:20:46.248 "reset": true, 00:20:46.248 "nvme_admin": false, 00:20:46.248 "nvme_io": false, 00:20:46.248 "nvme_io_md": false, 00:20:46.248 "write_zeroes": true, 00:20:46.248 "zcopy": false, 00:20:46.248 "get_zone_info": false, 00:20:46.248 "zone_management": false, 00:20:46.248 "zone_append": false, 00:20:46.248 "compare": false, 00:20:46.248 "compare_and_write": false, 00:20:46.248 "abort": false, 00:20:46.248 "seek_hole": false, 00:20:46.248 "seek_data": false, 00:20:46.248 "copy": false, 00:20:46.248 "nvme_iov_md": false 
00:20:46.248 }, 00:20:46.248 "memory_domains": [ 00:20:46.248 { 00:20:46.248 "dma_device_id": "system", 00:20:46.248 "dma_device_type": 1 00:20:46.248 }, 00:20:46.248 { 00:20:46.248 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:46.248 "dma_device_type": 2 00:20:46.248 }, 00:20:46.248 { 00:20:46.248 "dma_device_id": "system", 00:20:46.248 "dma_device_type": 1 00:20:46.248 }, 00:20:46.248 { 00:20:46.248 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:46.248 "dma_device_type": 2 00:20:46.248 }, 00:20:46.248 { 00:20:46.248 "dma_device_id": "system", 00:20:46.248 "dma_device_type": 1 00:20:46.248 }, 00:20:46.248 { 00:20:46.248 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:46.248 "dma_device_type": 2 00:20:46.248 }, 00:20:46.248 { 00:20:46.248 "dma_device_id": "system", 00:20:46.248 "dma_device_type": 1 00:20:46.248 }, 00:20:46.248 { 00:20:46.248 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:46.248 "dma_device_type": 2 00:20:46.248 } 00:20:46.248 ], 00:20:46.248 "driver_specific": { 00:20:46.248 "raid": { 00:20:46.248 "uuid": "b4de9b3e-9d12-402b-8a24-817bc9887ac4", 00:20:46.248 "strip_size_kb": 0, 00:20:46.248 "state": "online", 00:20:46.248 "raid_level": "raid1", 00:20:46.248 "superblock": true, 00:20:46.248 "num_base_bdevs": 4, 00:20:46.248 "num_base_bdevs_discovered": 4, 00:20:46.248 "num_base_bdevs_operational": 4, 00:20:46.248 "base_bdevs_list": [ 00:20:46.248 { 00:20:46.248 "name": "BaseBdev1", 00:20:46.248 "uuid": "ced64e8f-2431-412a-9e6c-cc73f31959b6", 00:20:46.248 "is_configured": true, 00:20:46.248 "data_offset": 2048, 00:20:46.248 "data_size": 63488 00:20:46.248 }, 00:20:46.248 { 00:20:46.248 "name": "BaseBdev2", 00:20:46.248 "uuid": "1a1bf42a-1864-4bd4-a58c-3c9f6c3eee6c", 00:20:46.248 "is_configured": true, 00:20:46.248 "data_offset": 2048, 00:20:46.248 "data_size": 63488 00:20:46.248 }, 00:20:46.248 { 00:20:46.248 "name": "BaseBdev3", 00:20:46.248 "uuid": "5f43f4b4-66c7-4bb5-af99-5e0b0f9176c5", 00:20:46.248 "is_configured": true, 00:20:46.248 
"data_offset": 2048, 00:20:46.248 "data_size": 63488 00:20:46.248 }, 00:20:46.248 { 00:20:46.248 "name": "BaseBdev4", 00:20:46.248 "uuid": "8685dcfd-5765-4ab5-a903-95b3219339f9", 00:20:46.248 "is_configured": true, 00:20:46.248 "data_offset": 2048, 00:20:46.248 "data_size": 63488 00:20:46.248 } 00:20:46.248 ] 00:20:46.248 } 00:20:46.248 } 00:20:46.248 }' 00:20:46.248 08:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:46.248 08:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:20:46.248 BaseBdev2 00:20:46.248 BaseBdev3 00:20:46.248 BaseBdev4' 00:20:46.248 08:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:46.248 08:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:20:46.248 08:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:46.248 08:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:46.248 "name": "BaseBdev1", 00:20:46.248 "aliases": [ 00:20:46.248 "ced64e8f-2431-412a-9e6c-cc73f31959b6" 00:20:46.248 ], 00:20:46.248 "product_name": "Malloc disk", 00:20:46.248 "block_size": 512, 00:20:46.248 "num_blocks": 65536, 00:20:46.248 "uuid": "ced64e8f-2431-412a-9e6c-cc73f31959b6", 00:20:46.248 "assigned_rate_limits": { 00:20:46.248 "rw_ios_per_sec": 0, 00:20:46.248 "rw_mbytes_per_sec": 0, 00:20:46.248 "r_mbytes_per_sec": 0, 00:20:46.248 "w_mbytes_per_sec": 0 00:20:46.248 }, 00:20:46.248 "claimed": true, 00:20:46.248 "claim_type": "exclusive_write", 00:20:46.248 "zoned": false, 00:20:46.248 "supported_io_types": { 00:20:46.248 "read": true, 00:20:46.248 "write": true, 00:20:46.248 "unmap": true, 00:20:46.248 "flush": 
true, 00:20:46.248 "reset": true, 00:20:46.248 "nvme_admin": false, 00:20:46.248 "nvme_io": false, 00:20:46.248 "nvme_io_md": false, 00:20:46.248 "write_zeroes": true, 00:20:46.248 "zcopy": true, 00:20:46.248 "get_zone_info": false, 00:20:46.248 "zone_management": false, 00:20:46.248 "zone_append": false, 00:20:46.248 "compare": false, 00:20:46.248 "compare_and_write": false, 00:20:46.248 "abort": true, 00:20:46.248 "seek_hole": false, 00:20:46.248 "seek_data": false, 00:20:46.248 "copy": true, 00:20:46.248 "nvme_iov_md": false 00:20:46.248 }, 00:20:46.248 "memory_domains": [ 00:20:46.248 { 00:20:46.248 "dma_device_id": "system", 00:20:46.248 "dma_device_type": 1 00:20:46.248 }, 00:20:46.248 { 00:20:46.248 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:46.248 "dma_device_type": 2 00:20:46.248 } 00:20:46.248 ], 00:20:46.248 "driver_specific": {} 00:20:46.248 }' 00:20:46.248 08:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:46.507 08:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:46.507 08:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:46.507 08:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:46.507 08:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:46.507 08:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:46.507 08:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:46.507 08:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:46.507 08:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:46.507 08:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:46.766 08:33:59 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:46.766 08:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:46.766 08:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:46.766 08:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:46.766 08:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:46.766 08:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:46.766 "name": "BaseBdev2", 00:20:46.766 "aliases": [ 00:20:46.766 "1a1bf42a-1864-4bd4-a58c-3c9f6c3eee6c" 00:20:46.766 ], 00:20:46.766 "product_name": "Malloc disk", 00:20:46.766 "block_size": 512, 00:20:46.766 "num_blocks": 65536, 00:20:46.766 "uuid": "1a1bf42a-1864-4bd4-a58c-3c9f6c3eee6c", 00:20:46.766 "assigned_rate_limits": { 00:20:46.766 "rw_ios_per_sec": 0, 00:20:46.766 "rw_mbytes_per_sec": 0, 00:20:46.766 "r_mbytes_per_sec": 0, 00:20:46.766 "w_mbytes_per_sec": 0 00:20:46.766 }, 00:20:46.766 "claimed": true, 00:20:46.766 "claim_type": "exclusive_write", 00:20:46.766 "zoned": false, 00:20:46.766 "supported_io_types": { 00:20:46.766 "read": true, 00:20:46.766 "write": true, 00:20:46.766 "unmap": true, 00:20:46.766 "flush": true, 00:20:46.766 "reset": true, 00:20:46.766 "nvme_admin": false, 00:20:46.766 "nvme_io": false, 00:20:46.766 "nvme_io_md": false, 00:20:46.766 "write_zeroes": true, 00:20:46.766 "zcopy": true, 00:20:46.766 "get_zone_info": false, 00:20:46.766 "zone_management": false, 00:20:46.766 "zone_append": false, 00:20:46.766 "compare": false, 00:20:46.766 "compare_and_write": false, 00:20:46.766 "abort": true, 00:20:46.766 "seek_hole": false, 00:20:46.766 "seek_data": false, 00:20:46.766 "copy": true, 00:20:46.766 "nvme_iov_md": false 00:20:46.766 }, 00:20:46.766 
"memory_domains": [ 00:20:46.766 { 00:20:46.766 "dma_device_id": "system", 00:20:46.766 "dma_device_type": 1 00:20:46.767 }, 00:20:46.767 { 00:20:46.767 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:46.767 "dma_device_type": 2 00:20:46.767 } 00:20:46.767 ], 00:20:46.767 "driver_specific": {} 00:20:46.767 }' 00:20:46.767 08:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:46.767 08:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:47.025 08:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:47.025 08:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:47.025 08:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:47.025 08:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:47.025 08:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:47.025 08:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:47.025 08:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:47.025 08:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:47.025 08:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:47.283 08:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:47.283 08:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:47.283 08:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:47.283 08:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # 
jq '.[]' 00:20:47.283 08:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:47.283 "name": "BaseBdev3", 00:20:47.283 "aliases": [ 00:20:47.283 "5f43f4b4-66c7-4bb5-af99-5e0b0f9176c5" 00:20:47.283 ], 00:20:47.283 "product_name": "Malloc disk", 00:20:47.283 "block_size": 512, 00:20:47.283 "num_blocks": 65536, 00:20:47.283 "uuid": "5f43f4b4-66c7-4bb5-af99-5e0b0f9176c5", 00:20:47.283 "assigned_rate_limits": { 00:20:47.283 "rw_ios_per_sec": 0, 00:20:47.283 "rw_mbytes_per_sec": 0, 00:20:47.283 "r_mbytes_per_sec": 0, 00:20:47.283 "w_mbytes_per_sec": 0 00:20:47.283 }, 00:20:47.283 "claimed": true, 00:20:47.283 "claim_type": "exclusive_write", 00:20:47.283 "zoned": false, 00:20:47.283 "supported_io_types": { 00:20:47.283 "read": true, 00:20:47.283 "write": true, 00:20:47.283 "unmap": true, 00:20:47.283 "flush": true, 00:20:47.283 "reset": true, 00:20:47.283 "nvme_admin": false, 00:20:47.283 "nvme_io": false, 00:20:47.283 "nvme_io_md": false, 00:20:47.283 "write_zeroes": true, 00:20:47.283 "zcopy": true, 00:20:47.283 "get_zone_info": false, 00:20:47.283 "zone_management": false, 00:20:47.283 "zone_append": false, 00:20:47.283 "compare": false, 00:20:47.283 "compare_and_write": false, 00:20:47.283 "abort": true, 00:20:47.283 "seek_hole": false, 00:20:47.283 "seek_data": false, 00:20:47.283 "copy": true, 00:20:47.283 "nvme_iov_md": false 00:20:47.283 }, 00:20:47.283 "memory_domains": [ 00:20:47.283 { 00:20:47.283 "dma_device_id": "system", 00:20:47.283 "dma_device_type": 1 00:20:47.284 }, 00:20:47.284 { 00:20:47.284 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:47.284 "dma_device_type": 2 00:20:47.284 } 00:20:47.284 ], 00:20:47.284 "driver_specific": {} 00:20:47.284 }' 00:20:47.284 08:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:47.284 08:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:47.541 08:33:59 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:47.541 08:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:47.541 08:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:47.541 08:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:47.541 08:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:47.541 08:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:47.541 08:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:47.541 08:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:47.541 08:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:47.541 08:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:47.541 08:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:47.800 08:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:47.800 08:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:47.800 08:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:47.800 "name": "BaseBdev4", 00:20:47.800 "aliases": [ 00:20:47.800 "8685dcfd-5765-4ab5-a903-95b3219339f9" 00:20:47.800 ], 00:20:47.800 "product_name": "Malloc disk", 00:20:47.800 "block_size": 512, 00:20:47.800 "num_blocks": 65536, 00:20:47.800 "uuid": "8685dcfd-5765-4ab5-a903-95b3219339f9", 00:20:47.800 "assigned_rate_limits": { 00:20:47.800 "rw_ios_per_sec": 0, 00:20:47.800 "rw_mbytes_per_sec": 0, 00:20:47.800 
"r_mbytes_per_sec": 0, 00:20:47.800 "w_mbytes_per_sec": 0 00:20:47.800 }, 00:20:47.800 "claimed": true, 00:20:47.800 "claim_type": "exclusive_write", 00:20:47.800 "zoned": false, 00:20:47.800 "supported_io_types": { 00:20:47.800 "read": true, 00:20:47.800 "write": true, 00:20:47.800 "unmap": true, 00:20:47.800 "flush": true, 00:20:47.800 "reset": true, 00:20:47.800 "nvme_admin": false, 00:20:47.800 "nvme_io": false, 00:20:47.800 "nvme_io_md": false, 00:20:47.800 "write_zeroes": true, 00:20:47.800 "zcopy": true, 00:20:47.800 "get_zone_info": false, 00:20:47.800 "zone_management": false, 00:20:47.800 "zone_append": false, 00:20:47.800 "compare": false, 00:20:47.800 "compare_and_write": false, 00:20:47.800 "abort": true, 00:20:47.800 "seek_hole": false, 00:20:47.800 "seek_data": false, 00:20:47.800 "copy": true, 00:20:47.800 "nvme_iov_md": false 00:20:47.800 }, 00:20:47.800 "memory_domains": [ 00:20:47.800 { 00:20:47.800 "dma_device_id": "system", 00:20:47.800 "dma_device_type": 1 00:20:47.800 }, 00:20:47.800 { 00:20:47.800 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:47.800 "dma_device_type": 2 00:20:47.800 } 00:20:47.800 ], 00:20:47.800 "driver_specific": {} 00:20:47.800 }' 00:20:47.800 08:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:47.800 08:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:48.059 08:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:48.059 08:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:48.059 08:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:48.059 08:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:48.059 08:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:48.059 08:34:00 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:48.059 08:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:48.059 08:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:48.059 08:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:48.059 08:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:48.059 08:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:48.317 [2024-07-23 08:34:00.731147] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:48.317 08:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:20:48.317 08:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:20:48.317 08:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:48.317 08:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:20:48.317 08:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:20:48.317 08:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:20:48.317 08:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:48.317 08:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:48.317 08:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:48.317 08:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:48.317 08:34:00 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:48.317 08:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:48.317 08:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:48.317 08:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:48.317 08:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:48.317 08:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:48.317 08:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:48.576 08:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:48.576 "name": "Existed_Raid", 00:20:48.576 "uuid": "b4de9b3e-9d12-402b-8a24-817bc9887ac4", 00:20:48.576 "strip_size_kb": 0, 00:20:48.576 "state": "online", 00:20:48.576 "raid_level": "raid1", 00:20:48.576 "superblock": true, 00:20:48.576 "num_base_bdevs": 4, 00:20:48.576 "num_base_bdevs_discovered": 3, 00:20:48.576 "num_base_bdevs_operational": 3, 00:20:48.576 "base_bdevs_list": [ 00:20:48.576 { 00:20:48.576 "name": null, 00:20:48.576 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:48.576 "is_configured": false, 00:20:48.576 "data_offset": 2048, 00:20:48.576 "data_size": 63488 00:20:48.576 }, 00:20:48.576 { 00:20:48.576 "name": "BaseBdev2", 00:20:48.576 "uuid": "1a1bf42a-1864-4bd4-a58c-3c9f6c3eee6c", 00:20:48.576 "is_configured": true, 00:20:48.576 "data_offset": 2048, 00:20:48.576 "data_size": 63488 00:20:48.576 }, 00:20:48.576 { 00:20:48.576 "name": "BaseBdev3", 00:20:48.576 "uuid": "5f43f4b4-66c7-4bb5-af99-5e0b0f9176c5", 00:20:48.576 "is_configured": true, 00:20:48.576 "data_offset": 2048, 00:20:48.576 
"data_size": 63488 00:20:48.576 }, 00:20:48.576 { 00:20:48.576 "name": "BaseBdev4", 00:20:48.576 "uuid": "8685dcfd-5765-4ab5-a903-95b3219339f9", 00:20:48.576 "is_configured": true, 00:20:48.576 "data_offset": 2048, 00:20:48.576 "data_size": 63488 00:20:48.576 } 00:20:48.576 ] 00:20:48.576 }' 00:20:48.576 08:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:48.576 08:34:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:49.143 08:34:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:20:49.143 08:34:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:49.143 08:34:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:49.143 08:34:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:49.143 08:34:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:49.143 08:34:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:49.143 08:34:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:20:49.402 [2024-07-23 08:34:01.782901] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:49.402 08:34:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:49.402 08:34:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:49.402 08:34:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:20:49.402 08:34:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:49.660 08:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:49.660 08:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:49.660 08:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:20:49.919 [2024-07-23 08:34:02.213332] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:49.919 08:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:49.919 08:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:49.919 08:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:49.919 08:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:50.177 08:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:50.177 08:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:50.177 08:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:20:50.177 [2024-07-23 08:34:02.649073] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:20:50.177 [2024-07-23 08:34:02.649174] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:50.436 [2024-07-23 08:34:02.745067] bdev_raid.c: 486:_raid_bdev_destruct: 
*DEBUG*: raid_bdev_destruct 00:20:50.436 [2024-07-23 08:34:02.745113] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:50.436 [2024-07-23 08:34:02.745125] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000035180 name Existed_Raid, state offline 00:20:50.436 08:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:50.436 08:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:50.436 08:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:50.436 08:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:20:50.436 08:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:20:50.436 08:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:20:50.436 08:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:20:50.436 08:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:20:50.436 08:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:50.436 08:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:50.695 BaseBdev2 00:20:50.695 08:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:20:50.695 08:34:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:50.695 08:34:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local 
bdev_timeout= 00:20:50.695 08:34:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:50.695 08:34:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:50.695 08:34:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:50.695 08:34:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:50.953 08:34:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:50.953 [ 00:20:50.953 { 00:20:50.953 "name": "BaseBdev2", 00:20:50.953 "aliases": [ 00:20:50.953 "bbadd96f-8a89-41c3-8aaa-283c9349a9a3" 00:20:50.953 ], 00:20:50.953 "product_name": "Malloc disk", 00:20:50.953 "block_size": 512, 00:20:50.953 "num_blocks": 65536, 00:20:50.953 "uuid": "bbadd96f-8a89-41c3-8aaa-283c9349a9a3", 00:20:50.953 "assigned_rate_limits": { 00:20:50.953 "rw_ios_per_sec": 0, 00:20:50.953 "rw_mbytes_per_sec": 0, 00:20:50.953 "r_mbytes_per_sec": 0, 00:20:50.953 "w_mbytes_per_sec": 0 00:20:50.953 }, 00:20:50.953 "claimed": false, 00:20:50.953 "zoned": false, 00:20:50.953 "supported_io_types": { 00:20:50.953 "read": true, 00:20:50.953 "write": true, 00:20:50.953 "unmap": true, 00:20:50.953 "flush": true, 00:20:50.953 "reset": true, 00:20:50.953 "nvme_admin": false, 00:20:50.953 "nvme_io": false, 00:20:50.953 "nvme_io_md": false, 00:20:50.953 "write_zeroes": true, 00:20:50.953 "zcopy": true, 00:20:50.953 "get_zone_info": false, 00:20:50.953 "zone_management": false, 00:20:50.953 "zone_append": false, 00:20:50.953 "compare": false, 00:20:50.953 "compare_and_write": false, 00:20:50.953 "abort": true, 00:20:50.953 "seek_hole": false, 00:20:50.953 "seek_data": false, 00:20:50.953 
"copy": true, 00:20:50.953 "nvme_iov_md": false 00:20:50.953 }, 00:20:50.953 "memory_domains": [ 00:20:50.953 { 00:20:50.953 "dma_device_id": "system", 00:20:50.953 "dma_device_type": 1 00:20:50.953 }, 00:20:50.953 { 00:20:50.953 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:50.953 "dma_device_type": 2 00:20:50.953 } 00:20:50.953 ], 00:20:50.953 "driver_specific": {} 00:20:50.953 } 00:20:50.953 ] 00:20:51.212 08:34:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:51.212 08:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:51.212 08:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:51.212 08:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:51.212 BaseBdev3 00:20:51.212 08:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:20:51.212 08:34:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:51.212 08:34:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:51.212 08:34:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:51.212 08:34:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:51.212 08:34:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:51.212 08:34:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:51.471 08:34:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:51.729 [ 00:20:51.729 { 00:20:51.729 "name": "BaseBdev3", 00:20:51.729 "aliases": [ 00:20:51.729 "256a5791-01b8-4c54-bee4-2c66a245446b" 00:20:51.729 ], 00:20:51.729 "product_name": "Malloc disk", 00:20:51.729 "block_size": 512, 00:20:51.729 "num_blocks": 65536, 00:20:51.729 "uuid": "256a5791-01b8-4c54-bee4-2c66a245446b", 00:20:51.729 "assigned_rate_limits": { 00:20:51.729 "rw_ios_per_sec": 0, 00:20:51.729 "rw_mbytes_per_sec": 0, 00:20:51.729 "r_mbytes_per_sec": 0, 00:20:51.729 "w_mbytes_per_sec": 0 00:20:51.729 }, 00:20:51.729 "claimed": false, 00:20:51.729 "zoned": false, 00:20:51.729 "supported_io_types": { 00:20:51.729 "read": true, 00:20:51.729 "write": true, 00:20:51.729 "unmap": true, 00:20:51.729 "flush": true, 00:20:51.729 "reset": true, 00:20:51.729 "nvme_admin": false, 00:20:51.729 "nvme_io": false, 00:20:51.729 "nvme_io_md": false, 00:20:51.729 "write_zeroes": true, 00:20:51.729 "zcopy": true, 00:20:51.729 "get_zone_info": false, 00:20:51.729 "zone_management": false, 00:20:51.729 "zone_append": false, 00:20:51.729 "compare": false, 00:20:51.729 "compare_and_write": false, 00:20:51.729 "abort": true, 00:20:51.729 "seek_hole": false, 00:20:51.729 "seek_data": false, 00:20:51.729 "copy": true, 00:20:51.729 "nvme_iov_md": false 00:20:51.729 }, 00:20:51.729 "memory_domains": [ 00:20:51.729 { 00:20:51.729 "dma_device_id": "system", 00:20:51.729 "dma_device_type": 1 00:20:51.729 }, 00:20:51.729 { 00:20:51.729 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:51.729 "dma_device_type": 2 00:20:51.729 } 00:20:51.729 ], 00:20:51.729 "driver_specific": {} 00:20:51.729 } 00:20:51.729 ] 00:20:51.729 08:34:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:51.729 08:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:51.729 08:34:04 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:51.729 08:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:51.729 BaseBdev4 00:20:51.729 08:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:20:51.729 08:34:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:20:51.729 08:34:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:51.729 08:34:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:51.729 08:34:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:51.729 08:34:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:51.729 08:34:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:51.987 08:34:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:52.246 [ 00:20:52.246 { 00:20:52.246 "name": "BaseBdev4", 00:20:52.246 "aliases": [ 00:20:52.246 "9fe22eee-e8b8-4ff6-99ca-af4f69a15290" 00:20:52.246 ], 00:20:52.246 "product_name": "Malloc disk", 00:20:52.246 "block_size": 512, 00:20:52.246 "num_blocks": 65536, 00:20:52.246 "uuid": "9fe22eee-e8b8-4ff6-99ca-af4f69a15290", 00:20:52.246 "assigned_rate_limits": { 00:20:52.246 "rw_ios_per_sec": 0, 00:20:52.246 "rw_mbytes_per_sec": 0, 00:20:52.246 "r_mbytes_per_sec": 0, 00:20:52.246 "w_mbytes_per_sec": 0 00:20:52.246 }, 00:20:52.246 
"claimed": false, 00:20:52.246 "zoned": false, 00:20:52.246 "supported_io_types": { 00:20:52.246 "read": true, 00:20:52.246 "write": true, 00:20:52.246 "unmap": true, 00:20:52.246 "flush": true, 00:20:52.246 "reset": true, 00:20:52.246 "nvme_admin": false, 00:20:52.246 "nvme_io": false, 00:20:52.246 "nvme_io_md": false, 00:20:52.246 "write_zeroes": true, 00:20:52.246 "zcopy": true, 00:20:52.246 "get_zone_info": false, 00:20:52.246 "zone_management": false, 00:20:52.246 "zone_append": false, 00:20:52.246 "compare": false, 00:20:52.246 "compare_and_write": false, 00:20:52.246 "abort": true, 00:20:52.246 "seek_hole": false, 00:20:52.246 "seek_data": false, 00:20:52.246 "copy": true, 00:20:52.246 "nvme_iov_md": false 00:20:52.246 }, 00:20:52.246 "memory_domains": [ 00:20:52.246 { 00:20:52.246 "dma_device_id": "system", 00:20:52.246 "dma_device_type": 1 00:20:52.246 }, 00:20:52.246 { 00:20:52.246 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:52.246 "dma_device_type": 2 00:20:52.246 } 00:20:52.246 ], 00:20:52.246 "driver_specific": {} 00:20:52.246 } 00:20:52.246 ] 00:20:52.246 08:34:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:52.246 08:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:52.246 08:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:52.246 08:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:52.246 [2024-07-23 08:34:04.694946] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:52.246 [2024-07-23 08:34:04.694989] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:52.246 [2024-07-23 08:34:04.695027] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:52.246 [2024-07-23 08:34:04.696635] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:52.246 [2024-07-23 08:34:04.696686] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:52.246 08:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:52.246 08:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:52.246 08:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:52.246 08:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:52.246 08:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:52.246 08:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:52.246 08:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:52.246 08:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:52.246 08:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:52.246 08:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:52.246 08:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:52.246 08:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:52.505 08:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:52.505 "name": 
"Existed_Raid", 00:20:52.505 "uuid": "6b612169-fd30-49f3-93ea-2f7136533115", 00:20:52.505 "strip_size_kb": 0, 00:20:52.505 "state": "configuring", 00:20:52.505 "raid_level": "raid1", 00:20:52.505 "superblock": true, 00:20:52.505 "num_base_bdevs": 4, 00:20:52.505 "num_base_bdevs_discovered": 3, 00:20:52.505 "num_base_bdevs_operational": 4, 00:20:52.505 "base_bdevs_list": [ 00:20:52.505 { 00:20:52.505 "name": "BaseBdev1", 00:20:52.505 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:52.505 "is_configured": false, 00:20:52.505 "data_offset": 0, 00:20:52.505 "data_size": 0 00:20:52.505 }, 00:20:52.505 { 00:20:52.505 "name": "BaseBdev2", 00:20:52.505 "uuid": "bbadd96f-8a89-41c3-8aaa-283c9349a9a3", 00:20:52.505 "is_configured": true, 00:20:52.505 "data_offset": 2048, 00:20:52.505 "data_size": 63488 00:20:52.505 }, 00:20:52.505 { 00:20:52.505 "name": "BaseBdev3", 00:20:52.505 "uuid": "256a5791-01b8-4c54-bee4-2c66a245446b", 00:20:52.505 "is_configured": true, 00:20:52.505 "data_offset": 2048, 00:20:52.505 "data_size": 63488 00:20:52.505 }, 00:20:52.505 { 00:20:52.505 "name": "BaseBdev4", 00:20:52.505 "uuid": "9fe22eee-e8b8-4ff6-99ca-af4f69a15290", 00:20:52.505 "is_configured": true, 00:20:52.505 "data_offset": 2048, 00:20:52.505 "data_size": 63488 00:20:52.505 } 00:20:52.505 ] 00:20:52.505 }' 00:20:52.505 08:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:52.505 08:34:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:53.070 08:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:20:53.070 [2024-07-23 08:34:05.525101] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:53.070 08:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 
00:20:53.070 08:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:53.070 08:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:53.070 08:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:53.070 08:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:53.070 08:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:53.070 08:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:53.070 08:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:53.070 08:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:53.070 08:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:53.070 08:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:53.070 08:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:53.336 08:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:53.336 "name": "Existed_Raid", 00:20:53.336 "uuid": "6b612169-fd30-49f3-93ea-2f7136533115", 00:20:53.336 "strip_size_kb": 0, 00:20:53.336 "state": "configuring", 00:20:53.336 "raid_level": "raid1", 00:20:53.336 "superblock": true, 00:20:53.336 "num_base_bdevs": 4, 00:20:53.336 "num_base_bdevs_discovered": 2, 00:20:53.336 "num_base_bdevs_operational": 4, 00:20:53.336 "base_bdevs_list": [ 00:20:53.336 { 00:20:53.336 "name": "BaseBdev1", 00:20:53.336 "uuid": "00000000-0000-0000-0000-000000000000", 
00:20:53.336 "is_configured": false, 00:20:53.336 "data_offset": 0, 00:20:53.336 "data_size": 0 00:20:53.336 }, 00:20:53.336 { 00:20:53.336 "name": null, 00:20:53.336 "uuid": "bbadd96f-8a89-41c3-8aaa-283c9349a9a3", 00:20:53.336 "is_configured": false, 00:20:53.336 "data_offset": 2048, 00:20:53.336 "data_size": 63488 00:20:53.336 }, 00:20:53.336 { 00:20:53.336 "name": "BaseBdev3", 00:20:53.336 "uuid": "256a5791-01b8-4c54-bee4-2c66a245446b", 00:20:53.336 "is_configured": true, 00:20:53.336 "data_offset": 2048, 00:20:53.336 "data_size": 63488 00:20:53.336 }, 00:20:53.336 { 00:20:53.336 "name": "BaseBdev4", 00:20:53.336 "uuid": "9fe22eee-e8b8-4ff6-99ca-af4f69a15290", 00:20:53.336 "is_configured": true, 00:20:53.336 "data_offset": 2048, 00:20:53.336 "data_size": 63488 00:20:53.336 } 00:20:53.336 ] 00:20:53.336 }' 00:20:53.336 08:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:53.336 08:34:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:53.934 08:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:53.934 08:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:53.934 08:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:20:53.934 08:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:54.192 [2024-07-23 08:34:06.572942] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:54.192 BaseBdev1 00:20:54.192 08:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:20:54.192 08:34:06 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:20:54.192 08:34:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:54.192 08:34:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:54.192 08:34:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:54.192 08:34:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:54.192 08:34:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:54.451 08:34:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:54.451 [ 00:20:54.451 { 00:20:54.451 "name": "BaseBdev1", 00:20:54.451 "aliases": [ 00:20:54.451 "4d3a4a4a-fe05-438d-b6aa-d15efb5c8eef" 00:20:54.451 ], 00:20:54.451 "product_name": "Malloc disk", 00:20:54.451 "block_size": 512, 00:20:54.451 "num_blocks": 65536, 00:20:54.451 "uuid": "4d3a4a4a-fe05-438d-b6aa-d15efb5c8eef", 00:20:54.451 "assigned_rate_limits": { 00:20:54.451 "rw_ios_per_sec": 0, 00:20:54.451 "rw_mbytes_per_sec": 0, 00:20:54.451 "r_mbytes_per_sec": 0, 00:20:54.451 "w_mbytes_per_sec": 0 00:20:54.451 }, 00:20:54.451 "claimed": true, 00:20:54.451 "claim_type": "exclusive_write", 00:20:54.451 "zoned": false, 00:20:54.451 "supported_io_types": { 00:20:54.451 "read": true, 00:20:54.451 "write": true, 00:20:54.451 "unmap": true, 00:20:54.451 "flush": true, 00:20:54.451 "reset": true, 00:20:54.451 "nvme_admin": false, 00:20:54.451 "nvme_io": false, 00:20:54.451 "nvme_io_md": false, 00:20:54.451 "write_zeroes": true, 00:20:54.451 "zcopy": true, 00:20:54.451 "get_zone_info": false, 
00:20:54.451 "zone_management": false, 00:20:54.451 "zone_append": false, 00:20:54.451 "compare": false, 00:20:54.451 "compare_and_write": false, 00:20:54.451 "abort": true, 00:20:54.451 "seek_hole": false, 00:20:54.451 "seek_data": false, 00:20:54.451 "copy": true, 00:20:54.451 "nvme_iov_md": false 00:20:54.451 }, 00:20:54.451 "memory_domains": [ 00:20:54.451 { 00:20:54.451 "dma_device_id": "system", 00:20:54.451 "dma_device_type": 1 00:20:54.451 }, 00:20:54.451 { 00:20:54.451 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:54.451 "dma_device_type": 2 00:20:54.451 } 00:20:54.451 ], 00:20:54.451 "driver_specific": {} 00:20:54.451 } 00:20:54.451 ] 00:20:54.451 08:34:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:54.451 08:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:54.451 08:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:54.451 08:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:54.451 08:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:54.451 08:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:54.451 08:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:54.451 08:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:54.451 08:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:54.451 08:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:54.451 08:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:54.451 08:34:06 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:54.451 08:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:54.710 08:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:54.710 "name": "Existed_Raid", 00:20:54.710 "uuid": "6b612169-fd30-49f3-93ea-2f7136533115", 00:20:54.710 "strip_size_kb": 0, 00:20:54.710 "state": "configuring", 00:20:54.710 "raid_level": "raid1", 00:20:54.710 "superblock": true, 00:20:54.710 "num_base_bdevs": 4, 00:20:54.710 "num_base_bdevs_discovered": 3, 00:20:54.710 "num_base_bdevs_operational": 4, 00:20:54.710 "base_bdevs_list": [ 00:20:54.710 { 00:20:54.710 "name": "BaseBdev1", 00:20:54.710 "uuid": "4d3a4a4a-fe05-438d-b6aa-d15efb5c8eef", 00:20:54.710 "is_configured": true, 00:20:54.710 "data_offset": 2048, 00:20:54.710 "data_size": 63488 00:20:54.710 }, 00:20:54.710 { 00:20:54.710 "name": null, 00:20:54.710 "uuid": "bbadd96f-8a89-41c3-8aaa-283c9349a9a3", 00:20:54.710 "is_configured": false, 00:20:54.710 "data_offset": 2048, 00:20:54.710 "data_size": 63488 00:20:54.710 }, 00:20:54.710 { 00:20:54.710 "name": "BaseBdev3", 00:20:54.710 "uuid": "256a5791-01b8-4c54-bee4-2c66a245446b", 00:20:54.710 "is_configured": true, 00:20:54.710 "data_offset": 2048, 00:20:54.710 "data_size": 63488 00:20:54.710 }, 00:20:54.710 { 00:20:54.710 "name": "BaseBdev4", 00:20:54.710 "uuid": "9fe22eee-e8b8-4ff6-99ca-af4f69a15290", 00:20:54.710 "is_configured": true, 00:20:54.710 "data_offset": 2048, 00:20:54.710 "data_size": 63488 00:20:54.710 } 00:20:54.710 ] 00:20:54.710 }' 00:20:54.710 08:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:54.710 08:34:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:55.275 08:34:07 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:55.275 08:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:55.275 08:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:20:55.275 08:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:20:55.534 [2024-07-23 08:34:07.860403] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:55.534 08:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:55.534 08:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:55.534 08:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:55.534 08:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:55.534 08:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:55.534 08:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:55.534 08:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:55.534 08:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:55.534 08:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:55.534 08:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:55.534 08:34:07 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:55.534 08:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:55.534 08:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:55.534 "name": "Existed_Raid", 00:20:55.534 "uuid": "6b612169-fd30-49f3-93ea-2f7136533115", 00:20:55.534 "strip_size_kb": 0, 00:20:55.534 "state": "configuring", 00:20:55.534 "raid_level": "raid1", 00:20:55.534 "superblock": true, 00:20:55.534 "num_base_bdevs": 4, 00:20:55.534 "num_base_bdevs_discovered": 2, 00:20:55.534 "num_base_bdevs_operational": 4, 00:20:55.534 "base_bdevs_list": [ 00:20:55.534 { 00:20:55.534 "name": "BaseBdev1", 00:20:55.534 "uuid": "4d3a4a4a-fe05-438d-b6aa-d15efb5c8eef", 00:20:55.534 "is_configured": true, 00:20:55.534 "data_offset": 2048, 00:20:55.534 "data_size": 63488 00:20:55.534 }, 00:20:55.534 { 00:20:55.534 "name": null, 00:20:55.534 "uuid": "bbadd96f-8a89-41c3-8aaa-283c9349a9a3", 00:20:55.534 "is_configured": false, 00:20:55.534 "data_offset": 2048, 00:20:55.534 "data_size": 63488 00:20:55.534 }, 00:20:55.534 { 00:20:55.534 "name": null, 00:20:55.534 "uuid": "256a5791-01b8-4c54-bee4-2c66a245446b", 00:20:55.534 "is_configured": false, 00:20:55.534 "data_offset": 2048, 00:20:55.534 "data_size": 63488 00:20:55.534 }, 00:20:55.534 { 00:20:55.534 "name": "BaseBdev4", 00:20:55.534 "uuid": "9fe22eee-e8b8-4ff6-99ca-af4f69a15290", 00:20:55.534 "is_configured": true, 00:20:55.534 "data_offset": 2048, 00:20:55.534 "data_size": 63488 00:20:55.534 } 00:20:55.534 ] 00:20:55.534 }' 00:20:55.792 08:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:55.792 08:34:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:56.050 08:34:08 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:56.050 08:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:56.308 08:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:20:56.308 08:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:20:56.566 [2024-07-23 08:34:08.883121] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:56.566 08:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:56.566 08:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:56.566 08:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:56.566 08:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:56.566 08:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:56.566 08:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:56.566 08:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:56.566 08:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:56.566 08:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:56.566 08:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:56.566 08:34:08 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:56.566 08:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:56.824 08:34:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:56.824 "name": "Existed_Raid", 00:20:56.824 "uuid": "6b612169-fd30-49f3-93ea-2f7136533115", 00:20:56.824 "strip_size_kb": 0, 00:20:56.824 "state": "configuring", 00:20:56.824 "raid_level": "raid1", 00:20:56.824 "superblock": true, 00:20:56.824 "num_base_bdevs": 4, 00:20:56.824 "num_base_bdevs_discovered": 3, 00:20:56.824 "num_base_bdevs_operational": 4, 00:20:56.824 "base_bdevs_list": [ 00:20:56.824 { 00:20:56.824 "name": "BaseBdev1", 00:20:56.824 "uuid": "4d3a4a4a-fe05-438d-b6aa-d15efb5c8eef", 00:20:56.824 "is_configured": true, 00:20:56.824 "data_offset": 2048, 00:20:56.824 "data_size": 63488 00:20:56.824 }, 00:20:56.824 { 00:20:56.824 "name": null, 00:20:56.824 "uuid": "bbadd96f-8a89-41c3-8aaa-283c9349a9a3", 00:20:56.824 "is_configured": false, 00:20:56.824 "data_offset": 2048, 00:20:56.824 "data_size": 63488 00:20:56.824 }, 00:20:56.824 { 00:20:56.824 "name": "BaseBdev3", 00:20:56.824 "uuid": "256a5791-01b8-4c54-bee4-2c66a245446b", 00:20:56.824 "is_configured": true, 00:20:56.824 "data_offset": 2048, 00:20:56.824 "data_size": 63488 00:20:56.824 }, 00:20:56.824 { 00:20:56.824 "name": "BaseBdev4", 00:20:56.824 "uuid": "9fe22eee-e8b8-4ff6-99ca-af4f69a15290", 00:20:56.824 "is_configured": true, 00:20:56.824 "data_offset": 2048, 00:20:56.824 "data_size": 63488 00:20:56.824 } 00:20:56.824 ] 00:20:56.824 }' 00:20:56.824 08:34:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:56.824 08:34:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:57.082 08:34:09 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:57.082 08:34:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:57.340 08:34:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:20:57.340 08:34:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:57.599 [2024-07-23 08:34:09.901799] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:57.599 08:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:57.599 08:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:57.599 08:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:57.599 08:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:57.599 08:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:57.599 08:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:57.599 08:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:57.599 08:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:57.599 08:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:57.599 08:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:57.599 08:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:57.599 08:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:57.857 08:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:57.857 "name": "Existed_Raid", 00:20:57.857 "uuid": "6b612169-fd30-49f3-93ea-2f7136533115", 00:20:57.857 "strip_size_kb": 0, 00:20:57.857 "state": "configuring", 00:20:57.857 "raid_level": "raid1", 00:20:57.857 "superblock": true, 00:20:57.857 "num_base_bdevs": 4, 00:20:57.857 "num_base_bdevs_discovered": 2, 00:20:57.857 "num_base_bdevs_operational": 4, 00:20:57.857 "base_bdevs_list": [ 00:20:57.857 { 00:20:57.857 "name": null, 00:20:57.857 "uuid": "4d3a4a4a-fe05-438d-b6aa-d15efb5c8eef", 00:20:57.857 "is_configured": false, 00:20:57.857 "data_offset": 2048, 00:20:57.857 "data_size": 63488 00:20:57.857 }, 00:20:57.857 { 00:20:57.857 "name": null, 00:20:57.857 "uuid": "bbadd96f-8a89-41c3-8aaa-283c9349a9a3", 00:20:57.857 "is_configured": false, 00:20:57.857 "data_offset": 2048, 00:20:57.857 "data_size": 63488 00:20:57.857 }, 00:20:57.857 { 00:20:57.857 "name": "BaseBdev3", 00:20:57.857 "uuid": "256a5791-01b8-4c54-bee4-2c66a245446b", 00:20:57.857 "is_configured": true, 00:20:57.858 "data_offset": 2048, 00:20:57.858 "data_size": 63488 00:20:57.858 }, 00:20:57.858 { 00:20:57.858 "name": "BaseBdev4", 00:20:57.858 "uuid": "9fe22eee-e8b8-4ff6-99ca-af4f69a15290", 00:20:57.858 "is_configured": true, 00:20:57.858 "data_offset": 2048, 00:20:57.858 "data_size": 63488 00:20:57.858 } 00:20:57.858 ] 00:20:57.858 }' 00:20:57.858 08:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:57.858 08:34:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:58.425 08:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:58.425 08:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:58.425 08:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:20:58.425 08:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:20:58.683 [2024-07-23 08:34:10.990332] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:58.683 08:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:58.683 08:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:58.683 08:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:58.683 08:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:58.683 08:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:58.683 08:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:58.683 08:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:58.683 08:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:58.683 08:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:58.683 08:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:58.683 08:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:58.683 08:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:58.683 08:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:58.683 "name": "Existed_Raid", 00:20:58.683 "uuid": "6b612169-fd30-49f3-93ea-2f7136533115", 00:20:58.683 "strip_size_kb": 0, 00:20:58.683 "state": "configuring", 00:20:58.683 "raid_level": "raid1", 00:20:58.684 "superblock": true, 00:20:58.684 "num_base_bdevs": 4, 00:20:58.684 "num_base_bdevs_discovered": 3, 00:20:58.684 "num_base_bdevs_operational": 4, 00:20:58.684 "base_bdevs_list": [ 00:20:58.684 { 00:20:58.684 "name": null, 00:20:58.684 "uuid": "4d3a4a4a-fe05-438d-b6aa-d15efb5c8eef", 00:20:58.684 "is_configured": false, 00:20:58.684 "data_offset": 2048, 00:20:58.684 "data_size": 63488 00:20:58.684 }, 00:20:58.684 { 00:20:58.684 "name": "BaseBdev2", 00:20:58.684 "uuid": "bbadd96f-8a89-41c3-8aaa-283c9349a9a3", 00:20:58.684 "is_configured": true, 00:20:58.684 "data_offset": 2048, 00:20:58.684 "data_size": 63488 00:20:58.684 }, 00:20:58.684 { 00:20:58.684 "name": "BaseBdev3", 00:20:58.684 "uuid": "256a5791-01b8-4c54-bee4-2c66a245446b", 00:20:58.684 "is_configured": true, 00:20:58.684 "data_offset": 2048, 00:20:58.684 "data_size": 63488 00:20:58.684 }, 00:20:58.684 { 00:20:58.684 "name": "BaseBdev4", 00:20:58.684 "uuid": "9fe22eee-e8b8-4ff6-99ca-af4f69a15290", 00:20:58.684 "is_configured": true, 00:20:58.684 "data_offset": 2048, 00:20:58.684 "data_size": 63488 00:20:58.684 } 00:20:58.684 ] 00:20:58.684 }' 00:20:58.684 08:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:58.684 08:34:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:59.251 08:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq 
'.[0].base_bdevs_list[1].is_configured' 00:20:59.251 08:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:59.508 08:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:20:59.508 08:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:59.508 08:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:20:59.508 08:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 4d3a4a4a-fe05-438d-b6aa-d15efb5c8eef 00:20:59.767 [2024-07-23 08:34:12.215473] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:20:59.767 [2024-07-23 08:34:12.215691] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000037280 00:20:59.767 [2024-07-23 08:34:12.215712] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:59.767 [2024-07-23 08:34:12.215947] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c3a0 00:20:59.767 [2024-07-23 08:34:12.216115] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000037280 00:20:59.767 [2024-07-23 08:34:12.216125] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000037280 00:20:59.767 [2024-07-23 08:34:12.216269] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:59.767 NewBaseBdev 00:20:59.767 08:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 
00:20:59.767 08:34:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:20:59.767 08:34:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:59.767 08:34:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:59.767 08:34:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:59.767 08:34:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:59.767 08:34:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:00.025 08:34:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:21:00.284 [ 00:21:00.284 { 00:21:00.284 "name": "NewBaseBdev", 00:21:00.284 "aliases": [ 00:21:00.284 "4d3a4a4a-fe05-438d-b6aa-d15efb5c8eef" 00:21:00.284 ], 00:21:00.284 "product_name": "Malloc disk", 00:21:00.284 "block_size": 512, 00:21:00.284 "num_blocks": 65536, 00:21:00.284 "uuid": "4d3a4a4a-fe05-438d-b6aa-d15efb5c8eef", 00:21:00.284 "assigned_rate_limits": { 00:21:00.284 "rw_ios_per_sec": 0, 00:21:00.284 "rw_mbytes_per_sec": 0, 00:21:00.284 "r_mbytes_per_sec": 0, 00:21:00.284 "w_mbytes_per_sec": 0 00:21:00.284 }, 00:21:00.284 "claimed": true, 00:21:00.284 "claim_type": "exclusive_write", 00:21:00.284 "zoned": false, 00:21:00.284 "supported_io_types": { 00:21:00.284 "read": true, 00:21:00.284 "write": true, 00:21:00.284 "unmap": true, 00:21:00.284 "flush": true, 00:21:00.284 "reset": true, 00:21:00.284 "nvme_admin": false, 00:21:00.284 "nvme_io": false, 00:21:00.284 "nvme_io_md": false, 00:21:00.284 "write_zeroes": true, 00:21:00.284 "zcopy": true, 00:21:00.284 
"get_zone_info": false, 00:21:00.284 "zone_management": false, 00:21:00.284 "zone_append": false, 00:21:00.284 "compare": false, 00:21:00.284 "compare_and_write": false, 00:21:00.284 "abort": true, 00:21:00.284 "seek_hole": false, 00:21:00.284 "seek_data": false, 00:21:00.284 "copy": true, 00:21:00.284 "nvme_iov_md": false 00:21:00.284 }, 00:21:00.284 "memory_domains": [ 00:21:00.284 { 00:21:00.284 "dma_device_id": "system", 00:21:00.284 "dma_device_type": 1 00:21:00.284 }, 00:21:00.284 { 00:21:00.284 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:00.284 "dma_device_type": 2 00:21:00.284 } 00:21:00.284 ], 00:21:00.284 "driver_specific": {} 00:21:00.284 } 00:21:00.284 ] 00:21:00.284 08:34:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:00.284 08:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:21:00.284 08:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:00.284 08:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:00.284 08:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:00.284 08:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:00.284 08:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:00.284 08:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:00.284 08:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:00.284 08:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:00.284 08:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:00.284 08:34:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:00.284 08:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:00.284 08:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:00.284 "name": "Existed_Raid", 00:21:00.284 "uuid": "6b612169-fd30-49f3-93ea-2f7136533115", 00:21:00.284 "strip_size_kb": 0, 00:21:00.284 "state": "online", 00:21:00.284 "raid_level": "raid1", 00:21:00.284 "superblock": true, 00:21:00.284 "num_base_bdevs": 4, 00:21:00.284 "num_base_bdevs_discovered": 4, 00:21:00.284 "num_base_bdevs_operational": 4, 00:21:00.284 "base_bdevs_list": [ 00:21:00.284 { 00:21:00.284 "name": "NewBaseBdev", 00:21:00.284 "uuid": "4d3a4a4a-fe05-438d-b6aa-d15efb5c8eef", 00:21:00.284 "is_configured": true, 00:21:00.284 "data_offset": 2048, 00:21:00.284 "data_size": 63488 00:21:00.284 }, 00:21:00.284 { 00:21:00.284 "name": "BaseBdev2", 00:21:00.284 "uuid": "bbadd96f-8a89-41c3-8aaa-283c9349a9a3", 00:21:00.284 "is_configured": true, 00:21:00.284 "data_offset": 2048, 00:21:00.284 "data_size": 63488 00:21:00.284 }, 00:21:00.284 { 00:21:00.284 "name": "BaseBdev3", 00:21:00.284 "uuid": "256a5791-01b8-4c54-bee4-2c66a245446b", 00:21:00.284 "is_configured": true, 00:21:00.284 "data_offset": 2048, 00:21:00.284 "data_size": 63488 00:21:00.284 }, 00:21:00.284 { 00:21:00.284 "name": "BaseBdev4", 00:21:00.284 "uuid": "9fe22eee-e8b8-4ff6-99ca-af4f69a15290", 00:21:00.284 "is_configured": true, 00:21:00.284 "data_offset": 2048, 00:21:00.284 "data_size": 63488 00:21:00.284 } 00:21:00.284 ] 00:21:00.284 }' 00:21:00.284 08:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:00.284 08:34:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:00.852 08:34:13 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:21:00.852 08:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:00.852 08:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:00.852 08:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:00.852 08:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:00.852 08:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:21:00.852 08:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:00.852 08:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:00.852 [2024-07-23 08:34:13.334783] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:00.852 08:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:00.852 "name": "Existed_Raid", 00:21:00.852 "aliases": [ 00:21:00.852 "6b612169-fd30-49f3-93ea-2f7136533115" 00:21:00.852 ], 00:21:00.852 "product_name": "Raid Volume", 00:21:00.852 "block_size": 512, 00:21:00.852 "num_blocks": 63488, 00:21:00.852 "uuid": "6b612169-fd30-49f3-93ea-2f7136533115", 00:21:00.852 "assigned_rate_limits": { 00:21:00.852 "rw_ios_per_sec": 0, 00:21:00.852 "rw_mbytes_per_sec": 0, 00:21:00.852 "r_mbytes_per_sec": 0, 00:21:00.852 "w_mbytes_per_sec": 0 00:21:00.852 }, 00:21:00.852 "claimed": false, 00:21:00.852 "zoned": false, 00:21:00.852 "supported_io_types": { 00:21:00.852 "read": true, 00:21:00.852 "write": true, 00:21:00.852 "unmap": false, 00:21:00.852 "flush": false, 00:21:00.852 "reset": true, 00:21:00.852 "nvme_admin": false, 00:21:00.852 
"nvme_io": false, 00:21:00.852 "nvme_io_md": false, 00:21:00.852 "write_zeroes": true, 00:21:00.852 "zcopy": false, 00:21:00.852 "get_zone_info": false, 00:21:00.852 "zone_management": false, 00:21:00.852 "zone_append": false, 00:21:00.852 "compare": false, 00:21:00.852 "compare_and_write": false, 00:21:00.852 "abort": false, 00:21:00.852 "seek_hole": false, 00:21:00.852 "seek_data": false, 00:21:00.852 "copy": false, 00:21:00.852 "nvme_iov_md": false 00:21:00.852 }, 00:21:00.852 "memory_domains": [ 00:21:00.852 { 00:21:00.852 "dma_device_id": "system", 00:21:00.852 "dma_device_type": 1 00:21:00.852 }, 00:21:00.852 { 00:21:00.852 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:00.852 "dma_device_type": 2 00:21:00.852 }, 00:21:00.852 { 00:21:00.852 "dma_device_id": "system", 00:21:00.852 "dma_device_type": 1 00:21:00.852 }, 00:21:00.852 { 00:21:00.852 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:00.852 "dma_device_type": 2 00:21:00.852 }, 00:21:00.852 { 00:21:00.852 "dma_device_id": "system", 00:21:00.852 "dma_device_type": 1 00:21:00.852 }, 00:21:00.852 { 00:21:00.852 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:00.852 "dma_device_type": 2 00:21:00.852 }, 00:21:00.852 { 00:21:00.852 "dma_device_id": "system", 00:21:00.852 "dma_device_type": 1 00:21:00.852 }, 00:21:00.852 { 00:21:00.852 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:00.852 "dma_device_type": 2 00:21:00.852 } 00:21:00.852 ], 00:21:00.852 "driver_specific": { 00:21:00.852 "raid": { 00:21:00.852 "uuid": "6b612169-fd30-49f3-93ea-2f7136533115", 00:21:00.852 "strip_size_kb": 0, 00:21:00.852 "state": "online", 00:21:00.852 "raid_level": "raid1", 00:21:00.852 "superblock": true, 00:21:00.852 "num_base_bdevs": 4, 00:21:00.852 "num_base_bdevs_discovered": 4, 00:21:00.852 "num_base_bdevs_operational": 4, 00:21:00.852 "base_bdevs_list": [ 00:21:00.852 { 00:21:00.852 "name": "NewBaseBdev", 00:21:00.852 "uuid": "4d3a4a4a-fe05-438d-b6aa-d15efb5c8eef", 00:21:00.852 "is_configured": true, 00:21:00.852 
"data_offset": 2048, 00:21:00.852 "data_size": 63488 00:21:00.852 }, 00:21:00.852 { 00:21:00.852 "name": "BaseBdev2", 00:21:00.852 "uuid": "bbadd96f-8a89-41c3-8aaa-283c9349a9a3", 00:21:00.852 "is_configured": true, 00:21:00.852 "data_offset": 2048, 00:21:00.852 "data_size": 63488 00:21:00.852 }, 00:21:00.852 { 00:21:00.852 "name": "BaseBdev3", 00:21:00.852 "uuid": "256a5791-01b8-4c54-bee4-2c66a245446b", 00:21:00.852 "is_configured": true, 00:21:00.852 "data_offset": 2048, 00:21:00.852 "data_size": 63488 00:21:00.852 }, 00:21:00.852 { 00:21:00.852 "name": "BaseBdev4", 00:21:00.852 "uuid": "9fe22eee-e8b8-4ff6-99ca-af4f69a15290", 00:21:00.852 "is_configured": true, 00:21:00.852 "data_offset": 2048, 00:21:00.852 "data_size": 63488 00:21:00.852 } 00:21:00.852 ] 00:21:00.852 } 00:21:00.852 } 00:21:00.852 }' 00:21:00.852 08:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:01.111 08:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:21:01.111 BaseBdev2 00:21:01.111 BaseBdev3 00:21:01.111 BaseBdev4' 00:21:01.111 08:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:01.111 08:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:21:01.111 08:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:01.111 08:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:01.111 "name": "NewBaseBdev", 00:21:01.111 "aliases": [ 00:21:01.111 "4d3a4a4a-fe05-438d-b6aa-d15efb5c8eef" 00:21:01.111 ], 00:21:01.111 "product_name": "Malloc disk", 00:21:01.111 "block_size": 512, 00:21:01.111 "num_blocks": 65536, 00:21:01.111 "uuid": 
"4d3a4a4a-fe05-438d-b6aa-d15efb5c8eef", 00:21:01.111 "assigned_rate_limits": { 00:21:01.111 "rw_ios_per_sec": 0, 00:21:01.111 "rw_mbytes_per_sec": 0, 00:21:01.111 "r_mbytes_per_sec": 0, 00:21:01.111 "w_mbytes_per_sec": 0 00:21:01.111 }, 00:21:01.111 "claimed": true, 00:21:01.111 "claim_type": "exclusive_write", 00:21:01.111 "zoned": false, 00:21:01.111 "supported_io_types": { 00:21:01.111 "read": true, 00:21:01.111 "write": true, 00:21:01.111 "unmap": true, 00:21:01.111 "flush": true, 00:21:01.111 "reset": true, 00:21:01.111 "nvme_admin": false, 00:21:01.111 "nvme_io": false, 00:21:01.111 "nvme_io_md": false, 00:21:01.111 "write_zeroes": true, 00:21:01.111 "zcopy": true, 00:21:01.111 "get_zone_info": false, 00:21:01.111 "zone_management": false, 00:21:01.111 "zone_append": false, 00:21:01.111 "compare": false, 00:21:01.111 "compare_and_write": false, 00:21:01.111 "abort": true, 00:21:01.111 "seek_hole": false, 00:21:01.111 "seek_data": false, 00:21:01.111 "copy": true, 00:21:01.111 "nvme_iov_md": false 00:21:01.111 }, 00:21:01.111 "memory_domains": [ 00:21:01.111 { 00:21:01.111 "dma_device_id": "system", 00:21:01.111 "dma_device_type": 1 00:21:01.111 }, 00:21:01.111 { 00:21:01.111 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:01.111 "dma_device_type": 2 00:21:01.111 } 00:21:01.111 ], 00:21:01.111 "driver_specific": {} 00:21:01.111 }' 00:21:01.111 08:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:01.111 08:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:01.370 08:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:01.370 08:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:01.370 08:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:01.370 08:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:21:01.370 08:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:01.370 08:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:01.370 08:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:01.370 08:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:01.370 08:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:01.370 08:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:01.370 08:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:01.370 08:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:01.370 08:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:01.627 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:01.627 "name": "BaseBdev2", 00:21:01.627 "aliases": [ 00:21:01.627 "bbadd96f-8a89-41c3-8aaa-283c9349a9a3" 00:21:01.627 ], 00:21:01.627 "product_name": "Malloc disk", 00:21:01.627 "block_size": 512, 00:21:01.627 "num_blocks": 65536, 00:21:01.627 "uuid": "bbadd96f-8a89-41c3-8aaa-283c9349a9a3", 00:21:01.628 "assigned_rate_limits": { 00:21:01.628 "rw_ios_per_sec": 0, 00:21:01.628 "rw_mbytes_per_sec": 0, 00:21:01.628 "r_mbytes_per_sec": 0, 00:21:01.628 "w_mbytes_per_sec": 0 00:21:01.628 }, 00:21:01.628 "claimed": true, 00:21:01.628 "claim_type": "exclusive_write", 00:21:01.628 "zoned": false, 00:21:01.628 "supported_io_types": { 00:21:01.628 "read": true, 00:21:01.628 "write": true, 00:21:01.628 "unmap": true, 00:21:01.628 "flush": true, 00:21:01.628 "reset": true, 00:21:01.628 "nvme_admin": false, 00:21:01.628 
"nvme_io": false, 00:21:01.628 "nvme_io_md": false, 00:21:01.628 "write_zeroes": true, 00:21:01.628 "zcopy": true, 00:21:01.628 "get_zone_info": false, 00:21:01.628 "zone_management": false, 00:21:01.628 "zone_append": false, 00:21:01.628 "compare": false, 00:21:01.628 "compare_and_write": false, 00:21:01.628 "abort": true, 00:21:01.628 "seek_hole": false, 00:21:01.628 "seek_data": false, 00:21:01.628 "copy": true, 00:21:01.628 "nvme_iov_md": false 00:21:01.628 }, 00:21:01.628 "memory_domains": [ 00:21:01.628 { 00:21:01.628 "dma_device_id": "system", 00:21:01.628 "dma_device_type": 1 00:21:01.628 }, 00:21:01.628 { 00:21:01.628 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:01.628 "dma_device_type": 2 00:21:01.628 } 00:21:01.628 ], 00:21:01.628 "driver_specific": {} 00:21:01.628 }' 00:21:01.628 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:01.628 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:01.628 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:01.628 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:01.628 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:01.886 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:01.886 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:01.886 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:01.886 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:01.886 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:01.886 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:01.887 08:34:14 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:01.887 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:01.887 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:01.887 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:02.145 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:02.145 "name": "BaseBdev3", 00:21:02.145 "aliases": [ 00:21:02.145 "256a5791-01b8-4c54-bee4-2c66a245446b" 00:21:02.145 ], 00:21:02.145 "product_name": "Malloc disk", 00:21:02.145 "block_size": 512, 00:21:02.145 "num_blocks": 65536, 00:21:02.145 "uuid": "256a5791-01b8-4c54-bee4-2c66a245446b", 00:21:02.145 "assigned_rate_limits": { 00:21:02.145 "rw_ios_per_sec": 0, 00:21:02.145 "rw_mbytes_per_sec": 0, 00:21:02.145 "r_mbytes_per_sec": 0, 00:21:02.145 "w_mbytes_per_sec": 0 00:21:02.145 }, 00:21:02.145 "claimed": true, 00:21:02.145 "claim_type": "exclusive_write", 00:21:02.145 "zoned": false, 00:21:02.145 "supported_io_types": { 00:21:02.145 "read": true, 00:21:02.145 "write": true, 00:21:02.145 "unmap": true, 00:21:02.145 "flush": true, 00:21:02.145 "reset": true, 00:21:02.145 "nvme_admin": false, 00:21:02.145 "nvme_io": false, 00:21:02.145 "nvme_io_md": false, 00:21:02.145 "write_zeroes": true, 00:21:02.145 "zcopy": true, 00:21:02.145 "get_zone_info": false, 00:21:02.145 "zone_management": false, 00:21:02.145 "zone_append": false, 00:21:02.145 "compare": false, 00:21:02.145 "compare_and_write": false, 00:21:02.145 "abort": true, 00:21:02.145 "seek_hole": false, 00:21:02.145 "seek_data": false, 00:21:02.145 "copy": true, 00:21:02.145 "nvme_iov_md": false 00:21:02.145 }, 00:21:02.145 "memory_domains": [ 00:21:02.145 { 00:21:02.145 "dma_device_id": 
"system", 00:21:02.145 "dma_device_type": 1 00:21:02.145 }, 00:21:02.145 { 00:21:02.145 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:02.145 "dma_device_type": 2 00:21:02.145 } 00:21:02.145 ], 00:21:02.145 "driver_specific": {} 00:21:02.145 }' 00:21:02.145 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:02.145 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:02.145 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:02.145 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:02.145 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:02.145 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:02.145 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:02.145 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:02.145 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:02.145 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:02.404 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:02.404 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:02.404 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:02.404 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:02.404 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:02.404 08:34:14 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:02.404 "name": "BaseBdev4", 00:21:02.404 "aliases": [ 00:21:02.404 "9fe22eee-e8b8-4ff6-99ca-af4f69a15290" 00:21:02.404 ], 00:21:02.404 "product_name": "Malloc disk", 00:21:02.404 "block_size": 512, 00:21:02.404 "num_blocks": 65536, 00:21:02.404 "uuid": "9fe22eee-e8b8-4ff6-99ca-af4f69a15290", 00:21:02.404 "assigned_rate_limits": { 00:21:02.404 "rw_ios_per_sec": 0, 00:21:02.404 "rw_mbytes_per_sec": 0, 00:21:02.404 "r_mbytes_per_sec": 0, 00:21:02.404 "w_mbytes_per_sec": 0 00:21:02.404 }, 00:21:02.404 "claimed": true, 00:21:02.404 "claim_type": "exclusive_write", 00:21:02.404 "zoned": false, 00:21:02.404 "supported_io_types": { 00:21:02.404 "read": true, 00:21:02.404 "write": true, 00:21:02.404 "unmap": true, 00:21:02.404 "flush": true, 00:21:02.404 "reset": true, 00:21:02.404 "nvme_admin": false, 00:21:02.404 "nvme_io": false, 00:21:02.404 "nvme_io_md": false, 00:21:02.404 "write_zeroes": true, 00:21:02.404 "zcopy": true, 00:21:02.404 "get_zone_info": false, 00:21:02.404 "zone_management": false, 00:21:02.404 "zone_append": false, 00:21:02.404 "compare": false, 00:21:02.404 "compare_and_write": false, 00:21:02.404 "abort": true, 00:21:02.404 "seek_hole": false, 00:21:02.404 "seek_data": false, 00:21:02.404 "copy": true, 00:21:02.404 "nvme_iov_md": false 00:21:02.404 }, 00:21:02.404 "memory_domains": [ 00:21:02.404 { 00:21:02.404 "dma_device_id": "system", 00:21:02.404 "dma_device_type": 1 00:21:02.404 }, 00:21:02.404 { 00:21:02.404 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:02.404 "dma_device_type": 2 00:21:02.404 } 00:21:02.404 ], 00:21:02.404 "driver_specific": {} 00:21:02.404 }' 00:21:02.404 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:02.404 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:02.663 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 
-- # [[ 512 == 512 ]] 00:21:02.663 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:02.663 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:02.663 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:02.663 08:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:02.663 08:34:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:02.663 08:34:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:02.663 08:34:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:02.663 08:34:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:02.663 08:34:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:02.663 08:34:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:02.922 [2024-07-23 08:34:15.271630] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:02.922 [2024-07-23 08:34:15.271658] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:02.922 [2024-07-23 08:34:15.271735] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:02.922 [2024-07-23 08:34:15.272017] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:02.922 [2024-07-23 08:34:15.272032] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000037280 name Existed_Raid, state offline 00:21:02.922 08:34:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1509992 00:21:02.922 08:34:15 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1509992 ']' 00:21:02.922 08:34:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1509992 00:21:02.922 08:34:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:21:02.922 08:34:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:02.922 08:34:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1509992 00:21:02.922 08:34:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:02.922 08:34:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:02.922 08:34:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1509992' 00:21:02.922 killing process with pid 1509992 00:21:02.922 08:34:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1509992 00:21:02.922 [2024-07-23 08:34:15.316048] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:02.922 08:34:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1509992 00:21:03.180 [2024-07-23 08:34:15.643019] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:04.558 08:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:21:04.558 00:21:04.558 real 0m26.507s 00:21:04.558 user 0m47.392s 00:21:04.558 sys 0m3.955s 00:21:04.558 08:34:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:04.558 08:34:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:04.558 ************************************ 00:21:04.558 END TEST raid_state_function_test_sb 00:21:04.558 ************************************ 00:21:04.558 
08:34:16 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:04.558 08:34:16 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 4 00:21:04.558 08:34:16 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:21:04.558 08:34:16 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:04.558 08:34:16 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:04.558 ************************************ 00:21:04.558 START TEST raid_superblock_test 00:21:04.558 ************************************ 00:21:04.558 08:34:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 4 00:21:04.558 08:34:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:21:04.558 08:34:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:21:04.558 08:34:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:21:04.558 08:34:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:21:04.558 08:34:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:21:04.558 08:34:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:21:04.558 08:34:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:21:04.558 08:34:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:21:04.558 08:34:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:21:04.558 08:34:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:21:04.558 08:34:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:21:04.558 08:34:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:21:04.558 08:34:16 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:21:04.558 08:34:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:21:04.558 08:34:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:21:04.558 08:34:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1515457 00:21:04.558 08:34:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1515457 /var/tmp/spdk-raid.sock 00:21:04.558 08:34:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:21:04.558 08:34:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1515457 ']' 00:21:04.558 08:34:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:04.558 08:34:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:04.558 08:34:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:04.558 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:04.558 08:34:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:04.558 08:34:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:04.558 [2024-07-23 08:34:17.065328] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:21:04.558 [2024-07-23 08:34:17.065414] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1515457 ] 00:21:04.817 [2024-07-23 08:34:17.190877] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:05.075 [2024-07-23 08:34:17.437882] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:05.334 [2024-07-23 08:34:17.711530] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:05.334 [2024-07-23 08:34:17.711560] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:05.334 08:34:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:05.334 08:34:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:21:05.334 08:34:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:21:05.334 08:34:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:05.334 08:34:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:21:05.334 08:34:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:21:05.334 08:34:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:21:05.334 08:34:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:05.334 08:34:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:05.334 08:34:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:05.334 08:34:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:21:05.593 malloc1 00:21:05.593 08:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:05.852 [2024-07-23 08:34:18.191023] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:05.853 [2024-07-23 08:34:18.191076] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:05.853 [2024-07-23 08:34:18.191114] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034880 00:21:05.853 [2024-07-23 08:34:18.191126] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:05.853 [2024-07-23 08:34:18.193059] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:05.853 [2024-07-23 08:34:18.193086] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:05.853 pt1 00:21:05.853 08:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:05.853 08:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:05.853 08:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:21:05.853 08:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:21:05.853 08:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:21:05.853 08:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:05.853 08:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:05.853 08:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:05.853 08:34:18 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:21:06.112 malloc2 00:21:06.112 08:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:06.112 [2024-07-23 08:34:18.561224] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:06.112 [2024-07-23 08:34:18.561275] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:06.112 [2024-07-23 08:34:18.561295] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035480 00:21:06.112 [2024-07-23 08:34:18.561304] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:06.112 [2024-07-23 08:34:18.563340] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:06.112 [2024-07-23 08:34:18.563367] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:06.112 pt2 00:21:06.112 08:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:06.112 08:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:06.112 08:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:21:06.112 08:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:21:06.112 08:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:21:06.112 08:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:06.112 08:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:06.112 
08:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:06.112 08:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:21:06.371 malloc3 00:21:06.371 08:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:21:06.670 [2024-07-23 08:34:18.916223] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:21:06.670 [2024-07-23 08:34:18.916275] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:06.671 [2024-07-23 08:34:18.916298] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036080 00:21:06.671 [2024-07-23 08:34:18.916307] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:06.671 [2024-07-23 08:34:18.918179] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:06.671 [2024-07-23 08:34:18.918205] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:21:06.671 pt3 00:21:06.671 08:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:06.671 08:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:06.671 08:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:21:06.671 08:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:21:06.671 08:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:21:06.671 08:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 
00:21:06.671 08:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:06.671 08:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:06.671 08:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:21:06.671 malloc4 00:21:06.671 08:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:21:06.949 [2024-07-23 08:34:19.311457] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:21:06.949 [2024-07-23 08:34:19.311503] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:06.949 [2024-07-23 08:34:19.311520] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036c80 00:21:06.949 [2024-07-23 08:34:19.311529] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:06.949 [2024-07-23 08:34:19.313433] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:06.949 [2024-07-23 08:34:19.313458] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:21:06.949 pt4 00:21:06.949 08:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:06.949 08:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:06.949 08:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:21:07.208 [2024-07-23 08:34:19.487982] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 
is claimed 00:21:07.208 [2024-07-23 08:34:19.489556] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:07.208 [2024-07-23 08:34:19.489633] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:21:07.208 [2024-07-23 08:34:19.489674] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:21:07.208 [2024-07-23 08:34:19.489868] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000037280 00:21:07.208 [2024-07-23 08:34:19.489878] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:07.208 [2024-07-23 08:34:19.490125] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bf90 00:21:07.208 [2024-07-23 08:34:19.490319] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000037280 00:21:07.208 [2024-07-23 08:34:19.490330] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000037280 00:21:07.208 [2024-07-23 08:34:19.490479] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:07.208 08:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:21:07.208 08:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:07.208 08:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:07.208 08:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:07.208 08:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:07.208 08:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:07.208 08:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:07.208 08:34:19 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:07.208 08:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:07.208 08:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:07.208 08:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:07.208 08:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:07.208 08:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:07.208 "name": "raid_bdev1", 00:21:07.208 "uuid": "9dbf2e72-574b-4ed3-9363-e250f62a33e7", 00:21:07.208 "strip_size_kb": 0, 00:21:07.208 "state": "online", 00:21:07.208 "raid_level": "raid1", 00:21:07.208 "superblock": true, 00:21:07.208 "num_base_bdevs": 4, 00:21:07.208 "num_base_bdevs_discovered": 4, 00:21:07.208 "num_base_bdevs_operational": 4, 00:21:07.208 "base_bdevs_list": [ 00:21:07.208 { 00:21:07.208 "name": "pt1", 00:21:07.208 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:07.208 "is_configured": true, 00:21:07.209 "data_offset": 2048, 00:21:07.209 "data_size": 63488 00:21:07.209 }, 00:21:07.209 { 00:21:07.209 "name": "pt2", 00:21:07.209 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:07.209 "is_configured": true, 00:21:07.209 "data_offset": 2048, 00:21:07.209 "data_size": 63488 00:21:07.209 }, 00:21:07.209 { 00:21:07.209 "name": "pt3", 00:21:07.209 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:07.209 "is_configured": true, 00:21:07.209 "data_offset": 2048, 00:21:07.209 "data_size": 63488 00:21:07.209 }, 00:21:07.209 { 00:21:07.209 "name": "pt4", 00:21:07.209 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:07.209 "is_configured": true, 00:21:07.209 "data_offset": 2048, 00:21:07.209 "data_size": 63488 00:21:07.209 } 00:21:07.209 ] 00:21:07.209 }' 
00:21:07.209 08:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:07.209 08:34:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:07.777 08:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:21:07.777 08:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:21:07.777 08:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:07.777 08:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:07.777 08:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:07.777 08:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:07.777 08:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:07.777 08:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:08.036 [2024-07-23 08:34:20.306422] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:08.036 08:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:08.036 "name": "raid_bdev1", 00:21:08.036 "aliases": [ 00:21:08.036 "9dbf2e72-574b-4ed3-9363-e250f62a33e7" 00:21:08.036 ], 00:21:08.036 "product_name": "Raid Volume", 00:21:08.036 "block_size": 512, 00:21:08.036 "num_blocks": 63488, 00:21:08.036 "uuid": "9dbf2e72-574b-4ed3-9363-e250f62a33e7", 00:21:08.036 "assigned_rate_limits": { 00:21:08.036 "rw_ios_per_sec": 0, 00:21:08.036 "rw_mbytes_per_sec": 0, 00:21:08.036 "r_mbytes_per_sec": 0, 00:21:08.036 "w_mbytes_per_sec": 0 00:21:08.036 }, 00:21:08.036 "claimed": false, 00:21:08.036 "zoned": false, 00:21:08.036 "supported_io_types": { 00:21:08.036 "read": true, 00:21:08.036 "write": 
true, 00:21:08.036 "unmap": false, 00:21:08.036 "flush": false, 00:21:08.036 "reset": true, 00:21:08.036 "nvme_admin": false, 00:21:08.036 "nvme_io": false, 00:21:08.036 "nvme_io_md": false, 00:21:08.036 "write_zeroes": true, 00:21:08.036 "zcopy": false, 00:21:08.036 "get_zone_info": false, 00:21:08.036 "zone_management": false, 00:21:08.036 "zone_append": false, 00:21:08.036 "compare": false, 00:21:08.036 "compare_and_write": false, 00:21:08.036 "abort": false, 00:21:08.036 "seek_hole": false, 00:21:08.036 "seek_data": false, 00:21:08.036 "copy": false, 00:21:08.036 "nvme_iov_md": false 00:21:08.036 }, 00:21:08.037 "memory_domains": [ 00:21:08.037 { 00:21:08.037 "dma_device_id": "system", 00:21:08.037 "dma_device_type": 1 00:21:08.037 }, 00:21:08.037 { 00:21:08.037 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:08.037 "dma_device_type": 2 00:21:08.037 }, 00:21:08.037 { 00:21:08.037 "dma_device_id": "system", 00:21:08.037 "dma_device_type": 1 00:21:08.037 }, 00:21:08.037 { 00:21:08.037 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:08.037 "dma_device_type": 2 00:21:08.037 }, 00:21:08.037 { 00:21:08.037 "dma_device_id": "system", 00:21:08.037 "dma_device_type": 1 00:21:08.037 }, 00:21:08.037 { 00:21:08.037 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:08.037 "dma_device_type": 2 00:21:08.037 }, 00:21:08.037 { 00:21:08.037 "dma_device_id": "system", 00:21:08.037 "dma_device_type": 1 00:21:08.037 }, 00:21:08.037 { 00:21:08.037 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:08.037 "dma_device_type": 2 00:21:08.037 } 00:21:08.037 ], 00:21:08.037 "driver_specific": { 00:21:08.037 "raid": { 00:21:08.037 "uuid": "9dbf2e72-574b-4ed3-9363-e250f62a33e7", 00:21:08.037 "strip_size_kb": 0, 00:21:08.037 "state": "online", 00:21:08.037 "raid_level": "raid1", 00:21:08.037 "superblock": true, 00:21:08.037 "num_base_bdevs": 4, 00:21:08.037 "num_base_bdevs_discovered": 4, 00:21:08.037 "num_base_bdevs_operational": 4, 00:21:08.037 "base_bdevs_list": [ 00:21:08.037 { 00:21:08.037 
"name": "pt1", 00:21:08.037 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:08.037 "is_configured": true, 00:21:08.037 "data_offset": 2048, 00:21:08.037 "data_size": 63488 00:21:08.037 }, 00:21:08.037 { 00:21:08.037 "name": "pt2", 00:21:08.037 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:08.037 "is_configured": true, 00:21:08.037 "data_offset": 2048, 00:21:08.037 "data_size": 63488 00:21:08.037 }, 00:21:08.037 { 00:21:08.037 "name": "pt3", 00:21:08.037 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:08.037 "is_configured": true, 00:21:08.037 "data_offset": 2048, 00:21:08.037 "data_size": 63488 00:21:08.037 }, 00:21:08.037 { 00:21:08.037 "name": "pt4", 00:21:08.037 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:08.037 "is_configured": true, 00:21:08.037 "data_offset": 2048, 00:21:08.037 "data_size": 63488 00:21:08.037 } 00:21:08.037 ] 00:21:08.037 } 00:21:08.037 } 00:21:08.037 }' 00:21:08.037 08:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:08.037 08:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:21:08.037 pt2 00:21:08.037 pt3 00:21:08.037 pt4' 00:21:08.037 08:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:08.037 08:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:08.037 08:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:08.037 08:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:08.037 "name": "pt1", 00:21:08.037 "aliases": [ 00:21:08.037 "00000000-0000-0000-0000-000000000001" 00:21:08.037 ], 00:21:08.037 "product_name": "passthru", 00:21:08.037 "block_size": 512, 00:21:08.037 "num_blocks": 65536, 00:21:08.037 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:21:08.037 "assigned_rate_limits": { 00:21:08.037 "rw_ios_per_sec": 0, 00:21:08.037 "rw_mbytes_per_sec": 0, 00:21:08.037 "r_mbytes_per_sec": 0, 00:21:08.037 "w_mbytes_per_sec": 0 00:21:08.037 }, 00:21:08.037 "claimed": true, 00:21:08.037 "claim_type": "exclusive_write", 00:21:08.037 "zoned": false, 00:21:08.037 "supported_io_types": { 00:21:08.037 "read": true, 00:21:08.037 "write": true, 00:21:08.037 "unmap": true, 00:21:08.037 "flush": true, 00:21:08.037 "reset": true, 00:21:08.037 "nvme_admin": false, 00:21:08.037 "nvme_io": false, 00:21:08.037 "nvme_io_md": false, 00:21:08.037 "write_zeroes": true, 00:21:08.037 "zcopy": true, 00:21:08.037 "get_zone_info": false, 00:21:08.037 "zone_management": false, 00:21:08.037 "zone_append": false, 00:21:08.037 "compare": false, 00:21:08.037 "compare_and_write": false, 00:21:08.037 "abort": true, 00:21:08.037 "seek_hole": false, 00:21:08.037 "seek_data": false, 00:21:08.037 "copy": true, 00:21:08.037 "nvme_iov_md": false 00:21:08.037 }, 00:21:08.037 "memory_domains": [ 00:21:08.037 { 00:21:08.037 "dma_device_id": "system", 00:21:08.037 "dma_device_type": 1 00:21:08.037 }, 00:21:08.037 { 00:21:08.037 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:08.037 "dma_device_type": 2 00:21:08.037 } 00:21:08.037 ], 00:21:08.037 "driver_specific": { 00:21:08.037 "passthru": { 00:21:08.037 "name": "pt1", 00:21:08.037 "base_bdev_name": "malloc1" 00:21:08.037 } 00:21:08.037 } 00:21:08.037 }' 00:21:08.037 08:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:08.037 08:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:08.296 08:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:08.296 08:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:08.296 08:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:08.296 08:34:20 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:08.296 08:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:08.296 08:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:08.296 08:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:08.296 08:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:08.296 08:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:08.296 08:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:08.296 08:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:08.296 08:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:21:08.296 08:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:08.554 08:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:08.554 "name": "pt2", 00:21:08.554 "aliases": [ 00:21:08.554 "00000000-0000-0000-0000-000000000002" 00:21:08.554 ], 00:21:08.554 "product_name": "passthru", 00:21:08.554 "block_size": 512, 00:21:08.554 "num_blocks": 65536, 00:21:08.554 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:08.554 "assigned_rate_limits": { 00:21:08.554 "rw_ios_per_sec": 0, 00:21:08.554 "rw_mbytes_per_sec": 0, 00:21:08.554 "r_mbytes_per_sec": 0, 00:21:08.554 "w_mbytes_per_sec": 0 00:21:08.554 }, 00:21:08.554 "claimed": true, 00:21:08.554 "claim_type": "exclusive_write", 00:21:08.554 "zoned": false, 00:21:08.554 "supported_io_types": { 00:21:08.554 "read": true, 00:21:08.554 "write": true, 00:21:08.554 "unmap": true, 00:21:08.554 "flush": true, 00:21:08.554 "reset": true, 00:21:08.554 "nvme_admin": false, 00:21:08.554 
"nvme_io": false, 00:21:08.554 "nvme_io_md": false, 00:21:08.554 "write_zeroes": true, 00:21:08.554 "zcopy": true, 00:21:08.554 "get_zone_info": false, 00:21:08.554 "zone_management": false, 00:21:08.554 "zone_append": false, 00:21:08.554 "compare": false, 00:21:08.554 "compare_and_write": false, 00:21:08.554 "abort": true, 00:21:08.554 "seek_hole": false, 00:21:08.554 "seek_data": false, 00:21:08.554 "copy": true, 00:21:08.554 "nvme_iov_md": false 00:21:08.554 }, 00:21:08.554 "memory_domains": [ 00:21:08.554 { 00:21:08.554 "dma_device_id": "system", 00:21:08.554 "dma_device_type": 1 00:21:08.554 }, 00:21:08.554 { 00:21:08.554 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:08.554 "dma_device_type": 2 00:21:08.554 } 00:21:08.554 ], 00:21:08.554 "driver_specific": { 00:21:08.554 "passthru": { 00:21:08.554 "name": "pt2", 00:21:08.554 "base_bdev_name": "malloc2" 00:21:08.554 } 00:21:08.554 } 00:21:08.554 }' 00:21:08.554 08:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:08.554 08:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:08.554 08:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:08.554 08:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:08.554 08:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:08.554 08:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:08.554 08:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:08.813 08:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:08.813 08:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:08.813 08:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:08.813 08:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:21:08.813 08:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:08.813 08:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:08.813 08:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:21:08.813 08:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:09.072 08:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:09.072 "name": "pt3", 00:21:09.072 "aliases": [ 00:21:09.072 "00000000-0000-0000-0000-000000000003" 00:21:09.072 ], 00:21:09.072 "product_name": "passthru", 00:21:09.072 "block_size": 512, 00:21:09.072 "num_blocks": 65536, 00:21:09.072 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:09.072 "assigned_rate_limits": { 00:21:09.072 "rw_ios_per_sec": 0, 00:21:09.072 "rw_mbytes_per_sec": 0, 00:21:09.072 "r_mbytes_per_sec": 0, 00:21:09.072 "w_mbytes_per_sec": 0 00:21:09.072 }, 00:21:09.072 "claimed": true, 00:21:09.072 "claim_type": "exclusive_write", 00:21:09.072 "zoned": false, 00:21:09.072 "supported_io_types": { 00:21:09.072 "read": true, 00:21:09.072 "write": true, 00:21:09.072 "unmap": true, 00:21:09.072 "flush": true, 00:21:09.072 "reset": true, 00:21:09.072 "nvme_admin": false, 00:21:09.072 "nvme_io": false, 00:21:09.072 "nvme_io_md": false, 00:21:09.072 "write_zeroes": true, 00:21:09.072 "zcopy": true, 00:21:09.072 "get_zone_info": false, 00:21:09.072 "zone_management": false, 00:21:09.072 "zone_append": false, 00:21:09.072 "compare": false, 00:21:09.072 "compare_and_write": false, 00:21:09.072 "abort": true, 00:21:09.072 "seek_hole": false, 00:21:09.072 "seek_data": false, 00:21:09.072 "copy": true, 00:21:09.072 "nvme_iov_md": false 00:21:09.072 }, 00:21:09.072 "memory_domains": [ 00:21:09.072 { 00:21:09.072 "dma_device_id": "system", 00:21:09.072 
"dma_device_type": 1 00:21:09.072 }, 00:21:09.072 { 00:21:09.072 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:09.072 "dma_device_type": 2 00:21:09.072 } 00:21:09.072 ], 00:21:09.072 "driver_specific": { 00:21:09.072 "passthru": { 00:21:09.072 "name": "pt3", 00:21:09.072 "base_bdev_name": "malloc3" 00:21:09.072 } 00:21:09.072 } 00:21:09.072 }' 00:21:09.072 08:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:09.072 08:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:09.072 08:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:09.072 08:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:09.072 08:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:09.072 08:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:09.072 08:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:09.072 08:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:09.331 08:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:09.331 08:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:09.331 08:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:09.331 08:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:09.331 08:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:09.331 08:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:21:09.331 08:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:09.331 08:34:21 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:09.331 "name": "pt4", 00:21:09.331 "aliases": [ 00:21:09.331 "00000000-0000-0000-0000-000000000004" 00:21:09.331 ], 00:21:09.331 "product_name": "passthru", 00:21:09.331 "block_size": 512, 00:21:09.331 "num_blocks": 65536, 00:21:09.331 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:09.331 "assigned_rate_limits": { 00:21:09.331 "rw_ios_per_sec": 0, 00:21:09.331 "rw_mbytes_per_sec": 0, 00:21:09.331 "r_mbytes_per_sec": 0, 00:21:09.331 "w_mbytes_per_sec": 0 00:21:09.331 }, 00:21:09.331 "claimed": true, 00:21:09.331 "claim_type": "exclusive_write", 00:21:09.331 "zoned": false, 00:21:09.331 "supported_io_types": { 00:21:09.331 "read": true, 00:21:09.331 "write": true, 00:21:09.331 "unmap": true, 00:21:09.331 "flush": true, 00:21:09.331 "reset": true, 00:21:09.331 "nvme_admin": false, 00:21:09.331 "nvme_io": false, 00:21:09.331 "nvme_io_md": false, 00:21:09.331 "write_zeroes": true, 00:21:09.331 "zcopy": true, 00:21:09.331 "get_zone_info": false, 00:21:09.331 "zone_management": false, 00:21:09.331 "zone_append": false, 00:21:09.331 "compare": false, 00:21:09.331 "compare_and_write": false, 00:21:09.331 "abort": true, 00:21:09.331 "seek_hole": false, 00:21:09.331 "seek_data": false, 00:21:09.331 "copy": true, 00:21:09.331 "nvme_iov_md": false 00:21:09.331 }, 00:21:09.331 "memory_domains": [ 00:21:09.331 { 00:21:09.331 "dma_device_id": "system", 00:21:09.331 "dma_device_type": 1 00:21:09.331 }, 00:21:09.331 { 00:21:09.331 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:09.331 "dma_device_type": 2 00:21:09.331 } 00:21:09.331 ], 00:21:09.332 "driver_specific": { 00:21:09.332 "passthru": { 00:21:09.332 "name": "pt4", 00:21:09.332 "base_bdev_name": "malloc4" 00:21:09.332 } 00:21:09.332 } 00:21:09.332 }' 00:21:09.332 08:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:09.590 08:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:09.590 08:34:21 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:09.590 08:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:09.590 08:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:09.590 08:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:09.590 08:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:09.590 08:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:09.590 08:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:09.590 08:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:09.590 08:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:09.848 08:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:09.848 08:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:09.848 08:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:21:09.848 [2024-07-23 08:34:22.263569] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:09.848 08:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=9dbf2e72-574b-4ed3-9363-e250f62a33e7 00:21:09.848 08:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 9dbf2e72-574b-4ed3-9363-e250f62a33e7 ']' 00:21:09.848 08:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:10.107 [2024-07-23 08:34:22.419665] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:10.107 
[2024-07-23 08:34:22.419689] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:10.107 [2024-07-23 08:34:22.419762] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:10.107 [2024-07-23 08:34:22.419842] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:10.107 [2024-07-23 08:34:22.419854] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000037280 name raid_bdev1, state offline 00:21:10.107 08:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:10.107 08:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:21:10.107 08:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:21:10.107 08:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:21:10.107 08:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:10.107 08:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:21:10.366 08:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:10.366 08:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:10.625 08:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:10.625 08:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:21:10.625 08:34:23 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:10.625 08:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:21:10.886 08:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:21:10.886 08:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:21:11.145 08:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:21:11.145 08:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:21:11.145 08:34:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:21:11.145 08:34:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:21:11.145 08:34:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:11.145 08:34:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:11.145 08:34:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:11.145 08:34:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:11.145 08:34:23 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:11.145 08:34:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:11.145 08:34:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:11.145 08:34:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:21:11.145 08:34:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:21:11.145 [2024-07-23 08:34:23.606809] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:21:11.145 [2024-07-23 08:34:23.608434] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:21:11.145 [2024-07-23 08:34:23.608483] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:21:11.145 [2024-07-23 08:34:23.608517] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:21:11.145 [2024-07-23 08:34:23.608563] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:21:11.145 [2024-07-23 08:34:23.608619] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:21:11.145 [2024-07-23 08:34:23.608639] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:21:11.145 [2024-07-23 08:34:23.608659] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:21:11.145 [2024-07-23 
08:34:23.608673] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:11.145 [2024-07-23 08:34:23.608684] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000037880 name raid_bdev1, state configuring 00:21:11.145 request: 00:21:11.145 { 00:21:11.145 "name": "raid_bdev1", 00:21:11.145 "raid_level": "raid1", 00:21:11.145 "base_bdevs": [ 00:21:11.145 "malloc1", 00:21:11.145 "malloc2", 00:21:11.145 "malloc3", 00:21:11.145 "malloc4" 00:21:11.145 ], 00:21:11.145 "superblock": false, 00:21:11.145 "method": "bdev_raid_create", 00:21:11.145 "req_id": 1 00:21:11.145 } 00:21:11.145 Got JSON-RPC error response 00:21:11.145 response: 00:21:11.145 { 00:21:11.145 "code": -17, 00:21:11.145 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:21:11.145 } 00:21:11.145 08:34:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:21:11.145 08:34:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:21:11.145 08:34:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:21:11.145 08:34:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:21:11.145 08:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:11.145 08:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:21:11.403 08:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:21:11.403 08:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:21:11.404 08:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:11.663 [2024-07-23 08:34:23.935604] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:11.663 [2024-07-23 08:34:23.935700] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:11.663 [2024-07-23 08:34:23.935717] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000037e80 00:21:11.663 [2024-07-23 08:34:23.935728] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:11.663 [2024-07-23 08:34:23.937700] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:11.663 [2024-07-23 08:34:23.937727] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:11.663 [2024-07-23 08:34:23.937818] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:21:11.663 [2024-07-23 08:34:23.937877] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:11.663 pt1 00:21:11.663 08:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:21:11.663 08:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:11.663 08:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:11.663 08:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:11.663 08:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:11.663 08:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:11.663 08:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:11.663 08:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:11.663 08:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:11.663 08:34:23 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:11.663 08:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:11.663 08:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:11.663 08:34:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:11.663 "name": "raid_bdev1", 00:21:11.663 "uuid": "9dbf2e72-574b-4ed3-9363-e250f62a33e7", 00:21:11.663 "strip_size_kb": 0, 00:21:11.663 "state": "configuring", 00:21:11.663 "raid_level": "raid1", 00:21:11.663 "superblock": true, 00:21:11.663 "num_base_bdevs": 4, 00:21:11.663 "num_base_bdevs_discovered": 1, 00:21:11.663 "num_base_bdevs_operational": 4, 00:21:11.663 "base_bdevs_list": [ 00:21:11.663 { 00:21:11.663 "name": "pt1", 00:21:11.663 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:11.663 "is_configured": true, 00:21:11.663 "data_offset": 2048, 00:21:11.663 "data_size": 63488 00:21:11.663 }, 00:21:11.663 { 00:21:11.663 "name": null, 00:21:11.663 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:11.663 "is_configured": false, 00:21:11.663 "data_offset": 2048, 00:21:11.663 "data_size": 63488 00:21:11.663 }, 00:21:11.663 { 00:21:11.663 "name": null, 00:21:11.663 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:11.663 "is_configured": false, 00:21:11.663 "data_offset": 2048, 00:21:11.663 "data_size": 63488 00:21:11.663 }, 00:21:11.663 { 00:21:11.663 "name": null, 00:21:11.663 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:11.663 "is_configured": false, 00:21:11.663 "data_offset": 2048, 00:21:11.663 "data_size": 63488 00:21:11.663 } 00:21:11.663 ] 00:21:11.663 }' 00:21:11.663 08:34:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:11.663 08:34:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set 
+x 00:21:12.231 08:34:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:21:12.231 08:34:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:12.231 [2024-07-23 08:34:24.729773] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:12.231 [2024-07-23 08:34:24.729829] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:12.231 [2024-07-23 08:34:24.729848] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000038780 00:21:12.231 [2024-07-23 08:34:24.729858] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:12.231 [2024-07-23 08:34:24.730297] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:12.231 [2024-07-23 08:34:24.730316] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:12.231 [2024-07-23 08:34:24.730387] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:12.231 [2024-07-23 08:34:24.730413] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:12.231 pt2 00:21:12.231 08:34:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:12.490 [2024-07-23 08:34:24.886225] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:21:12.490 08:34:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:21:12.490 08:34:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:12.490 08:34:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:21:12.490 08:34:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:12.490 08:34:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:12.490 08:34:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:12.490 08:34:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:12.490 08:34:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:12.490 08:34:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:12.490 08:34:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:12.490 08:34:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:12.490 08:34:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:12.750 08:34:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:12.750 "name": "raid_bdev1", 00:21:12.750 "uuid": "9dbf2e72-574b-4ed3-9363-e250f62a33e7", 00:21:12.750 "strip_size_kb": 0, 00:21:12.750 "state": "configuring", 00:21:12.750 "raid_level": "raid1", 00:21:12.750 "superblock": true, 00:21:12.750 "num_base_bdevs": 4, 00:21:12.750 "num_base_bdevs_discovered": 1, 00:21:12.750 "num_base_bdevs_operational": 4, 00:21:12.750 "base_bdevs_list": [ 00:21:12.750 { 00:21:12.750 "name": "pt1", 00:21:12.750 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:12.750 "is_configured": true, 00:21:12.750 "data_offset": 2048, 00:21:12.750 "data_size": 63488 00:21:12.750 }, 00:21:12.750 { 00:21:12.750 "name": null, 00:21:12.750 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:12.750 "is_configured": false, 00:21:12.750 "data_offset": 2048, 00:21:12.750 
"data_size": 63488 00:21:12.750 }, 00:21:12.750 { 00:21:12.750 "name": null, 00:21:12.750 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:12.750 "is_configured": false, 00:21:12.750 "data_offset": 2048, 00:21:12.750 "data_size": 63488 00:21:12.750 }, 00:21:12.750 { 00:21:12.750 "name": null, 00:21:12.750 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:12.750 "is_configured": false, 00:21:12.750 "data_offset": 2048, 00:21:12.750 "data_size": 63488 00:21:12.750 } 00:21:12.750 ] 00:21:12.750 }' 00:21:12.750 08:34:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:12.750 08:34:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:13.317 08:34:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:21:13.317 08:34:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:13.317 08:34:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:13.317 [2024-07-23 08:34:25.704345] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:13.317 [2024-07-23 08:34:25.704401] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:13.317 [2024-07-23 08:34:25.704423] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000038a80 00:21:13.317 [2024-07-23 08:34:25.704433] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:13.317 [2024-07-23 08:34:25.704904] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:13.317 [2024-07-23 08:34:25.704923] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:13.317 [2024-07-23 08:34:25.705000] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on 
bdev pt2 00:21:13.317 [2024-07-23 08:34:25.705020] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:13.317 pt2 00:21:13.317 08:34:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:21:13.317 08:34:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:13.317 08:34:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:21:13.576 [2024-07-23 08:34:25.864778] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:21:13.576 [2024-07-23 08:34:25.864826] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:13.576 [2024-07-23 08:34:25.864860] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000038d80 00:21:13.576 [2024-07-23 08:34:25.864869] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:13.576 [2024-07-23 08:34:25.865312] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:13.576 [2024-07-23 08:34:25.865328] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:21:13.576 [2024-07-23 08:34:25.865397] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:21:13.576 [2024-07-23 08:34:25.865418] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:21:13.576 pt3 00:21:13.576 08:34:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:21:13.576 08:34:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:13.576 08:34:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 
00000000-0000-0000-0000-000000000004 00:21:13.576 [2024-07-23 08:34:26.021190] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:21:13.576 [2024-07-23 08:34:26.021241] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:13.576 [2024-07-23 08:34:26.021260] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000039080 00:21:13.576 [2024-07-23 08:34:26.021268] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:13.576 [2024-07-23 08:34:26.021713] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:13.576 [2024-07-23 08:34:26.021732] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:21:13.576 [2024-07-23 08:34:26.021810] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:21:13.576 [2024-07-23 08:34:26.021832] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:21:13.576 [2024-07-23 08:34:26.022010] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000038480 00:21:13.576 [2024-07-23 08:34:26.022020] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:13.577 [2024-07-23 08:34:26.022247] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c060 00:21:13.577 [2024-07-23 08:34:26.022449] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000038480 00:21:13.577 [2024-07-23 08:34:26.022461] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000038480 00:21:13.577 [2024-07-23 08:34:26.022619] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:13.577 pt4 00:21:13.577 08:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:21:13.577 08:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 
00:21:13.577 08:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:21:13.577 08:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:13.577 08:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:13.577 08:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:13.577 08:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:13.577 08:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:13.577 08:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:13.577 08:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:13.577 08:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:13.577 08:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:13.577 08:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:13.577 08:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:13.836 08:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:13.836 "name": "raid_bdev1", 00:21:13.836 "uuid": "9dbf2e72-574b-4ed3-9363-e250f62a33e7", 00:21:13.836 "strip_size_kb": 0, 00:21:13.836 "state": "online", 00:21:13.836 "raid_level": "raid1", 00:21:13.836 "superblock": true, 00:21:13.836 "num_base_bdevs": 4, 00:21:13.836 "num_base_bdevs_discovered": 4, 00:21:13.836 "num_base_bdevs_operational": 4, 00:21:13.836 "base_bdevs_list": [ 00:21:13.836 { 00:21:13.836 "name": "pt1", 00:21:13.836 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:21:13.836 "is_configured": true, 00:21:13.836 "data_offset": 2048, 00:21:13.836 "data_size": 63488 00:21:13.836 }, 00:21:13.836 { 00:21:13.836 "name": "pt2", 00:21:13.836 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:13.836 "is_configured": true, 00:21:13.836 "data_offset": 2048, 00:21:13.836 "data_size": 63488 00:21:13.836 }, 00:21:13.836 { 00:21:13.836 "name": "pt3", 00:21:13.836 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:13.836 "is_configured": true, 00:21:13.836 "data_offset": 2048, 00:21:13.836 "data_size": 63488 00:21:13.836 }, 00:21:13.836 { 00:21:13.836 "name": "pt4", 00:21:13.836 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:13.836 "is_configured": true, 00:21:13.836 "data_offset": 2048, 00:21:13.836 "data_size": 63488 00:21:13.836 } 00:21:13.836 ] 00:21:13.836 }' 00:21:13.836 08:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:13.836 08:34:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:14.402 08:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:21:14.402 08:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:21:14.402 08:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:14.402 08:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:14.402 08:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:14.402 08:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:14.402 08:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:14.402 08:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:14.402 
[2024-07-23 08:34:26.807583] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:14.402 08:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:14.402 "name": "raid_bdev1", 00:21:14.402 "aliases": [ 00:21:14.402 "9dbf2e72-574b-4ed3-9363-e250f62a33e7" 00:21:14.402 ], 00:21:14.402 "product_name": "Raid Volume", 00:21:14.402 "block_size": 512, 00:21:14.402 "num_blocks": 63488, 00:21:14.402 "uuid": "9dbf2e72-574b-4ed3-9363-e250f62a33e7", 00:21:14.402 "assigned_rate_limits": { 00:21:14.402 "rw_ios_per_sec": 0, 00:21:14.402 "rw_mbytes_per_sec": 0, 00:21:14.402 "r_mbytes_per_sec": 0, 00:21:14.402 "w_mbytes_per_sec": 0 00:21:14.402 }, 00:21:14.402 "claimed": false, 00:21:14.402 "zoned": false, 00:21:14.402 "supported_io_types": { 00:21:14.402 "read": true, 00:21:14.402 "write": true, 00:21:14.402 "unmap": false, 00:21:14.402 "flush": false, 00:21:14.402 "reset": true, 00:21:14.402 "nvme_admin": false, 00:21:14.402 "nvme_io": false, 00:21:14.402 "nvme_io_md": false, 00:21:14.402 "write_zeroes": true, 00:21:14.402 "zcopy": false, 00:21:14.402 "get_zone_info": false, 00:21:14.402 "zone_management": false, 00:21:14.402 "zone_append": false, 00:21:14.402 "compare": false, 00:21:14.402 "compare_and_write": false, 00:21:14.402 "abort": false, 00:21:14.402 "seek_hole": false, 00:21:14.402 "seek_data": false, 00:21:14.402 "copy": false, 00:21:14.402 "nvme_iov_md": false 00:21:14.402 }, 00:21:14.402 "memory_domains": [ 00:21:14.402 { 00:21:14.402 "dma_device_id": "system", 00:21:14.402 "dma_device_type": 1 00:21:14.402 }, 00:21:14.402 { 00:21:14.402 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:14.402 "dma_device_type": 2 00:21:14.402 }, 00:21:14.402 { 00:21:14.402 "dma_device_id": "system", 00:21:14.402 "dma_device_type": 1 00:21:14.402 }, 00:21:14.402 { 00:21:14.402 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:14.402 "dma_device_type": 2 00:21:14.402 }, 00:21:14.402 { 00:21:14.402 "dma_device_id": 
"system", 00:21:14.402 "dma_device_type": 1 00:21:14.402 }, 00:21:14.402 { 00:21:14.402 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:14.402 "dma_device_type": 2 00:21:14.402 }, 00:21:14.402 { 00:21:14.402 "dma_device_id": "system", 00:21:14.402 "dma_device_type": 1 00:21:14.402 }, 00:21:14.402 { 00:21:14.402 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:14.402 "dma_device_type": 2 00:21:14.402 } 00:21:14.402 ], 00:21:14.402 "driver_specific": { 00:21:14.402 "raid": { 00:21:14.402 "uuid": "9dbf2e72-574b-4ed3-9363-e250f62a33e7", 00:21:14.402 "strip_size_kb": 0, 00:21:14.402 "state": "online", 00:21:14.402 "raid_level": "raid1", 00:21:14.402 "superblock": true, 00:21:14.402 "num_base_bdevs": 4, 00:21:14.402 "num_base_bdevs_discovered": 4, 00:21:14.402 "num_base_bdevs_operational": 4, 00:21:14.402 "base_bdevs_list": [ 00:21:14.402 { 00:21:14.402 "name": "pt1", 00:21:14.402 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:14.402 "is_configured": true, 00:21:14.402 "data_offset": 2048, 00:21:14.402 "data_size": 63488 00:21:14.402 }, 00:21:14.402 { 00:21:14.402 "name": "pt2", 00:21:14.402 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:14.402 "is_configured": true, 00:21:14.402 "data_offset": 2048, 00:21:14.402 "data_size": 63488 00:21:14.402 }, 00:21:14.402 { 00:21:14.402 "name": "pt3", 00:21:14.402 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:14.402 "is_configured": true, 00:21:14.402 "data_offset": 2048, 00:21:14.402 "data_size": 63488 00:21:14.402 }, 00:21:14.402 { 00:21:14.402 "name": "pt4", 00:21:14.402 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:14.402 "is_configured": true, 00:21:14.402 "data_offset": 2048, 00:21:14.402 "data_size": 63488 00:21:14.402 } 00:21:14.402 ] 00:21:14.402 } 00:21:14.402 } 00:21:14.402 }' 00:21:14.403 08:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:14.403 08:34:26 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:21:14.403 pt2 00:21:14.403 pt3 00:21:14.403 pt4' 00:21:14.403 08:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:14.403 08:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:14.403 08:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:14.661 08:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:14.661 "name": "pt1", 00:21:14.661 "aliases": [ 00:21:14.661 "00000000-0000-0000-0000-000000000001" 00:21:14.661 ], 00:21:14.661 "product_name": "passthru", 00:21:14.661 "block_size": 512, 00:21:14.661 "num_blocks": 65536, 00:21:14.661 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:14.661 "assigned_rate_limits": { 00:21:14.661 "rw_ios_per_sec": 0, 00:21:14.661 "rw_mbytes_per_sec": 0, 00:21:14.661 "r_mbytes_per_sec": 0, 00:21:14.661 "w_mbytes_per_sec": 0 00:21:14.661 }, 00:21:14.661 "claimed": true, 00:21:14.661 "claim_type": "exclusive_write", 00:21:14.661 "zoned": false, 00:21:14.661 "supported_io_types": { 00:21:14.661 "read": true, 00:21:14.661 "write": true, 00:21:14.661 "unmap": true, 00:21:14.661 "flush": true, 00:21:14.661 "reset": true, 00:21:14.661 "nvme_admin": false, 00:21:14.661 "nvme_io": false, 00:21:14.661 "nvme_io_md": false, 00:21:14.661 "write_zeroes": true, 00:21:14.661 "zcopy": true, 00:21:14.661 "get_zone_info": false, 00:21:14.661 "zone_management": false, 00:21:14.661 "zone_append": false, 00:21:14.661 "compare": false, 00:21:14.661 "compare_and_write": false, 00:21:14.661 "abort": true, 00:21:14.661 "seek_hole": false, 00:21:14.661 "seek_data": false, 00:21:14.661 "copy": true, 00:21:14.661 "nvme_iov_md": false 00:21:14.661 }, 00:21:14.661 "memory_domains": [ 00:21:14.661 { 00:21:14.661 "dma_device_id": 
"system", 00:21:14.661 "dma_device_type": 1 00:21:14.661 }, 00:21:14.661 { 00:21:14.661 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:14.661 "dma_device_type": 2 00:21:14.661 } 00:21:14.661 ], 00:21:14.661 "driver_specific": { 00:21:14.661 "passthru": { 00:21:14.661 "name": "pt1", 00:21:14.661 "base_bdev_name": "malloc1" 00:21:14.661 } 00:21:14.661 } 00:21:14.661 }' 00:21:14.661 08:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:14.661 08:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:14.661 08:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:14.661 08:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:14.661 08:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:14.919 08:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:14.919 08:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:14.919 08:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:14.919 08:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:14.919 08:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:14.919 08:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:14.919 08:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:14.919 08:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:14.919 08:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:21:14.919 08:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:15.178 08:34:27 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:15.178 "name": "pt2", 00:21:15.178 "aliases": [ 00:21:15.178 "00000000-0000-0000-0000-000000000002" 00:21:15.178 ], 00:21:15.178 "product_name": "passthru", 00:21:15.178 "block_size": 512, 00:21:15.178 "num_blocks": 65536, 00:21:15.178 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:15.178 "assigned_rate_limits": { 00:21:15.178 "rw_ios_per_sec": 0, 00:21:15.178 "rw_mbytes_per_sec": 0, 00:21:15.178 "r_mbytes_per_sec": 0, 00:21:15.178 "w_mbytes_per_sec": 0 00:21:15.178 }, 00:21:15.178 "claimed": true, 00:21:15.178 "claim_type": "exclusive_write", 00:21:15.178 "zoned": false, 00:21:15.178 "supported_io_types": { 00:21:15.178 "read": true, 00:21:15.178 "write": true, 00:21:15.178 "unmap": true, 00:21:15.178 "flush": true, 00:21:15.178 "reset": true, 00:21:15.178 "nvme_admin": false, 00:21:15.178 "nvme_io": false, 00:21:15.178 "nvme_io_md": false, 00:21:15.178 "write_zeroes": true, 00:21:15.178 "zcopy": true, 00:21:15.178 "get_zone_info": false, 00:21:15.178 "zone_management": false, 00:21:15.178 "zone_append": false, 00:21:15.178 "compare": false, 00:21:15.178 "compare_and_write": false, 00:21:15.178 "abort": true, 00:21:15.178 "seek_hole": false, 00:21:15.178 "seek_data": false, 00:21:15.178 "copy": true, 00:21:15.178 "nvme_iov_md": false 00:21:15.178 }, 00:21:15.178 "memory_domains": [ 00:21:15.178 { 00:21:15.178 "dma_device_id": "system", 00:21:15.178 "dma_device_type": 1 00:21:15.178 }, 00:21:15.178 { 00:21:15.178 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:15.178 "dma_device_type": 2 00:21:15.178 } 00:21:15.178 ], 00:21:15.178 "driver_specific": { 00:21:15.178 "passthru": { 00:21:15.178 "name": "pt2", 00:21:15.178 "base_bdev_name": "malloc2" 00:21:15.178 } 00:21:15.178 } 00:21:15.178 }' 00:21:15.178 08:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:15.178 08:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:21:15.178 08:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:15.179 08:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:15.179 08:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:15.179 08:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:15.179 08:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:15.179 08:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:15.437 08:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:15.437 08:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:15.437 08:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:15.437 08:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:15.437 08:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:15.437 08:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:21:15.437 08:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:15.437 08:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:15.437 "name": "pt3", 00:21:15.437 "aliases": [ 00:21:15.437 "00000000-0000-0000-0000-000000000003" 00:21:15.437 ], 00:21:15.437 "product_name": "passthru", 00:21:15.437 "block_size": 512, 00:21:15.437 "num_blocks": 65536, 00:21:15.437 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:15.437 "assigned_rate_limits": { 00:21:15.437 "rw_ios_per_sec": 0, 00:21:15.437 "rw_mbytes_per_sec": 0, 00:21:15.437 "r_mbytes_per_sec": 0, 00:21:15.437 "w_mbytes_per_sec": 0 00:21:15.437 }, 
00:21:15.437 "claimed": true, 00:21:15.437 "claim_type": "exclusive_write", 00:21:15.437 "zoned": false, 00:21:15.437 "supported_io_types": { 00:21:15.437 "read": true, 00:21:15.437 "write": true, 00:21:15.437 "unmap": true, 00:21:15.437 "flush": true, 00:21:15.437 "reset": true, 00:21:15.438 "nvme_admin": false, 00:21:15.438 "nvme_io": false, 00:21:15.438 "nvme_io_md": false, 00:21:15.438 "write_zeroes": true, 00:21:15.438 "zcopy": true, 00:21:15.438 "get_zone_info": false, 00:21:15.438 "zone_management": false, 00:21:15.438 "zone_append": false, 00:21:15.438 "compare": false, 00:21:15.438 "compare_and_write": false, 00:21:15.438 "abort": true, 00:21:15.438 "seek_hole": false, 00:21:15.438 "seek_data": false, 00:21:15.438 "copy": true, 00:21:15.438 "nvme_iov_md": false 00:21:15.438 }, 00:21:15.438 "memory_domains": [ 00:21:15.438 { 00:21:15.438 "dma_device_id": "system", 00:21:15.438 "dma_device_type": 1 00:21:15.438 }, 00:21:15.438 { 00:21:15.438 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:15.438 "dma_device_type": 2 00:21:15.438 } 00:21:15.438 ], 00:21:15.438 "driver_specific": { 00:21:15.438 "passthru": { 00:21:15.438 "name": "pt3", 00:21:15.438 "base_bdev_name": "malloc3" 00:21:15.438 } 00:21:15.438 } 00:21:15.438 }' 00:21:15.438 08:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:15.438 08:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:15.696 08:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:15.696 08:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:15.696 08:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:15.696 08:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:15.696 08:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:15.696 08:34:28 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:15.696 08:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:15.696 08:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:15.696 08:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:15.696 08:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:15.696 08:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:15.696 08:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:15.696 08:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:21:15.955 08:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:15.955 "name": "pt4", 00:21:15.955 "aliases": [ 00:21:15.955 "00000000-0000-0000-0000-000000000004" 00:21:15.955 ], 00:21:15.955 "product_name": "passthru", 00:21:15.955 "block_size": 512, 00:21:15.955 "num_blocks": 65536, 00:21:15.955 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:15.955 "assigned_rate_limits": { 00:21:15.955 "rw_ios_per_sec": 0, 00:21:15.955 "rw_mbytes_per_sec": 0, 00:21:15.955 "r_mbytes_per_sec": 0, 00:21:15.955 "w_mbytes_per_sec": 0 00:21:15.955 }, 00:21:15.955 "claimed": true, 00:21:15.955 "claim_type": "exclusive_write", 00:21:15.955 "zoned": false, 00:21:15.955 "supported_io_types": { 00:21:15.955 "read": true, 00:21:15.955 "write": true, 00:21:15.955 "unmap": true, 00:21:15.955 "flush": true, 00:21:15.955 "reset": true, 00:21:15.955 "nvme_admin": false, 00:21:15.955 "nvme_io": false, 00:21:15.955 "nvme_io_md": false, 00:21:15.955 "write_zeroes": true, 00:21:15.955 "zcopy": true, 00:21:15.955 "get_zone_info": false, 00:21:15.955 "zone_management": false, 00:21:15.955 "zone_append": false, 00:21:15.955 
"compare": false, 00:21:15.955 "compare_and_write": false, 00:21:15.955 "abort": true, 00:21:15.955 "seek_hole": false, 00:21:15.955 "seek_data": false, 00:21:15.955 "copy": true, 00:21:15.955 "nvme_iov_md": false 00:21:15.955 }, 00:21:15.955 "memory_domains": [ 00:21:15.955 { 00:21:15.955 "dma_device_id": "system", 00:21:15.955 "dma_device_type": 1 00:21:15.955 }, 00:21:15.955 { 00:21:15.955 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:15.955 "dma_device_type": 2 00:21:15.955 } 00:21:15.955 ], 00:21:15.955 "driver_specific": { 00:21:15.955 "passthru": { 00:21:15.955 "name": "pt4", 00:21:15.955 "base_bdev_name": "malloc4" 00:21:15.955 } 00:21:15.955 } 00:21:15.955 }' 00:21:15.955 08:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:15.955 08:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:16.213 08:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:16.213 08:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:16.213 08:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:16.213 08:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:16.213 08:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:16.213 08:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:16.213 08:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:16.213 08:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:16.213 08:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:16.213 08:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:16.213 08:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:16.213 08:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:21:16.472 [2024-07-23 08:34:28.853017] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:16.472 08:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 9dbf2e72-574b-4ed3-9363-e250f62a33e7 '!=' 9dbf2e72-574b-4ed3-9363-e250f62a33e7 ']' 00:21:16.472 08:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:21:16.472 08:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:16.472 08:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:21:16.472 08:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:21:16.730 [2024-07-23 08:34:29.017181] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:21:16.730 08:34:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:16.730 08:34:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:16.730 08:34:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:16.730 08:34:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:16.730 08:34:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:16.730 08:34:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:16.731 08:34:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:16.731 08:34:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:21:16.731 08:34:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:16.731 08:34:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:16.731 08:34:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:16.731 08:34:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:16.731 08:34:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:16.731 "name": "raid_bdev1", 00:21:16.731 "uuid": "9dbf2e72-574b-4ed3-9363-e250f62a33e7", 00:21:16.731 "strip_size_kb": 0, 00:21:16.731 "state": "online", 00:21:16.731 "raid_level": "raid1", 00:21:16.731 "superblock": true, 00:21:16.731 "num_base_bdevs": 4, 00:21:16.731 "num_base_bdevs_discovered": 3, 00:21:16.731 "num_base_bdevs_operational": 3, 00:21:16.731 "base_bdevs_list": [ 00:21:16.731 { 00:21:16.731 "name": null, 00:21:16.731 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:16.731 "is_configured": false, 00:21:16.731 "data_offset": 2048, 00:21:16.731 "data_size": 63488 00:21:16.731 }, 00:21:16.731 { 00:21:16.731 "name": "pt2", 00:21:16.731 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:16.731 "is_configured": true, 00:21:16.731 "data_offset": 2048, 00:21:16.731 "data_size": 63488 00:21:16.731 }, 00:21:16.731 { 00:21:16.731 "name": "pt3", 00:21:16.731 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:16.731 "is_configured": true, 00:21:16.731 "data_offset": 2048, 00:21:16.731 "data_size": 63488 00:21:16.731 }, 00:21:16.731 { 00:21:16.731 "name": "pt4", 00:21:16.731 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:16.731 "is_configured": true, 00:21:16.731 "data_offset": 2048, 00:21:16.731 "data_size": 63488 00:21:16.731 } 00:21:16.731 ] 00:21:16.731 }' 00:21:16.731 08:34:29 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:16.731 08:34:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:17.297 08:34:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:17.556 [2024-07-23 08:34:29.835307] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:17.556 [2024-07-23 08:34:29.835336] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:17.556 [2024-07-23 08:34:29.835405] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:17.556 [2024-07-23 08:34:29.835483] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:17.556 [2024-07-23 08:34:29.835495] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000038480 name raid_bdev1, state offline 00:21:17.556 08:34:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:17.556 08:34:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:21:17.556 08:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:21:17.556 08:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:21:17.556 08:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:21:17.556 08:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:21:17.556 08:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:17.815 08:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 
00:21:17.815 08:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:21:17.815 08:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:21:18.074 08:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:21:18.074 08:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:21:18.074 08:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:21:18.074 08:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:21:18.074 08:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:21:18.074 08:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:21:18.074 08:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:21:18.074 08:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:18.332 [2024-07-23 08:34:30.649445] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:18.332 [2024-07-23 08:34:30.649500] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:18.332 [2024-07-23 08:34:30.649534] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000039380 00:21:18.332 [2024-07-23 08:34:30.649543] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:18.332 [2024-07-23 08:34:30.651572] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:18.332 [2024-07-23 
08:34:30.651599] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:18.332 [2024-07-23 08:34:30.651691] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:18.332 [2024-07-23 08:34:30.651732] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:18.332 pt2 00:21:18.332 08:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:21:18.332 08:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:18.332 08:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:18.332 08:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:18.332 08:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:18.332 08:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:18.332 08:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:18.332 08:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:18.332 08:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:18.332 08:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:18.332 08:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:18.332 08:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:18.332 08:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:18.332 "name": "raid_bdev1", 00:21:18.332 "uuid": "9dbf2e72-574b-4ed3-9363-e250f62a33e7", 
00:21:18.332 "strip_size_kb": 0, 00:21:18.332 "state": "configuring", 00:21:18.332 "raid_level": "raid1", 00:21:18.332 "superblock": true, 00:21:18.332 "num_base_bdevs": 4, 00:21:18.332 "num_base_bdevs_discovered": 1, 00:21:18.332 "num_base_bdevs_operational": 3, 00:21:18.332 "base_bdevs_list": [ 00:21:18.332 { 00:21:18.332 "name": null, 00:21:18.332 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:18.332 "is_configured": false, 00:21:18.332 "data_offset": 2048, 00:21:18.332 "data_size": 63488 00:21:18.332 }, 00:21:18.332 { 00:21:18.332 "name": "pt2", 00:21:18.332 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:18.332 "is_configured": true, 00:21:18.332 "data_offset": 2048, 00:21:18.332 "data_size": 63488 00:21:18.332 }, 00:21:18.332 { 00:21:18.332 "name": null, 00:21:18.332 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:18.332 "is_configured": false, 00:21:18.332 "data_offset": 2048, 00:21:18.332 "data_size": 63488 00:21:18.332 }, 00:21:18.333 { 00:21:18.333 "name": null, 00:21:18.333 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:18.333 "is_configured": false, 00:21:18.333 "data_offset": 2048, 00:21:18.333 "data_size": 63488 00:21:18.333 } 00:21:18.333 ] 00:21:18.333 }' 00:21:18.333 08:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:18.333 08:34:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:18.911 08:34:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:21:18.911 08:34:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:21:18.911 08:34:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:21:19.169 [2024-07-23 08:34:31.455586] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:21:19.169 
[2024-07-23 08:34:31.455644] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:19.169 [2024-07-23 08:34:31.455680] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000039c80 00:21:19.169 [2024-07-23 08:34:31.455689] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:19.169 [2024-07-23 08:34:31.456169] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:19.169 [2024-07-23 08:34:31.456187] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:21:19.169 [2024-07-23 08:34:31.456261] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:21:19.169 [2024-07-23 08:34:31.456282] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:21:19.169 pt3 00:21:19.169 08:34:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:21:19.169 08:34:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:19.169 08:34:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:19.169 08:34:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:19.169 08:34:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:19.169 08:34:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:19.169 08:34:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:19.169 08:34:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:19.169 08:34:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:19.169 08:34:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:19.169 08:34:31 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:19.169 08:34:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:19.169 08:34:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:19.169 "name": "raid_bdev1", 00:21:19.169 "uuid": "9dbf2e72-574b-4ed3-9363-e250f62a33e7", 00:21:19.169 "strip_size_kb": 0, 00:21:19.169 "state": "configuring", 00:21:19.169 "raid_level": "raid1", 00:21:19.169 "superblock": true, 00:21:19.169 "num_base_bdevs": 4, 00:21:19.169 "num_base_bdevs_discovered": 2, 00:21:19.169 "num_base_bdevs_operational": 3, 00:21:19.169 "base_bdevs_list": [ 00:21:19.169 { 00:21:19.169 "name": null, 00:21:19.169 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:19.169 "is_configured": false, 00:21:19.169 "data_offset": 2048, 00:21:19.169 "data_size": 63488 00:21:19.169 }, 00:21:19.169 { 00:21:19.170 "name": "pt2", 00:21:19.170 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:19.170 "is_configured": true, 00:21:19.170 "data_offset": 2048, 00:21:19.170 "data_size": 63488 00:21:19.170 }, 00:21:19.170 { 00:21:19.170 "name": "pt3", 00:21:19.170 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:19.170 "is_configured": true, 00:21:19.170 "data_offset": 2048, 00:21:19.170 "data_size": 63488 00:21:19.170 }, 00:21:19.170 { 00:21:19.170 "name": null, 00:21:19.170 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:19.170 "is_configured": false, 00:21:19.170 "data_offset": 2048, 00:21:19.170 "data_size": 63488 00:21:19.170 } 00:21:19.170 ] 00:21:19.170 }' 00:21:19.170 08:34:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:19.170 08:34:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:19.774 08:34:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ 
)) 00:21:19.774 08:34:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:21:19.774 08:34:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=3 00:21:19.774 08:34:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:21:19.774 [2024-07-23 08:34:32.269765] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:21:19.774 [2024-07-23 08:34:32.269826] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:19.774 [2024-07-23 08:34:32.269863] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000039f80 00:21:19.774 [2024-07-23 08:34:32.269872] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:19.774 [2024-07-23 08:34:32.270356] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:19.774 [2024-07-23 08:34:32.270374] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:21:19.774 [2024-07-23 08:34:32.270453] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:21:19.774 [2024-07-23 08:34:32.270475] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:21:19.774 [2024-07-23 08:34:32.270630] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000039980 00:21:19.774 [2024-07-23 08:34:32.270640] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:19.774 [2024-07-23 08:34:32.270867] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c130 00:21:19.774 [2024-07-23 08:34:32.271044] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000039980 00:21:19.774 [2024-07-23 08:34:32.271055] 
bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000039980 00:21:19.774 [2024-07-23 08:34:32.271196] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:19.774 pt4 00:21:19.774 08:34:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:19.774 08:34:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:19.774 08:34:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:19.774 08:34:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:20.032 08:34:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:20.032 08:34:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:20.032 08:34:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:20.032 08:34:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:20.032 08:34:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:20.032 08:34:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:20.032 08:34:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:20.032 08:34:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:20.032 08:34:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:20.032 "name": "raid_bdev1", 00:21:20.032 "uuid": "9dbf2e72-574b-4ed3-9363-e250f62a33e7", 00:21:20.032 "strip_size_kb": 0, 00:21:20.032 "state": "online", 00:21:20.032 "raid_level": "raid1", 00:21:20.032 "superblock": 
true, 00:21:20.032 "num_base_bdevs": 4, 00:21:20.032 "num_base_bdevs_discovered": 3, 00:21:20.032 "num_base_bdevs_operational": 3, 00:21:20.032 "base_bdevs_list": [ 00:21:20.032 { 00:21:20.032 "name": null, 00:21:20.032 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:20.032 "is_configured": false, 00:21:20.032 "data_offset": 2048, 00:21:20.032 "data_size": 63488 00:21:20.032 }, 00:21:20.032 { 00:21:20.032 "name": "pt2", 00:21:20.032 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:20.032 "is_configured": true, 00:21:20.032 "data_offset": 2048, 00:21:20.032 "data_size": 63488 00:21:20.032 }, 00:21:20.032 { 00:21:20.032 "name": "pt3", 00:21:20.032 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:20.032 "is_configured": true, 00:21:20.032 "data_offset": 2048, 00:21:20.032 "data_size": 63488 00:21:20.032 }, 00:21:20.032 { 00:21:20.032 "name": "pt4", 00:21:20.032 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:20.032 "is_configured": true, 00:21:20.032 "data_offset": 2048, 00:21:20.032 "data_size": 63488 00:21:20.032 } 00:21:20.032 ] 00:21:20.032 }' 00:21:20.032 08:34:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:20.032 08:34:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:20.599 08:34:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:20.599 [2024-07-23 08:34:33.104028] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:20.599 [2024-07-23 08:34:33.104065] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:20.599 [2024-07-23 08:34:33.104136] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:20.599 [2024-07-23 08:34:33.104205] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in 
destruct 00:21:20.599 [2024-07-23 08:34:33.104216] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000039980 name raid_bdev1, state offline 00:21:20.858 08:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:21:20.858 08:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:20.858 08:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:21:20.858 08:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:21:20.858 08:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 4 -gt 2 ']' 00:21:20.858 08:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=3 00:21:20.858 08:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:21:21.117 08:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:21.117 [2024-07-23 08:34:33.597302] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:21.117 [2024-07-23 08:34:33.597361] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:21.117 [2024-07-23 08:34:33.597377] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x61600003a280 00:21:21.117 [2024-07-23 08:34:33.597412] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:21.117 [2024-07-23 08:34:33.599333] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:21.117 [2024-07-23 08:34:33.599362] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: 
pt1 00:21:21.117 [2024-07-23 08:34:33.599439] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:21:21.117 [2024-07-23 08:34:33.599480] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:21.117 [2024-07-23 08:34:33.599637] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:21:21.117 [2024-07-23 08:34:33.599652] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:21.117 [2024-07-23 08:34:33.599667] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x61600003a880 name raid_bdev1, state configuring 00:21:21.117 [2024-07-23 08:34:33.599731] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:21.117 [2024-07-23 08:34:33.599820] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:21:21.117 pt1 00:21:21.117 08:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 4 -gt 2 ']' 00:21:21.117 08:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:21:21.117 08:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:21.117 08:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:21.117 08:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:21.117 08:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:21.117 08:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:21.117 08:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:21.117 08:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:21.117 08:34:33 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:21.117 08:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:21.117 08:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:21.117 08:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:21.375 08:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:21.375 "name": "raid_bdev1", 00:21:21.375 "uuid": "9dbf2e72-574b-4ed3-9363-e250f62a33e7", 00:21:21.375 "strip_size_kb": 0, 00:21:21.375 "state": "configuring", 00:21:21.375 "raid_level": "raid1", 00:21:21.375 "superblock": true, 00:21:21.375 "num_base_bdevs": 4, 00:21:21.375 "num_base_bdevs_discovered": 2, 00:21:21.375 "num_base_bdevs_operational": 3, 00:21:21.375 "base_bdevs_list": [ 00:21:21.375 { 00:21:21.375 "name": null, 00:21:21.375 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:21.375 "is_configured": false, 00:21:21.375 "data_offset": 2048, 00:21:21.375 "data_size": 63488 00:21:21.375 }, 00:21:21.375 { 00:21:21.375 "name": "pt2", 00:21:21.375 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:21.375 "is_configured": true, 00:21:21.375 "data_offset": 2048, 00:21:21.375 "data_size": 63488 00:21:21.375 }, 00:21:21.375 { 00:21:21.375 "name": "pt3", 00:21:21.375 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:21.375 "is_configured": true, 00:21:21.375 "data_offset": 2048, 00:21:21.375 "data_size": 63488 00:21:21.375 }, 00:21:21.375 { 00:21:21.375 "name": null, 00:21:21.375 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:21.375 "is_configured": false, 00:21:21.375 "data_offset": 2048, 00:21:21.375 "data_size": 63488 00:21:21.375 } 00:21:21.375 ] 00:21:21.375 }' 00:21:21.375 08:34:33 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:21.375 08:34:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:21.939 08:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:21:21.939 08:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:21:21.939 08:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:21:21.939 08:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:21:22.197 [2024-07-23 08:34:34.539782] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:21:22.197 [2024-07-23 08:34:34.539841] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:22.197 [2024-07-23 08:34:34.539877] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x61600003ae80 00:21:22.197 [2024-07-23 08:34:34.539887] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:22.197 [2024-07-23 08:34:34.540345] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:22.197 [2024-07-23 08:34:34.540363] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:21:22.197 [2024-07-23 08:34:34.540447] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:21:22.197 [2024-07-23 08:34:34.540469] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:21:22.197 [2024-07-23 08:34:34.540630] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x61600003ab80 00:21:22.197 [2024-07-23 08:34:34.540641] 
bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:22.197 [2024-07-23 08:34:34.540856] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c200 00:21:22.197 [2024-07-23 08:34:34.541018] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x61600003ab80 00:21:22.197 [2024-07-23 08:34:34.541029] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x61600003ab80 00:21:22.197 [2024-07-23 08:34:34.541169] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:22.197 pt4 00:21:22.197 08:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:22.197 08:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:22.197 08:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:22.197 08:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:22.197 08:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:22.197 08:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:22.197 08:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:22.197 08:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:22.197 08:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:22.197 08:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:22.197 08:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:22.197 08:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 
-- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:22.197 08:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:22.197 "name": "raid_bdev1", 00:21:22.197 "uuid": "9dbf2e72-574b-4ed3-9363-e250f62a33e7", 00:21:22.197 "strip_size_kb": 0, 00:21:22.197 "state": "online", 00:21:22.197 "raid_level": "raid1", 00:21:22.197 "superblock": true, 00:21:22.197 "num_base_bdevs": 4, 00:21:22.197 "num_base_bdevs_discovered": 3, 00:21:22.197 "num_base_bdevs_operational": 3, 00:21:22.197 "base_bdevs_list": [ 00:21:22.197 { 00:21:22.197 "name": null, 00:21:22.197 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:22.197 "is_configured": false, 00:21:22.197 "data_offset": 2048, 00:21:22.197 "data_size": 63488 00:21:22.197 }, 00:21:22.197 { 00:21:22.197 "name": "pt2", 00:21:22.197 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:22.197 "is_configured": true, 00:21:22.197 "data_offset": 2048, 00:21:22.197 "data_size": 63488 00:21:22.197 }, 00:21:22.197 { 00:21:22.197 "name": "pt3", 00:21:22.197 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:22.197 "is_configured": true, 00:21:22.197 "data_offset": 2048, 00:21:22.197 "data_size": 63488 00:21:22.197 }, 00:21:22.197 { 00:21:22.197 "name": "pt4", 00:21:22.197 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:22.197 "is_configured": true, 00:21:22.197 "data_offset": 2048, 00:21:22.197 "data_size": 63488 00:21:22.197 } 00:21:22.197 ] 00:21:22.197 }' 00:21:22.197 08:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:22.197 08:34:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:22.764 08:34:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:21:22.764 08:34:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:21:23.022 
08:34:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:21:23.022 08:34:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:21:23.022 08:34:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:23.022 [2024-07-23 08:34:35.506594] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:23.022 08:34:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 9dbf2e72-574b-4ed3-9363-e250f62a33e7 '!=' 9dbf2e72-574b-4ed3-9363-e250f62a33e7 ']' 00:21:23.022 08:34:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1515457 00:21:23.022 08:34:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1515457 ']' 00:21:23.022 08:34:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1515457 00:21:23.022 08:34:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:21:23.022 08:34:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:23.022 08:34:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1515457 00:21:23.281 08:34:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:23.281 08:34:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:23.281 08:34:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1515457' 00:21:23.281 killing process with pid 1515457 00:21:23.281 08:34:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1515457 00:21:23.281 08:34:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1515457 00:21:23.281 [2024-07-23 
08:34:35.564930] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:23.281 [2024-07-23 08:34:35.565025] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:23.281 [2024-07-23 08:34:35.565102] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:23.281 [2024-07-23 08:34:35.565115] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x61600003ab80 name raid_bdev1, state offline 00:21:23.540 [2024-07-23 08:34:35.886598] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:24.917 08:34:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:21:24.918 00:21:24.918 real 0m20.198s 00:21:24.918 user 0m36.111s 00:21:24.918 sys 0m2.794s 00:21:24.918 08:34:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:24.918 08:34:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:24.918 ************************************ 00:21:24.918 END TEST raid_superblock_test 00:21:24.918 ************************************ 00:21:24.918 08:34:37 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:24.918 08:34:37 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 4 read 00:21:24.918 08:34:37 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:21:24.918 08:34:37 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:24.918 08:34:37 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:24.918 ************************************ 00:21:24.918 START TEST raid_read_error_test 00:21:24.918 ************************************ 00:21:24.918 08:34:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 read 00:21:24.918 08:34:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:21:24.918 08:34:37 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:21:24.918 08:34:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:21:24.918 08:34:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:21:24.918 08:34:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:24.918 08:34:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:21:24.918 08:34:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:24.918 08:34:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:24.918 08:34:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:21:24.918 08:34:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:24.918 08:34:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:24.918 08:34:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:21:24.918 08:34:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:24.918 08:34:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:24.918 08:34:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:21:24.918 08:34:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:24.918 08:34:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:24.918 08:34:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:24.918 08:34:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:21:24.918 08:34:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:21:24.918 08:34:37 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:21:24.918 08:34:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:21:24.918 08:34:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:21:24.918 08:34:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:21:24.918 08:34:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:21:24.918 08:34:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:21:24.918 08:34:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:21:24.918 08:34:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.mJvCFevK1H 00:21:24.918 08:34:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1519736 00:21:24.918 08:34:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1519736 /var/tmp/spdk-raid.sock 00:21:24.918 08:34:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1519736 ']' 00:21:24.918 08:34:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:24.918 08:34:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:24.918 08:34:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:24.918 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:21:24.918 08:34:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:24.918 08:34:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:24.918 08:34:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:21:24.918 [2024-07-23 08:34:37.323700] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:21:24.918 [2024-07-23 08:34:37.323799] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1519736 ] 00:21:25.176 [2024-07-23 08:34:37.445419] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:25.176 [2024-07-23 08:34:37.668916] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:25.435 [2024-07-23 08:34:37.953666] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:25.435 [2024-07-23 08:34:37.953696] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:25.694 08:34:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:25.694 08:34:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:21:25.694 08:34:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:25.694 08:34:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:25.953 BaseBdev1_malloc 00:21:25.954 08:34:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:21:25.954 true 00:21:25.954 08:34:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:21:26.212 [2024-07-23 08:34:38.617099] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:21:26.212 [2024-07-23 08:34:38.617156] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:26.212 [2024-07-23 08:34:38.617193] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034b80 00:21:26.212 [2024-07-23 08:34:38.617204] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:26.212 [2024-07-23 08:34:38.619207] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:26.212 [2024-07-23 08:34:38.619236] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:26.212 BaseBdev1 00:21:26.212 08:34:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:26.212 08:34:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:26.470 BaseBdev2_malloc 00:21:26.470 08:34:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:21:26.470 true 00:21:26.728 08:34:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:21:26.728 [2024-07-23 08:34:39.164928] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:21:26.728 [2024-07-23 08:34:39.164981] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:26.728 [2024-07-23 08:34:39.165017] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035a80 00:21:26.728 [2024-07-23 08:34:39.165032] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:26.728 [2024-07-23 08:34:39.167201] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:26.728 [2024-07-23 08:34:39.167230] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:26.728 BaseBdev2 00:21:26.728 08:34:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:26.728 08:34:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:21:26.986 BaseBdev3_malloc 00:21:26.986 08:34:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:21:27.245 true 00:21:27.245 08:34:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:21:27.245 [2024-07-23 08:34:39.721840] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:21:27.245 [2024-07-23 08:34:39.721891] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:27.245 [2024-07-23 08:34:39.721911] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036980 00:21:27.245 [2024-07-23 08:34:39.721922] vbdev_passthru.c: 
696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:27.245 [2024-07-23 08:34:39.723928] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:27.245 [2024-07-23 08:34:39.723957] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:21:27.245 BaseBdev3 00:21:27.245 08:34:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:27.245 08:34:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:21:27.503 BaseBdev4_malloc 00:21:27.503 08:34:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:21:27.761 true 00:21:27.761 08:34:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:21:27.761 [2024-07-23 08:34:40.267272] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:21:27.761 [2024-07-23 08:34:40.267325] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:27.761 [2024-07-23 08:34:40.267346] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000037880 00:21:27.761 [2024-07-23 08:34:40.267357] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:27.761 [2024-07-23 08:34:40.269346] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:27.761 [2024-07-23 08:34:40.269374] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:21:27.761 BaseBdev4 00:21:28.018 08:34:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:21:28.018 [2024-07-23 08:34:40.423736] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:28.018 [2024-07-23 08:34:40.425406] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:28.018 [2024-07-23 08:34:40.425482] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:28.018 [2024-07-23 08:34:40.425544] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:28.018 [2024-07-23 08:34:40.425788] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000037e80 00:21:28.018 [2024-07-23 08:34:40.425803] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:28.018 [2024-07-23 08:34:40.426044] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bf90 00:21:28.018 [2024-07-23 08:34:40.426251] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000037e80 00:21:28.018 [2024-07-23 08:34:40.426261] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000037e80 00:21:28.018 [2024-07-23 08:34:40.426422] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:28.018 08:34:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:21:28.018 08:34:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:28.019 08:34:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:28.019 08:34:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:28.019 08:34:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 
-- # local strip_size=0 00:21:28.019 08:34:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:28.019 08:34:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:28.019 08:34:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:28.019 08:34:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:28.019 08:34:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:28.019 08:34:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:28.019 08:34:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:28.277 08:34:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:28.277 "name": "raid_bdev1", 00:21:28.277 "uuid": "f5e6ee14-49ae-42f6-a53f-45271aef2e33", 00:21:28.277 "strip_size_kb": 0, 00:21:28.277 "state": "online", 00:21:28.277 "raid_level": "raid1", 00:21:28.277 "superblock": true, 00:21:28.277 "num_base_bdevs": 4, 00:21:28.277 "num_base_bdevs_discovered": 4, 00:21:28.277 "num_base_bdevs_operational": 4, 00:21:28.277 "base_bdevs_list": [ 00:21:28.277 { 00:21:28.277 "name": "BaseBdev1", 00:21:28.277 "uuid": "c0aa4b34-89b6-58b5-b7e3-76ac38b8f064", 00:21:28.277 "is_configured": true, 00:21:28.277 "data_offset": 2048, 00:21:28.277 "data_size": 63488 00:21:28.277 }, 00:21:28.277 { 00:21:28.277 "name": "BaseBdev2", 00:21:28.277 "uuid": "0143d249-4697-5333-b8df-8c65f6e6f5c5", 00:21:28.277 "is_configured": true, 00:21:28.277 "data_offset": 2048, 00:21:28.277 "data_size": 63488 00:21:28.277 }, 00:21:28.277 { 00:21:28.277 "name": "BaseBdev3", 00:21:28.277 "uuid": "507bd47e-7475-585c-a785-84e1fc81fc0f", 00:21:28.277 "is_configured": true, 00:21:28.277 
"data_offset": 2048, 00:21:28.277 "data_size": 63488 00:21:28.277 }, 00:21:28.277 { 00:21:28.277 "name": "BaseBdev4", 00:21:28.277 "uuid": "5d88093c-2d94-56f5-b048-2082221c886b", 00:21:28.277 "is_configured": true, 00:21:28.277 "data_offset": 2048, 00:21:28.277 "data_size": 63488 00:21:28.277 } 00:21:28.277 ] 00:21:28.277 }' 00:21:28.277 08:34:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:28.277 08:34:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:28.843 08:34:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:21:28.843 08:34:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:21:28.843 [2024-07-23 08:34:41.159047] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c130 00:21:29.779 08:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:21:29.779 08:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:21:29.779 08:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:21:29.779 08:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:21:29.779 08:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:21:29.779 08:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:21:29.779 08:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:29.779 08:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:29.779 08:34:42 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:29.779 08:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:29.779 08:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:29.779 08:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:29.779 08:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:29.779 08:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:29.779 08:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:29.779 08:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:29.779 08:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:30.037 08:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:30.037 "name": "raid_bdev1", 00:21:30.037 "uuid": "f5e6ee14-49ae-42f6-a53f-45271aef2e33", 00:21:30.037 "strip_size_kb": 0, 00:21:30.037 "state": "online", 00:21:30.037 "raid_level": "raid1", 00:21:30.037 "superblock": true, 00:21:30.037 "num_base_bdevs": 4, 00:21:30.037 "num_base_bdevs_discovered": 4, 00:21:30.037 "num_base_bdevs_operational": 4, 00:21:30.037 "base_bdevs_list": [ 00:21:30.037 { 00:21:30.037 "name": "BaseBdev1", 00:21:30.037 "uuid": "c0aa4b34-89b6-58b5-b7e3-76ac38b8f064", 00:21:30.037 "is_configured": true, 00:21:30.037 "data_offset": 2048, 00:21:30.037 "data_size": 63488 00:21:30.037 }, 00:21:30.037 { 00:21:30.037 "name": "BaseBdev2", 00:21:30.037 "uuid": "0143d249-4697-5333-b8df-8c65f6e6f5c5", 00:21:30.037 "is_configured": true, 00:21:30.037 "data_offset": 2048, 00:21:30.037 "data_size": 63488 00:21:30.037 }, 
00:21:30.037 { 00:21:30.037 "name": "BaseBdev3", 00:21:30.037 "uuid": "507bd47e-7475-585c-a785-84e1fc81fc0f", 00:21:30.037 "is_configured": true, 00:21:30.037 "data_offset": 2048, 00:21:30.037 "data_size": 63488 00:21:30.037 }, 00:21:30.037 { 00:21:30.037 "name": "BaseBdev4", 00:21:30.037 "uuid": "5d88093c-2d94-56f5-b048-2082221c886b", 00:21:30.037 "is_configured": true, 00:21:30.037 "data_offset": 2048, 00:21:30.037 "data_size": 63488 00:21:30.037 } 00:21:30.037 ] 00:21:30.037 }' 00:21:30.037 08:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:30.037 08:34:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:30.604 08:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:30.604 [2024-07-23 08:34:43.069801] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:30.604 [2024-07-23 08:34:43.069838] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:30.604 [2024-07-23 08:34:43.072210] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:30.604 [2024-07-23 08:34:43.072268] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:30.604 [2024-07-23 08:34:43.072384] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:30.604 [2024-07-23 08:34:43.072398] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000037e80 name raid_bdev1, state offline 00:21:30.604 0 00:21:30.604 08:34:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1519736 00:21:30.604 08:34:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1519736 ']' 00:21:30.604 08:34:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 
1519736 00:21:30.604 08:34:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:21:30.604 08:34:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:30.604 08:34:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1519736 00:21:30.863 08:34:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:30.863 08:34:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:30.863 08:34:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1519736' 00:21:30.863 killing process with pid 1519736 00:21:30.863 08:34:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1519736 00:21:30.863 [2024-07-23 08:34:43.127246] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:30.863 08:34:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1519736 00:21:31.121 [2024-07-23 08:34:43.415269] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:32.498 08:34:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.mJvCFevK1H 00:21:32.498 08:34:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:21:32.498 08:34:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:21:32.498 08:34:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:21:32.498 08:34:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:21:32.498 08:34:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:32.498 08:34:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:21:32.498 08:34:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:21:32.498 
00:21:32.498 real 0m7.516s 00:21:32.498 user 0m10.752s 00:21:32.498 sys 0m1.039s 00:21:32.498 08:34:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:32.498 08:34:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:32.498 ************************************ 00:21:32.498 END TEST raid_read_error_test 00:21:32.498 ************************************ 00:21:32.498 08:34:44 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:32.498 08:34:44 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 4 write 00:21:32.498 08:34:44 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:21:32.498 08:34:44 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:32.498 08:34:44 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:32.498 ************************************ 00:21:32.498 START TEST raid_write_error_test 00:21:32.498 ************************************ 00:21:32.498 08:34:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 write 00:21:32.498 08:34:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:21:32.498 08:34:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:21:32.498 08:34:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:21:32.498 08:34:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:21:32.498 08:34:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:32.498 08:34:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:21:32.498 08:34:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:32.498 08:34:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 
00:21:32.498 08:34:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:21:32.498 08:34:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:32.498 08:34:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:32.498 08:34:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:21:32.498 08:34:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:32.498 08:34:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:32.498 08:34:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:21:32.498 08:34:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:32.498 08:34:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:32.498 08:34:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:32.498 08:34:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:21:32.498 08:34:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:21:32.498 08:34:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:21:32.498 08:34:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:21:32.498 08:34:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:21:32.498 08:34:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:21:32.498 08:34:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:21:32.498 08:34:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:21:32.498 08:34:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:21:32.498 
08:34:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.l2odFTcoC0 00:21:32.498 08:34:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1521322 00:21:32.498 08:34:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1521322 /var/tmp/spdk-raid.sock 00:21:32.498 08:34:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:21:32.498 08:34:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1521322 ']' 00:21:32.498 08:34:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:32.498 08:34:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:32.498 08:34:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:32.499 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:32.499 08:34:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:32.499 08:34:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:32.499 [2024-07-23 08:34:44.906249] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:21:32.499 [2024-07-23 08:34:44.906345] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1521322 ] 00:21:32.757 [2024-07-23 08:34:45.029035] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:32.757 [2024-07-23 08:34:45.257896] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:33.016 [2024-07-23 08:34:45.525477] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:33.016 [2024-07-23 08:34:45.525516] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:33.289 08:34:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:33.289 08:34:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:21:33.289 08:34:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:33.289 08:34:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:33.570 BaseBdev1_malloc 00:21:33.570 08:34:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:21:33.570 true 00:21:33.570 08:34:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:21:33.829 [2024-07-23 08:34:46.214272] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:21:33.829 [2024-07-23 08:34:46.214336] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:21:33.829 [2024-07-23 08:34:46.214357] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034b80 00:21:33.829 [2024-07-23 08:34:46.214367] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:33.829 [2024-07-23 08:34:46.216400] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:33.829 [2024-07-23 08:34:46.216431] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:33.829 BaseBdev1 00:21:33.829 08:34:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:33.829 08:34:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:34.087 BaseBdev2_malloc 00:21:34.087 08:34:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:21:34.087 true 00:21:34.087 08:34:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:21:34.346 [2024-07-23 08:34:46.752865] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:21:34.346 [2024-07-23 08:34:46.752927] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:34.346 [2024-07-23 08:34:46.752947] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035a80 00:21:34.346 [2024-07-23 08:34:46.752960] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:34.346 [2024-07-23 08:34:46.754992] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:34.346 [2024-07-23 
08:34:46.755022] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:34.346 BaseBdev2 00:21:34.346 08:34:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:34.346 08:34:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:21:34.604 BaseBdev3_malloc 00:21:34.604 08:34:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:21:34.862 true 00:21:34.862 08:34:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:21:34.862 [2024-07-23 08:34:47.304002] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:21:34.862 [2024-07-23 08:34:47.304055] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:34.862 [2024-07-23 08:34:47.304074] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036980 00:21:34.862 [2024-07-23 08:34:47.304085] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:34.862 [2024-07-23 08:34:47.306066] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:34.862 [2024-07-23 08:34:47.306095] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:21:34.862 BaseBdev3 00:21:34.862 08:34:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:34.862 08:34:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:21:35.120 BaseBdev4_malloc 00:21:35.120 08:34:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:21:35.378 true 00:21:35.378 08:34:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:21:35.378 [2024-07-23 08:34:47.828749] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:21:35.378 [2024-07-23 08:34:47.828803] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:35.378 [2024-07-23 08:34:47.828840] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000037880 00:21:35.378 [2024-07-23 08:34:47.828851] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:35.378 [2024-07-23 08:34:47.830837] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:35.378 [2024-07-23 08:34:47.830866] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:21:35.378 BaseBdev4 00:21:35.378 08:34:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:21:35.636 [2024-07-23 08:34:47.993215] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:35.636 [2024-07-23 08:34:47.994837] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:35.636 [2024-07-23 08:34:47.994915] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:35.636 [2024-07-23 
08:34:47.994977] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:35.636 [2024-07-23 08:34:47.995215] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000037e80 00:21:35.636 [2024-07-23 08:34:47.995229] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:35.636 [2024-07-23 08:34:47.995475] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bf90 00:21:35.636 [2024-07-23 08:34:47.995695] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000037e80 00:21:35.636 [2024-07-23 08:34:47.995706] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000037e80 00:21:35.636 [2024-07-23 08:34:47.995871] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:35.636 08:34:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:21:35.636 08:34:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:35.636 08:34:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:35.636 08:34:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:35.636 08:34:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:35.636 08:34:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:35.636 08:34:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:35.636 08:34:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:35.636 08:34:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:35.636 08:34:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:21:35.636 08:34:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:35.636 08:34:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:35.895 08:34:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:35.895 "name": "raid_bdev1", 00:21:35.895 "uuid": "e95d77ad-8096-4030-8c21-846347c0c23f", 00:21:35.895 "strip_size_kb": 0, 00:21:35.895 "state": "online", 00:21:35.895 "raid_level": "raid1", 00:21:35.895 "superblock": true, 00:21:35.895 "num_base_bdevs": 4, 00:21:35.895 "num_base_bdevs_discovered": 4, 00:21:35.895 "num_base_bdevs_operational": 4, 00:21:35.895 "base_bdevs_list": [ 00:21:35.895 { 00:21:35.895 "name": "BaseBdev1", 00:21:35.895 "uuid": "fe4599f5-465e-56ab-bbf7-8d2fe40ec648", 00:21:35.895 "is_configured": true, 00:21:35.895 "data_offset": 2048, 00:21:35.895 "data_size": 63488 00:21:35.895 }, 00:21:35.895 { 00:21:35.895 "name": "BaseBdev2", 00:21:35.895 "uuid": "a7e46c98-d6e7-57f1-a32a-8c111e6e7682", 00:21:35.895 "is_configured": true, 00:21:35.895 "data_offset": 2048, 00:21:35.895 "data_size": 63488 00:21:35.895 }, 00:21:35.895 { 00:21:35.895 "name": "BaseBdev3", 00:21:35.895 "uuid": "bdc7cd7c-e01a-5988-8846-3139108aae95", 00:21:35.895 "is_configured": true, 00:21:35.895 "data_offset": 2048, 00:21:35.895 "data_size": 63488 00:21:35.895 }, 00:21:35.895 { 00:21:35.895 "name": "BaseBdev4", 00:21:35.895 "uuid": "576ba459-3837-56ca-bb38-74cec813bec3", 00:21:35.895 "is_configured": true, 00:21:35.895 "data_offset": 2048, 00:21:35.895 "data_size": 63488 00:21:35.895 } 00:21:35.895 ] 00:21:35.895 }' 00:21:35.895 08:34:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:35.895 08:34:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:36.462 08:34:48 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:21:36.462 08:34:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:21:36.462 [2024-07-23 08:34:48.752996] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c130 00:21:37.398 08:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:21:37.398 [2024-07-23 08:34:49.833950] bdev_raid.c:2247:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:21:37.398 [2024-07-23 08:34:49.834002] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:37.398 [2024-07-23 08:34:49.834246] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x60d00000c130 00:21:37.398 08:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:21:37.398 08:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:21:37.398 08:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:21:37.398 08:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=3 00:21:37.398 08:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:37.398 08:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:37.398 08:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:37.398 08:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:37.398 08:34:49 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:37.398 08:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:37.398 08:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:37.398 08:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:37.398 08:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:37.398 08:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:37.398 08:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:37.398 08:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:37.656 08:34:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:37.656 "name": "raid_bdev1", 00:21:37.656 "uuid": "e95d77ad-8096-4030-8c21-846347c0c23f", 00:21:37.656 "strip_size_kb": 0, 00:21:37.656 "state": "online", 00:21:37.656 "raid_level": "raid1", 00:21:37.656 "superblock": true, 00:21:37.656 "num_base_bdevs": 4, 00:21:37.656 "num_base_bdevs_discovered": 3, 00:21:37.656 "num_base_bdevs_operational": 3, 00:21:37.656 "base_bdevs_list": [ 00:21:37.656 { 00:21:37.656 "name": null, 00:21:37.656 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:37.656 "is_configured": false, 00:21:37.656 "data_offset": 2048, 00:21:37.656 "data_size": 63488 00:21:37.656 }, 00:21:37.656 { 00:21:37.656 "name": "BaseBdev2", 00:21:37.656 "uuid": "a7e46c98-d6e7-57f1-a32a-8c111e6e7682", 00:21:37.656 "is_configured": true, 00:21:37.656 "data_offset": 2048, 00:21:37.657 "data_size": 63488 00:21:37.657 }, 00:21:37.657 { 00:21:37.657 "name": "BaseBdev3", 00:21:37.657 "uuid": "bdc7cd7c-e01a-5988-8846-3139108aae95", 
00:21:37.657 "is_configured": true, 00:21:37.657 "data_offset": 2048, 00:21:37.657 "data_size": 63488 00:21:37.657 }, 00:21:37.657 { 00:21:37.657 "name": "BaseBdev4", 00:21:37.657 "uuid": "576ba459-3837-56ca-bb38-74cec813bec3", 00:21:37.657 "is_configured": true, 00:21:37.657 "data_offset": 2048, 00:21:37.657 "data_size": 63488 00:21:37.657 } 00:21:37.657 ] 00:21:37.657 }' 00:21:37.657 08:34:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:37.657 08:34:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:38.223 08:34:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:38.223 [2024-07-23 08:34:50.680204] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:38.223 [2024-07-23 08:34:50.680239] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:38.223 [2024-07-23 08:34:50.682688] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:38.223 [2024-07-23 08:34:50.682729] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:38.223 [2024-07-23 08:34:50.682825] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:38.223 [2024-07-23 08:34:50.682836] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000037e80 name raid_bdev1, state offline 00:21:38.223 0 00:21:38.223 08:34:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1521322 00:21:38.223 08:34:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1521322 ']' 00:21:38.223 08:34:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1521322 00:21:38.223 08:34:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 
00:21:38.223 08:34:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:38.223 08:34:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1521322 00:21:38.223 08:34:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:38.223 08:34:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:38.223 08:34:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1521322' 00:21:38.223 killing process with pid 1521322 00:21:38.223 08:34:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1521322 00:21:38.223 [2024-07-23 08:34:50.738016] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:38.223 08:34:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1521322 00:21:38.482 [2024-07-23 08:34:50.991999] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:39.858 08:34:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.l2odFTcoC0 00:21:39.858 08:34:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:21:39.858 08:34:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:21:39.858 08:34:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:21:39.858 08:34:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:21:39.858 08:34:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:39.858 08:34:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:21:39.858 08:34:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:21:39.858 00:21:39.858 real 0m7.509s 00:21:39.858 user 0m10.807s 00:21:39.858 sys 0m0.989s 00:21:39.858 
08:34:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:39.858 08:34:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:39.858 ************************************ 00:21:39.858 END TEST raid_write_error_test 00:21:39.858 ************************************ 00:21:39.858 08:34:52 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:39.858 08:34:52 bdev_raid -- bdev/bdev_raid.sh@875 -- # '[' true = true ']' 00:21:39.858 08:34:52 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:21:39.858 08:34:52 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 2 false false true 00:21:39.858 08:34:52 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:21:39.858 08:34:52 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:39.858 08:34:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:40.116 ************************************ 00:21:40.116 START TEST raid_rebuild_test 00:21:40.116 ************************************ 00:21:40.116 08:34:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false false true 00:21:40.116 08:34:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:21:40.116 08:34:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:21:40.116 08:34:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:21:40.116 08:34:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:21:40.116 08:34:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:21:40.116 08:34:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:21:40.116 08:34:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:40.116 08:34:52 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:21:40.116 08:34:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:40.116 08:34:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:40.116 08:34:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:21:40.116 08:34:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:40.116 08:34:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:40.116 08:34:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:21:40.116 08:34:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:21:40.116 08:34:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:21:40.116 08:34:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:21:40.116 08:34:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:21:40.116 08:34:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:21:40.116 08:34:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:21:40.116 08:34:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:21:40.116 08:34:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:21:40.116 08:34:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:21:40.116 08:34:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=1522702 00:21:40.117 08:34:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:21:40.117 08:34:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 1522702 
/var/tmp/spdk-raid.sock 00:21:40.117 08:34:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 1522702 ']' 00:21:40.117 08:34:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:40.117 08:34:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:40.117 08:34:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:40.117 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:40.117 08:34:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:40.117 08:34:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:21:40.117 [2024-07-23 08:34:52.465387] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:21:40.117 [2024-07-23 08:34:52.465483] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1522702 ] 00:21:40.117 I/O size of 3145728 is greater than zero copy threshold (65536). 00:21:40.117 Zero copy mechanism will not be used. 
00:21:40.117 [2024-07-23 08:34:52.584485] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:40.375 [2024-07-23 08:34:52.813364] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:40.634 [2024-07-23 08:34:53.105253] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:40.634 [2024-07-23 08:34:53.105292] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:40.892 08:34:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:40.892 08:34:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:21:40.892 08:34:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:40.892 08:34:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:41.151 BaseBdev1_malloc 00:21:41.151 08:34:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:21:41.151 [2024-07-23 08:34:53.600815] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:21:41.151 [2024-07-23 08:34:53.600868] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:41.151 [2024-07-23 08:34:53.600889] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034880 00:21:41.151 [2024-07-23 08:34:53.600906] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:41.151 [2024-07-23 08:34:53.602836] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:41.151 [2024-07-23 08:34:53.602865] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:41.151 BaseBdev1 00:21:41.151 
08:34:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:41.151 08:34:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:41.410 BaseBdev2_malloc 00:21:41.410 08:34:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:21:41.669 [2024-07-23 08:34:53.961103] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:21:41.669 [2024-07-23 08:34:53.961154] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:41.669 [2024-07-23 08:34:53.961188] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035480 00:21:41.669 [2024-07-23 08:34:53.961201] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:41.669 [2024-07-23 08:34:53.963096] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:41.669 [2024-07-23 08:34:53.963123] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:41.669 BaseBdev2 00:21:41.669 08:34:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:21:41.669 spare_malloc 00:21:41.669 08:34:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:21:41.927 spare_delay 00:21:41.928 08:34:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:42.186 [2024-07-23 08:34:54.529603] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:42.186 [2024-07-23 08:34:54.529668] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:42.186 [2024-07-23 08:34:54.529691] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036680 00:21:42.186 [2024-07-23 08:34:54.529705] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:42.186 [2024-07-23 08:34:54.531989] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:42.186 [2024-07-23 08:34:54.532022] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:42.186 spare 00:21:42.186 08:34:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:21:42.186 [2024-07-23 08:34:54.702063] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:42.186 [2024-07-23 08:34:54.703722] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:42.186 [2024-07-23 08:34:54.703816] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000036c80 00:21:42.186 [2024-07-23 08:34:54.703829] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:21:42.186 [2024-07-23 08:34:54.704116] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bec0 00:21:42.186 [2024-07-23 08:34:54.704322] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000036c80 00:21:42.186 [2024-07-23 08:34:54.704332] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000036c80 00:21:42.186 [2024-07-23 
08:34:54.704515] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:42.445 08:34:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:42.445 08:34:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:42.445 08:34:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:42.445 08:34:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:42.445 08:34:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:42.445 08:34:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:42.445 08:34:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:42.445 08:34:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:42.445 08:34:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:42.445 08:34:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:42.445 08:34:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:42.445 08:34:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:42.445 08:34:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:42.445 "name": "raid_bdev1", 00:21:42.445 "uuid": "202bb43a-cde2-40a5-9a89-4cd33f481858", 00:21:42.445 "strip_size_kb": 0, 00:21:42.445 "state": "online", 00:21:42.445 "raid_level": "raid1", 00:21:42.445 "superblock": false, 00:21:42.445 "num_base_bdevs": 2, 00:21:42.445 "num_base_bdevs_discovered": 2, 00:21:42.445 "num_base_bdevs_operational": 2, 00:21:42.445 "base_bdevs_list": [ 00:21:42.445 { 00:21:42.445 "name": 
"BaseBdev1", 00:21:42.445 "uuid": "0e77ed83-2133-5739-bfa5-b87e99f56849", 00:21:42.445 "is_configured": true, 00:21:42.445 "data_offset": 0, 00:21:42.445 "data_size": 65536 00:21:42.445 }, 00:21:42.445 { 00:21:42.445 "name": "BaseBdev2", 00:21:42.445 "uuid": "7cbb9918-a1a7-50e0-a24c-c64a7d9f1a99", 00:21:42.445 "is_configured": true, 00:21:42.445 "data_offset": 0, 00:21:42.445 "data_size": 65536 00:21:42.445 } 00:21:42.445 ] 00:21:42.445 }' 00:21:42.445 08:34:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:42.445 08:34:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:21:43.012 08:34:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:43.012 08:34:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:21:43.012 [2024-07-23 08:34:55.516426] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:43.270 08:34:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:21:43.270 08:34:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:43.270 08:34:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:21:43.270 08:34:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:21:43.270 08:34:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:21:43.270 08:34:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:21:43.270 08:34:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:21:43.270 08:34:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock 
raid_bdev1 /dev/nbd0 00:21:43.270 08:34:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:43.270 08:34:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:21:43.270 08:34:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:43.270 08:34:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:21:43.270 08:34:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:43.270 08:34:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:21:43.270 08:34:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:43.270 08:34:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:43.270 08:34:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:21:43.530 [2024-07-23 08:34:55.897241] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c060 00:21:43.530 /dev/nbd0 00:21:43.530 08:34:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:43.530 08:34:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:43.530 08:34:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:21:43.530 08:34:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:21:43.530 08:34:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:21:43.530 08:34:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:21:43.530 08:34:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:21:43.530 08:34:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:21:43.530 
08:34:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:21:43.530 08:34:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:21:43.530 08:34:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:43.530 1+0 records in 00:21:43.530 1+0 records out 00:21:43.530 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000235485 s, 17.4 MB/s 00:21:43.530 08:34:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:43.530 08:34:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:21:43.530 08:34:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:43.530 08:34:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:21:43.530 08:34:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:21:43.530 08:34:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:43.530 08:34:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:43.530 08:34:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:21:43.530 08:34:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:21:43.530 08:34:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:21:47.718 65536+0 records in 00:21:47.718 65536+0 records out 00:21:47.718 33554432 bytes (34 MB, 32 MiB) copied, 3.80286 s, 8.8 MB/s 00:21:47.718 08:34:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:21:47.718 08:34:59 bdev_raid.raid_rebuild_test -- 
bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:47.718 08:34:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:21:47.718 08:34:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:47.718 08:34:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:21:47.718 08:34:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:47.718 08:34:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:47.718 08:34:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:47.718 [2024-07-23 08:34:59.987457] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:47.718 08:34:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:47.718 08:34:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:47.719 08:34:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:47.719 08:34:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:47.719 08:34:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:47.719 08:34:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:21:47.719 08:34:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:21:47.719 08:34:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:21:47.719 [2024-07-23 08:35:00.152170] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:47.719 08:35:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:47.719 
08:35:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:47.719 08:35:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:47.719 08:35:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:47.719 08:35:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:47.719 08:35:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:47.719 08:35:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:47.719 08:35:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:47.719 08:35:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:47.719 08:35:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:47.719 08:35:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:47.719 08:35:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:47.977 08:35:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:47.977 "name": "raid_bdev1", 00:21:47.977 "uuid": "202bb43a-cde2-40a5-9a89-4cd33f481858", 00:21:47.977 "strip_size_kb": 0, 00:21:47.977 "state": "online", 00:21:47.977 "raid_level": "raid1", 00:21:47.977 "superblock": false, 00:21:47.977 "num_base_bdevs": 2, 00:21:47.977 "num_base_bdevs_discovered": 1, 00:21:47.977 "num_base_bdevs_operational": 1, 00:21:47.977 "base_bdevs_list": [ 00:21:47.977 { 00:21:47.977 "name": null, 00:21:47.978 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:47.978 "is_configured": false, 00:21:47.978 "data_offset": 0, 00:21:47.978 "data_size": 65536 00:21:47.978 }, 00:21:47.978 { 00:21:47.978 "name": 
"BaseBdev2", 00:21:47.978 "uuid": "7cbb9918-a1a7-50e0-a24c-c64a7d9f1a99", 00:21:47.978 "is_configured": true, 00:21:47.978 "data_offset": 0, 00:21:47.978 "data_size": 65536 00:21:47.978 } 00:21:47.978 ] 00:21:47.978 }' 00:21:47.978 08:35:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:47.978 08:35:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:21:48.545 08:35:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:48.545 [2024-07-23 08:35:00.978323] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:48.545 [2024-07-23 08:35:00.997163] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000d12c70 00:21:48.545 [2024-07-23 08:35:00.998804] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:48.545 08:35:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:21:49.510 08:35:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:49.510 08:35:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:49.510 08:35:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:49.510 08:35:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:49.510 08:35:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:49.510 08:35:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:49.510 08:35:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:49.769 08:35:02 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:49.769 "name": "raid_bdev1", 00:21:49.769 "uuid": "202bb43a-cde2-40a5-9a89-4cd33f481858", 00:21:49.769 "strip_size_kb": 0, 00:21:49.769 "state": "online", 00:21:49.769 "raid_level": "raid1", 00:21:49.769 "superblock": false, 00:21:49.769 "num_base_bdevs": 2, 00:21:49.769 "num_base_bdevs_discovered": 2, 00:21:49.769 "num_base_bdevs_operational": 2, 00:21:49.769 "process": { 00:21:49.769 "type": "rebuild", 00:21:49.769 "target": "spare", 00:21:49.769 "progress": { 00:21:49.769 "blocks": 22528, 00:21:49.769 "percent": 34 00:21:49.769 } 00:21:49.769 }, 00:21:49.769 "base_bdevs_list": [ 00:21:49.769 { 00:21:49.769 "name": "spare", 00:21:49.769 "uuid": "28d9bc93-18a0-561c-9899-b19832c11159", 00:21:49.769 "is_configured": true, 00:21:49.769 "data_offset": 0, 00:21:49.769 "data_size": 65536 00:21:49.769 }, 00:21:49.769 { 00:21:49.769 "name": "BaseBdev2", 00:21:49.769 "uuid": "7cbb9918-a1a7-50e0-a24c-c64a7d9f1a99", 00:21:49.769 "is_configured": true, 00:21:49.769 "data_offset": 0, 00:21:49.769 "data_size": 65536 00:21:49.769 } 00:21:49.769 ] 00:21:49.769 }' 00:21:49.769 08:35:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:49.769 08:35:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:49.769 08:35:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:49.769 08:35:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:49.769 08:35:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:21:50.027 [2024-07-23 08:35:02.432373] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:50.027 [2024-07-23 08:35:02.511309] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: 
Finished rebuild on raid bdev raid_bdev1: No such device 00:21:50.027 [2024-07-23 08:35:02.511373] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:50.027 [2024-07-23 08:35:02.511389] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:50.027 [2024-07-23 08:35:02.511399] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:21:50.286 08:35:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:50.286 08:35:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:50.286 08:35:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:50.286 08:35:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:50.286 08:35:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:50.286 08:35:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:50.286 08:35:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:50.286 08:35:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:50.286 08:35:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:50.286 08:35:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:50.286 08:35:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:50.286 08:35:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:50.286 08:35:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:50.286 "name": "raid_bdev1", 00:21:50.286 "uuid": 
"202bb43a-cde2-40a5-9a89-4cd33f481858", 00:21:50.286 "strip_size_kb": 0, 00:21:50.286 "state": "online", 00:21:50.286 "raid_level": "raid1", 00:21:50.286 "superblock": false, 00:21:50.286 "num_base_bdevs": 2, 00:21:50.286 "num_base_bdevs_discovered": 1, 00:21:50.286 "num_base_bdevs_operational": 1, 00:21:50.286 "base_bdevs_list": [ 00:21:50.286 { 00:21:50.286 "name": null, 00:21:50.286 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:50.286 "is_configured": false, 00:21:50.286 "data_offset": 0, 00:21:50.286 "data_size": 65536 00:21:50.286 }, 00:21:50.286 { 00:21:50.286 "name": "BaseBdev2", 00:21:50.286 "uuid": "7cbb9918-a1a7-50e0-a24c-c64a7d9f1a99", 00:21:50.286 "is_configured": true, 00:21:50.286 "data_offset": 0, 00:21:50.286 "data_size": 65536 00:21:50.286 } 00:21:50.286 ] 00:21:50.286 }' 00:21:50.286 08:35:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:50.286 08:35:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:21:50.852 08:35:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:50.852 08:35:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:50.852 08:35:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:50.852 08:35:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:50.852 08:35:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:50.852 08:35:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:50.852 08:35:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:51.111 08:35:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:51.111 "name": "raid_bdev1", 
00:21:51.111 "uuid": "202bb43a-cde2-40a5-9a89-4cd33f481858", 00:21:51.111 "strip_size_kb": 0, 00:21:51.111 "state": "online", 00:21:51.111 "raid_level": "raid1", 00:21:51.111 "superblock": false, 00:21:51.111 "num_base_bdevs": 2, 00:21:51.111 "num_base_bdevs_discovered": 1, 00:21:51.111 "num_base_bdevs_operational": 1, 00:21:51.111 "base_bdevs_list": [ 00:21:51.111 { 00:21:51.111 "name": null, 00:21:51.111 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:51.111 "is_configured": false, 00:21:51.111 "data_offset": 0, 00:21:51.111 "data_size": 65536 00:21:51.111 }, 00:21:51.111 { 00:21:51.111 "name": "BaseBdev2", 00:21:51.111 "uuid": "7cbb9918-a1a7-50e0-a24c-c64a7d9f1a99", 00:21:51.111 "is_configured": true, 00:21:51.111 "data_offset": 0, 00:21:51.111 "data_size": 65536 00:21:51.111 } 00:21:51.111 ] 00:21:51.111 }' 00:21:51.111 08:35:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:51.111 08:35:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:51.111 08:35:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:51.111 08:35:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:51.111 08:35:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:51.111 [2024-07-23 08:35:03.622987] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:51.369 [2024-07-23 08:35:03.640403] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000d12d40 00:21:51.369 [2024-07-23 08:35:03.642019] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:51.369 08:35:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:21:52.306 08:35:04 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:52.306 08:35:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:52.306 08:35:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:52.306 08:35:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:52.306 08:35:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:52.306 08:35:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:52.306 08:35:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:52.564 08:35:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:52.564 "name": "raid_bdev1", 00:21:52.564 "uuid": "202bb43a-cde2-40a5-9a89-4cd33f481858", 00:21:52.564 "strip_size_kb": 0, 00:21:52.564 "state": "online", 00:21:52.564 "raid_level": "raid1", 00:21:52.564 "superblock": false, 00:21:52.564 "num_base_bdevs": 2, 00:21:52.564 "num_base_bdevs_discovered": 2, 00:21:52.564 "num_base_bdevs_operational": 2, 00:21:52.564 "process": { 00:21:52.564 "type": "rebuild", 00:21:52.564 "target": "spare", 00:21:52.564 "progress": { 00:21:52.564 "blocks": 22528, 00:21:52.564 "percent": 34 00:21:52.564 } 00:21:52.564 }, 00:21:52.564 "base_bdevs_list": [ 00:21:52.564 { 00:21:52.564 "name": "spare", 00:21:52.564 "uuid": "28d9bc93-18a0-561c-9899-b19832c11159", 00:21:52.564 "is_configured": true, 00:21:52.564 "data_offset": 0, 00:21:52.564 "data_size": 65536 00:21:52.564 }, 00:21:52.564 { 00:21:52.564 "name": "BaseBdev2", 00:21:52.564 "uuid": "7cbb9918-a1a7-50e0-a24c-c64a7d9f1a99", 00:21:52.564 "is_configured": true, 00:21:52.564 "data_offset": 0, 00:21:52.564 "data_size": 65536 00:21:52.564 } 00:21:52.564 ] 00:21:52.564 }' 00:21:52.564 
08:35:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:52.564 08:35:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:52.564 08:35:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:52.564 08:35:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:52.564 08:35:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:21:52.564 08:35:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:21:52.564 08:35:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:21:52.564 08:35:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:21:52.564 08:35:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=657 00:21:52.564 08:35:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:52.564 08:35:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:52.564 08:35:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:52.564 08:35:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:52.564 08:35:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:52.564 08:35:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:52.564 08:35:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:52.564 08:35:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:52.564 08:35:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # 
raid_bdev_info='{ 00:21:52.564 "name": "raid_bdev1", 00:21:52.564 "uuid": "202bb43a-cde2-40a5-9a89-4cd33f481858", 00:21:52.564 "strip_size_kb": 0, 00:21:52.564 "state": "online", 00:21:52.564 "raid_level": "raid1", 00:21:52.564 "superblock": false, 00:21:52.564 "num_base_bdevs": 2, 00:21:52.564 "num_base_bdevs_discovered": 2, 00:21:52.564 "num_base_bdevs_operational": 2, 00:21:52.564 "process": { 00:21:52.564 "type": "rebuild", 00:21:52.564 "target": "spare", 00:21:52.564 "progress": { 00:21:52.564 "blocks": 28672, 00:21:52.564 "percent": 43 00:21:52.564 } 00:21:52.564 }, 00:21:52.564 "base_bdevs_list": [ 00:21:52.564 { 00:21:52.564 "name": "spare", 00:21:52.564 "uuid": "28d9bc93-18a0-561c-9899-b19832c11159", 00:21:52.564 "is_configured": true, 00:21:52.564 "data_offset": 0, 00:21:52.564 "data_size": 65536 00:21:52.564 }, 00:21:52.564 { 00:21:52.564 "name": "BaseBdev2", 00:21:52.564 "uuid": "7cbb9918-a1a7-50e0-a24c-c64a7d9f1a99", 00:21:52.564 "is_configured": true, 00:21:52.564 "data_offset": 0, 00:21:52.564 "data_size": 65536 00:21:52.564 } 00:21:52.564 ] 00:21:52.564 }' 00:21:52.564 08:35:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:52.822 08:35:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:52.822 08:35:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:52.822 08:35:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:52.822 08:35:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:21:53.756 08:35:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:53.757 08:35:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:53.757 08:35:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:53.757 08:35:06 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:53.757 08:35:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:53.757 08:35:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:53.757 08:35:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:53.757 08:35:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:54.015 08:35:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:54.015 "name": "raid_bdev1", 00:21:54.015 "uuid": "202bb43a-cde2-40a5-9a89-4cd33f481858", 00:21:54.015 "strip_size_kb": 0, 00:21:54.015 "state": "online", 00:21:54.015 "raid_level": "raid1", 00:21:54.015 "superblock": false, 00:21:54.015 "num_base_bdevs": 2, 00:21:54.015 "num_base_bdevs_discovered": 2, 00:21:54.015 "num_base_bdevs_operational": 2, 00:21:54.015 "process": { 00:21:54.015 "type": "rebuild", 00:21:54.015 "target": "spare", 00:21:54.015 "progress": { 00:21:54.015 "blocks": 53248, 00:21:54.015 "percent": 81 00:21:54.015 } 00:21:54.015 }, 00:21:54.015 "base_bdevs_list": [ 00:21:54.015 { 00:21:54.015 "name": "spare", 00:21:54.015 "uuid": "28d9bc93-18a0-561c-9899-b19832c11159", 00:21:54.015 "is_configured": true, 00:21:54.015 "data_offset": 0, 00:21:54.015 "data_size": 65536 00:21:54.015 }, 00:21:54.015 { 00:21:54.015 "name": "BaseBdev2", 00:21:54.015 "uuid": "7cbb9918-a1a7-50e0-a24c-c64a7d9f1a99", 00:21:54.015 "is_configured": true, 00:21:54.015 "data_offset": 0, 00:21:54.015 "data_size": 65536 00:21:54.015 } 00:21:54.015 ] 00:21:54.015 }' 00:21:54.015 08:35:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:54.015 08:35:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 
00:21:54.015 08:35:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:54.015 08:35:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:54.015 08:35:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:21:54.582 [2024-07-23 08:35:06.866971] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:21:54.582 [2024-07-23 08:35:06.867030] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:21:54.582 [2024-07-23 08:35:06.867067] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:55.148 08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:55.148 08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:55.148 08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:55.148 08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:55.148 08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:55.148 08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:55.148 08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:55.148 08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:55.148 08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:55.148 "name": "raid_bdev1", 00:21:55.148 "uuid": "202bb43a-cde2-40a5-9a89-4cd33f481858", 00:21:55.148 "strip_size_kb": 0, 00:21:55.148 "state": "online", 00:21:55.148 "raid_level": "raid1", 00:21:55.148 "superblock": false, 
00:21:55.148 "num_base_bdevs": 2, 00:21:55.148 "num_base_bdevs_discovered": 2, 00:21:55.148 "num_base_bdevs_operational": 2, 00:21:55.148 "base_bdevs_list": [ 00:21:55.148 { 00:21:55.148 "name": "spare", 00:21:55.148 "uuid": "28d9bc93-18a0-561c-9899-b19832c11159", 00:21:55.148 "is_configured": true, 00:21:55.148 "data_offset": 0, 00:21:55.148 "data_size": 65536 00:21:55.148 }, 00:21:55.148 { 00:21:55.148 "name": "BaseBdev2", 00:21:55.148 "uuid": "7cbb9918-a1a7-50e0-a24c-c64a7d9f1a99", 00:21:55.148 "is_configured": true, 00:21:55.148 "data_offset": 0, 00:21:55.148 "data_size": 65536 00:21:55.148 } 00:21:55.148 ] 00:21:55.148 }' 00:21:55.148 08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:55.149 08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:21:55.149 08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:55.406 08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:21:55.406 08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:21:55.406 08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:55.406 08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:55.406 08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:55.406 08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:55.406 08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:55.406 08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:55.406 08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | 
select(.name == "raid_bdev1")' 00:21:55.406 08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:55.406 "name": "raid_bdev1", 00:21:55.406 "uuid": "202bb43a-cde2-40a5-9a89-4cd33f481858", 00:21:55.406 "strip_size_kb": 0, 00:21:55.406 "state": "online", 00:21:55.406 "raid_level": "raid1", 00:21:55.406 "superblock": false, 00:21:55.406 "num_base_bdevs": 2, 00:21:55.406 "num_base_bdevs_discovered": 2, 00:21:55.406 "num_base_bdevs_operational": 2, 00:21:55.406 "base_bdevs_list": [ 00:21:55.406 { 00:21:55.406 "name": "spare", 00:21:55.406 "uuid": "28d9bc93-18a0-561c-9899-b19832c11159", 00:21:55.406 "is_configured": true, 00:21:55.406 "data_offset": 0, 00:21:55.406 "data_size": 65536 00:21:55.406 }, 00:21:55.406 { 00:21:55.406 "name": "BaseBdev2", 00:21:55.406 "uuid": "7cbb9918-a1a7-50e0-a24c-c64a7d9f1a99", 00:21:55.406 "is_configured": true, 00:21:55.406 "data_offset": 0, 00:21:55.406 "data_size": 65536 00:21:55.406 } 00:21:55.406 ] 00:21:55.406 }' 00:21:55.406 08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:55.407 08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:55.407 08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:55.664 08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:55.664 08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:55.664 08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:55.664 08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:55.664 08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:55.664 08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:55.664 
08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:55.664 08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:55.664 08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:55.664 08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:55.664 08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:55.664 08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:55.664 08:35:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:55.664 08:35:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:55.664 "name": "raid_bdev1", 00:21:55.664 "uuid": "202bb43a-cde2-40a5-9a89-4cd33f481858", 00:21:55.664 "strip_size_kb": 0, 00:21:55.664 "state": "online", 00:21:55.664 "raid_level": "raid1", 00:21:55.664 "superblock": false, 00:21:55.664 "num_base_bdevs": 2, 00:21:55.664 "num_base_bdevs_discovered": 2, 00:21:55.664 "num_base_bdevs_operational": 2, 00:21:55.664 "base_bdevs_list": [ 00:21:55.664 { 00:21:55.664 "name": "spare", 00:21:55.664 "uuid": "28d9bc93-18a0-561c-9899-b19832c11159", 00:21:55.664 "is_configured": true, 00:21:55.664 "data_offset": 0, 00:21:55.664 "data_size": 65536 00:21:55.664 }, 00:21:55.664 { 00:21:55.665 "name": "BaseBdev2", 00:21:55.665 "uuid": "7cbb9918-a1a7-50e0-a24c-c64a7d9f1a99", 00:21:55.665 "is_configured": true, 00:21:55.665 "data_offset": 0, 00:21:55.665 "data_size": 65536 00:21:55.665 } 00:21:55.665 ] 00:21:55.665 }' 00:21:55.665 08:35:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:55.665 08:35:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:21:56.232 08:35:08 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:56.232 [2024-07-23 08:35:08.739114] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:56.232 [2024-07-23 08:35:08.739143] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:56.232 [2024-07-23 08:35:08.739219] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:56.232 [2024-07-23 08:35:08.739284] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:56.232 [2024-07-23 08:35:08.739295] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036c80 name raid_bdev1, state offline 00:21:56.491 08:35:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:56.491 08:35:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:21:56.491 08:35:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:21:56.491 08:35:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:21:56.491 08:35:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:21:56.491 08:35:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:21:56.491 08:35:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:56.491 08:35:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:21:56.491 08:35:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:56.491 08:35:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # 
nbd_list=('/dev/nbd0' '/dev/nbd1') 00:21:56.491 08:35:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:56.491 08:35:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:21:56.491 08:35:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:56.491 08:35:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:21:56.491 08:35:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:21:56.750 /dev/nbd0 00:21:56.750 08:35:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:56.750 08:35:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:56.750 08:35:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:21:56.750 08:35:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:21:56.750 08:35:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:21:56.750 08:35:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:21:56.750 08:35:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:21:56.750 08:35:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:21:56.750 08:35:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:21:56.750 08:35:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:21:56.750 08:35:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:56.750 1+0 records in 00:21:56.750 1+0 records out 00:21:56.750 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000204362 s, 20.0 MB/s 00:21:56.750 
08:35:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:56.750 08:35:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:21:56.750 08:35:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:56.750 08:35:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:21:56.750 08:35:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:21:56.750 08:35:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:56.750 08:35:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:21:56.750 08:35:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:21:57.009 /dev/nbd1 00:21:57.009 08:35:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:21:57.009 08:35:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:21:57.009 08:35:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:21:57.009 08:35:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:21:57.009 08:35:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:21:57.009 08:35:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:21:57.009 08:35:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:21:57.009 08:35:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:21:57.009 08:35:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:21:57.009 08:35:09 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@882 -- # (( i <= 20 )) 00:21:57.009 08:35:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:57.009 1+0 records in 00:21:57.009 1+0 records out 00:21:57.009 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000257377 s, 15.9 MB/s 00:21:57.009 08:35:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:57.009 08:35:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:21:57.009 08:35:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:57.009 08:35:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:21:57.009 08:35:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:21:57.009 08:35:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:57.009 08:35:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:21:57.009 08:35:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:21:57.009 08:35:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:21:57.009 08:35:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:57.009 08:35:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:21:57.009 08:35:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:57.009 08:35:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:21:57.268 08:35:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:57.268 08:35:09 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:57.268 08:35:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:57.268 08:35:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:57.268 08:35:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:57.268 08:35:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:57.268 08:35:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:57.268 08:35:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:57.268 08:35:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:21:57.268 08:35:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:21:57.268 08:35:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:57.268 08:35:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:21:57.527 08:35:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:21:57.527 08:35:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:21:57.527 08:35:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:21:57.527 08:35:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:57.527 08:35:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:57.527 08:35:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:21:57.527 08:35:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:21:57.527 08:35:09 bdev_raid.raid_rebuild_test -- 
bdev/nbd_common.sh@45 -- # return 0 00:21:57.527 08:35:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:21:57.527 08:35:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 1522702 00:21:57.527 08:35:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 1522702 ']' 00:21:57.527 08:35:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 1522702 00:21:57.527 08:35:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:21:57.527 08:35:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:57.527 08:35:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1522702 00:21:57.527 08:35:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:57.527 08:35:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:57.527 08:35:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1522702' 00:21:57.527 killing process with pid 1522702 00:21:57.527 08:35:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 1522702 00:21:57.527 Received shutdown signal, test time was about 60.000000 seconds 00:21:57.527 00:21:57.527 Latency(us) 00:21:57.527 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:57.527 =================================================================================================================== 00:21:57.527 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:21:57.527 [2024-07-23 08:35:09.962084] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:57.527 08:35:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 1522702 00:21:57.786 [2024-07-23 08:35:10.203388] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:59.163 
08:35:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:21:59.163 00:21:59.163 real 0m19.106s 00:21:59.163 user 0m25.121s 00:21:59.163 sys 0m3.179s 00:21:59.163 08:35:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:59.163 08:35:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:21:59.163 ************************************ 00:21:59.163 END TEST raid_rebuild_test 00:21:59.163 ************************************ 00:21:59.163 08:35:11 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:59.163 08:35:11 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 2 true false true 00:21:59.163 08:35:11 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:21:59.163 08:35:11 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:59.163 08:35:11 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:59.163 ************************************ 00:21:59.163 START TEST raid_rebuild_test_sb 00:21:59.163 ************************************ 00:21:59.163 08:35:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:21:59.163 08:35:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:21:59.163 08:35:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:21:59.163 08:35:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:21:59.163 08:35:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:21:59.163 08:35:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:21:59.163 08:35:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:21:59.163 08:35:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:59.163 
08:35:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:21:59.163 08:35:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:59.163 08:35:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:59.163 08:35:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:21:59.163 08:35:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:59.163 08:35:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:59.163 08:35:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:21:59.163 08:35:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:21:59.163 08:35:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:21:59.163 08:35:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:21:59.163 08:35:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:21:59.163 08:35:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:21:59.163 08:35:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:21:59.163 08:35:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:21:59.163 08:35:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:21:59.163 08:35:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:21:59.163 08:35:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:21:59.163 08:35:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=1526451 00:21:59.163 08:35:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 1526451 /var/tmp/spdk-raid.sock 00:21:59.163 08:35:11 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1526451 ']' 00:21:59.163 08:35:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:21:59.163 08:35:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:59.163 08:35:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:59.163 08:35:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:59.163 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:59.163 08:35:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:59.163 08:35:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:59.163 [2024-07-23 08:35:11.638640] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:21:59.163 [2024-07-23 08:35:11.638736] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1526451 ] 00:21:59.163 I/O size of 3145728 is greater than zero copy threshold (65536). 00:21:59.163 Zero copy mechanism will not be used. 
00:21:59.422 [2024-07-23 08:35:11.762491] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:59.684 [2024-07-23 08:35:11.991015] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:59.942 [2024-07-23 08:35:12.263019] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:59.942 [2024-07-23 08:35:12.263049] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:59.942 08:35:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:59.942 08:35:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:21:59.942 08:35:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:59.942 08:35:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:00.200 BaseBdev1_malloc 00:22:00.200 08:35:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:00.459 [2024-07-23 08:35:12.769711] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:00.459 [2024-07-23 08:35:12.769764] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:00.459 [2024-07-23 08:35:12.769785] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034880 00:22:00.459 [2024-07-23 08:35:12.769799] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:00.459 [2024-07-23 08:35:12.771711] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:00.459 [2024-07-23 08:35:12.771747] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:00.459 BaseBdev1 
00:22:00.459 08:35:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:00.459 08:35:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:00.459 BaseBdev2_malloc 00:22:00.718 08:35:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:22:00.718 [2024-07-23 08:35:13.143828] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:22:00.718 [2024-07-23 08:35:13.143885] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:00.718 [2024-07-23 08:35:13.143905] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035480 00:22:00.718 [2024-07-23 08:35:13.143917] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:00.718 [2024-07-23 08:35:13.145829] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:00.718 [2024-07-23 08:35:13.145861] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:00.718 BaseBdev2 00:22:00.718 08:35:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:22:00.977 spare_malloc 00:22:00.977 08:35:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:22:01.236 spare_delay 00:22:01.236 08:35:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:01.236 [2024-07-23 08:35:13.689384] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:01.236 [2024-07-23 08:35:13.689444] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:01.236 [2024-07-23 08:35:13.689465] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036680 00:22:01.236 [2024-07-23 08:35:13.689476] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:01.236 [2024-07-23 08:35:13.691483] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:01.236 [2024-07-23 08:35:13.691516] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:01.236 spare 00:22:01.236 08:35:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:22:01.495 [2024-07-23 08:35:13.865903] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:01.495 [2024-07-23 08:35:13.867557] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:01.495 [2024-07-23 08:35:13.867760] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000036c80 00:22:01.495 [2024-07-23 08:35:13.867779] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:01.495 [2024-07-23 08:35:13.868072] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bec0 00:22:01.495 [2024-07-23 08:35:13.868279] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000036c80 00:22:01.495 [2024-07-23 08:35:13.868289] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with 
name raid_bdev1, raid_bdev 0x616000036c80 00:22:01.495 [2024-07-23 08:35:13.868454] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:01.495 08:35:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:01.495 08:35:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:01.495 08:35:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:01.495 08:35:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:01.495 08:35:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:01.495 08:35:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:01.495 08:35:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:01.495 08:35:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:01.495 08:35:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:01.495 08:35:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:01.495 08:35:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:01.495 08:35:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:01.754 08:35:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:01.754 "name": "raid_bdev1", 00:22:01.754 "uuid": "5ef5eea1-f5ea-400c-98cd-a13b52d07fed", 00:22:01.754 "strip_size_kb": 0, 00:22:01.754 "state": "online", 00:22:01.754 "raid_level": "raid1", 00:22:01.754 "superblock": true, 00:22:01.754 "num_base_bdevs": 2, 00:22:01.754 "num_base_bdevs_discovered": 2, 
00:22:01.754 "num_base_bdevs_operational": 2, 00:22:01.754 "base_bdevs_list": [ 00:22:01.754 { 00:22:01.754 "name": "BaseBdev1", 00:22:01.754 "uuid": "be5c8440-841d-569e-b03a-bce42fc2047d", 00:22:01.754 "is_configured": true, 00:22:01.754 "data_offset": 2048, 00:22:01.754 "data_size": 63488 00:22:01.754 }, 00:22:01.754 { 00:22:01.754 "name": "BaseBdev2", 00:22:01.754 "uuid": "45750116-56f6-5593-9a17-0e98f4b2faa6", 00:22:01.754 "is_configured": true, 00:22:01.754 "data_offset": 2048, 00:22:01.754 "data_size": 63488 00:22:01.754 } 00:22:01.754 ] 00:22:01.754 }' 00:22:01.754 08:35:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:01.754 08:35:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:02.322 08:35:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:02.322 08:35:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:22:02.322 [2024-07-23 08:35:14.704344] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:02.322 08:35:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:22:02.322 08:35:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:02.322 08:35:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:22:02.592 08:35:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:22:02.592 08:35:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:22:02.592 08:35:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:22:02.592 08:35:14 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@624 -- # local write_unit_size 00:22:02.593 08:35:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:22:02.593 08:35:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:02.593 08:35:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:22:02.593 08:35:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:02.593 08:35:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:22:02.593 08:35:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:02.593 08:35:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:22:02.593 08:35:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:02.593 08:35:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:02.593 08:35:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:22:02.593 [2024-07-23 08:35:15.097161] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c060 00:22:02.868 /dev/nbd0 00:22:02.868 08:35:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:02.868 08:35:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:02.868 08:35:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:22:02.868 08:35:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:22:02.868 08:35:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:02.868 08:35:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:02.868 
08:35:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:22:02.868 08:35:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:22:02.868 08:35:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:02.868 08:35:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:02.868 08:35:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:02.868 1+0 records in 00:22:02.868 1+0 records out 00:22:02.868 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000236705 s, 17.3 MB/s 00:22:02.868 08:35:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:02.868 08:35:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:22:02.868 08:35:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:02.868 08:35:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:02.868 08:35:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:22:02.868 08:35:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:02.868 08:35:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:02.868 08:35:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:22:02.868 08:35:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:22:02.868 08:35:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:22:06.151 63488+0 records in 00:22:06.151 63488+0 records 
out 00:22:06.151 32505856 bytes (33 MB, 31 MiB) copied, 3.37782 s, 9.6 MB/s 00:22:06.151 08:35:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:22:06.151 08:35:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:06.151 08:35:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:22:06.151 08:35:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:06.151 08:35:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:22:06.151 08:35:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:06.151 08:35:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:06.410 08:35:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:06.410 08:35:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:06.410 08:35:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:06.410 08:35:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:06.410 [2024-07-23 08:35:18.756388] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:06.410 08:35:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:06.410 08:35:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:06.410 08:35:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:22:06.410 08:35:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:22:06.410 08:35:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:22:06.410 [2024-07-23 08:35:18.924909] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:06.668 08:35:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:06.668 08:35:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:06.668 08:35:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:06.668 08:35:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:06.668 08:35:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:06.668 08:35:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:06.668 08:35:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:06.668 08:35:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:06.668 08:35:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:06.668 08:35:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:06.668 08:35:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:06.668 08:35:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:06.668 08:35:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:06.668 "name": "raid_bdev1", 00:22:06.668 "uuid": "5ef5eea1-f5ea-400c-98cd-a13b52d07fed", 00:22:06.668 "strip_size_kb": 0, 00:22:06.668 "state": "online", 00:22:06.668 "raid_level": "raid1", 00:22:06.668 "superblock": true, 00:22:06.668 "num_base_bdevs": 2, 00:22:06.668 
"num_base_bdevs_discovered": 1, 00:22:06.668 "num_base_bdevs_operational": 1, 00:22:06.668 "base_bdevs_list": [ 00:22:06.668 { 00:22:06.668 "name": null, 00:22:06.668 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:06.668 "is_configured": false, 00:22:06.668 "data_offset": 2048, 00:22:06.668 "data_size": 63488 00:22:06.668 }, 00:22:06.668 { 00:22:06.668 "name": "BaseBdev2", 00:22:06.668 "uuid": "45750116-56f6-5593-9a17-0e98f4b2faa6", 00:22:06.668 "is_configured": true, 00:22:06.668 "data_offset": 2048, 00:22:06.668 "data_size": 63488 00:22:06.668 } 00:22:06.668 ] 00:22:06.668 }' 00:22:06.668 08:35:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:06.668 08:35:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:07.234 08:35:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:07.493 [2024-07-23 08:35:19.783195] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:07.493 [2024-07-23 08:35:19.802222] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000ca9410 00:22:07.493 [2024-07-23 08:35:19.803829] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:07.493 08:35:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:22:08.429 08:35:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:08.429 08:35:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:08.429 08:35:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:08.429 08:35:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:08.429 08:35:20 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:08.429 08:35:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:08.430 08:35:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:08.689 08:35:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:08.689 "name": "raid_bdev1", 00:22:08.689 "uuid": "5ef5eea1-f5ea-400c-98cd-a13b52d07fed", 00:22:08.689 "strip_size_kb": 0, 00:22:08.689 "state": "online", 00:22:08.689 "raid_level": "raid1", 00:22:08.689 "superblock": true, 00:22:08.689 "num_base_bdevs": 2, 00:22:08.689 "num_base_bdevs_discovered": 2, 00:22:08.689 "num_base_bdevs_operational": 2, 00:22:08.689 "process": { 00:22:08.689 "type": "rebuild", 00:22:08.689 "target": "spare", 00:22:08.689 "progress": { 00:22:08.689 "blocks": 22528, 00:22:08.689 "percent": 35 00:22:08.689 } 00:22:08.689 }, 00:22:08.689 "base_bdevs_list": [ 00:22:08.689 { 00:22:08.689 "name": "spare", 00:22:08.689 "uuid": "731445f2-409d-5b12-814c-59504afeb93b", 00:22:08.689 "is_configured": true, 00:22:08.689 "data_offset": 2048, 00:22:08.689 "data_size": 63488 00:22:08.689 }, 00:22:08.689 { 00:22:08.689 "name": "BaseBdev2", 00:22:08.689 "uuid": "45750116-56f6-5593-9a17-0e98f4b2faa6", 00:22:08.689 "is_configured": true, 00:22:08.689 "data_offset": 2048, 00:22:08.689 "data_size": 63488 00:22:08.689 } 00:22:08.689 ] 00:22:08.689 }' 00:22:08.689 08:35:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:08.689 08:35:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:08.689 08:35:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:08.689 08:35:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == 
\s\p\a\r\e ]] 00:22:08.689 08:35:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:08.948 [2024-07-23 08:35:21.229480] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:08.948 [2024-07-23 08:35:21.315617] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:08.948 [2024-07-23 08:35:21.315666] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:08.948 [2024-07-23 08:35:21.315681] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:08.948 [2024-07-23 08:35:21.315694] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:08.948 08:35:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:08.948 08:35:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:08.948 08:35:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:08.948 08:35:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:08.948 08:35:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:08.948 08:35:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:08.948 08:35:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:08.948 08:35:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:08.948 08:35:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:08.948 08:35:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:08.948 08:35:21 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:08.948 08:35:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:09.207 08:35:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:09.207 "name": "raid_bdev1", 00:22:09.207 "uuid": "5ef5eea1-f5ea-400c-98cd-a13b52d07fed", 00:22:09.207 "strip_size_kb": 0, 00:22:09.207 "state": "online", 00:22:09.207 "raid_level": "raid1", 00:22:09.207 "superblock": true, 00:22:09.207 "num_base_bdevs": 2, 00:22:09.207 "num_base_bdevs_discovered": 1, 00:22:09.207 "num_base_bdevs_operational": 1, 00:22:09.207 "base_bdevs_list": [ 00:22:09.207 { 00:22:09.207 "name": null, 00:22:09.207 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:09.207 "is_configured": false, 00:22:09.207 "data_offset": 2048, 00:22:09.207 "data_size": 63488 00:22:09.207 }, 00:22:09.207 { 00:22:09.207 "name": "BaseBdev2", 00:22:09.207 "uuid": "45750116-56f6-5593-9a17-0e98f4b2faa6", 00:22:09.207 "is_configured": true, 00:22:09.207 "data_offset": 2048, 00:22:09.207 "data_size": 63488 00:22:09.207 } 00:22:09.207 ] 00:22:09.207 }' 00:22:09.207 08:35:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:09.207 08:35:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:09.775 08:35:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:09.775 08:35:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:09.775 08:35:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:09.775 08:35:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:09.775 08:35:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- 
# local raid_bdev_info 00:22:09.775 08:35:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:09.775 08:35:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:09.775 08:35:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:09.775 "name": "raid_bdev1", 00:22:09.775 "uuid": "5ef5eea1-f5ea-400c-98cd-a13b52d07fed", 00:22:09.775 "strip_size_kb": 0, 00:22:09.775 "state": "online", 00:22:09.775 "raid_level": "raid1", 00:22:09.775 "superblock": true, 00:22:09.775 "num_base_bdevs": 2, 00:22:09.775 "num_base_bdevs_discovered": 1, 00:22:09.775 "num_base_bdevs_operational": 1, 00:22:09.775 "base_bdevs_list": [ 00:22:09.775 { 00:22:09.775 "name": null, 00:22:09.775 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:09.775 "is_configured": false, 00:22:09.775 "data_offset": 2048, 00:22:09.775 "data_size": 63488 00:22:09.775 }, 00:22:09.775 { 00:22:09.775 "name": "BaseBdev2", 00:22:09.775 "uuid": "45750116-56f6-5593-9a17-0e98f4b2faa6", 00:22:09.775 "is_configured": true, 00:22:09.775 "data_offset": 2048, 00:22:09.775 "data_size": 63488 00:22:09.775 } 00:22:09.775 ] 00:22:09.775 }' 00:22:09.775 08:35:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:09.775 08:35:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:09.775 08:35:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:09.775 08:35:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:09.775 08:35:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:10.034 [2024-07-23 
08:35:22.429934] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:10.034 [2024-07-23 08:35:22.448996] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000ca94e0 00:22:10.034 [2024-07-23 08:35:22.450601] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:10.034 08:35:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:22:10.972 08:35:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:10.972 08:35:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:10.972 08:35:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:10.972 08:35:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:10.972 08:35:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:10.972 08:35:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:10.972 08:35:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:11.231 08:35:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:11.231 "name": "raid_bdev1", 00:22:11.231 "uuid": "5ef5eea1-f5ea-400c-98cd-a13b52d07fed", 00:22:11.231 "strip_size_kb": 0, 00:22:11.231 "state": "online", 00:22:11.231 "raid_level": "raid1", 00:22:11.231 "superblock": true, 00:22:11.231 "num_base_bdevs": 2, 00:22:11.231 "num_base_bdevs_discovered": 2, 00:22:11.231 "num_base_bdevs_operational": 2, 00:22:11.231 "process": { 00:22:11.231 "type": "rebuild", 00:22:11.231 "target": "spare", 00:22:11.231 "progress": { 00:22:11.231 "blocks": 22528, 00:22:11.231 "percent": 35 00:22:11.231 } 00:22:11.231 
}, 00:22:11.231 "base_bdevs_list": [ 00:22:11.231 { 00:22:11.231 "name": "spare", 00:22:11.231 "uuid": "731445f2-409d-5b12-814c-59504afeb93b", 00:22:11.231 "is_configured": true, 00:22:11.231 "data_offset": 2048, 00:22:11.231 "data_size": 63488 00:22:11.231 }, 00:22:11.231 { 00:22:11.231 "name": "BaseBdev2", 00:22:11.231 "uuid": "45750116-56f6-5593-9a17-0e98f4b2faa6", 00:22:11.231 "is_configured": true, 00:22:11.231 "data_offset": 2048, 00:22:11.231 "data_size": 63488 00:22:11.231 } 00:22:11.231 ] 00:22:11.231 }' 00:22:11.231 08:35:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:11.231 08:35:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:11.231 08:35:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:11.231 08:35:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:11.231 08:35:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:22:11.231 08:35:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:22:11.231 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:22:11.231 08:35:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:22:11.231 08:35:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:22:11.231 08:35:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:22:11.231 08:35:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=676 00:22:11.231 08:35:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:11.231 08:35:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:11.231 08:35:23 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:11.231 08:35:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:11.231 08:35:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:11.231 08:35:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:11.231 08:35:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:11.232 08:35:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:11.490 08:35:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:11.490 "name": "raid_bdev1", 00:22:11.490 "uuid": "5ef5eea1-f5ea-400c-98cd-a13b52d07fed", 00:22:11.490 "strip_size_kb": 0, 00:22:11.490 "state": "online", 00:22:11.490 "raid_level": "raid1", 00:22:11.490 "superblock": true, 00:22:11.490 "num_base_bdevs": 2, 00:22:11.490 "num_base_bdevs_discovered": 2, 00:22:11.490 "num_base_bdevs_operational": 2, 00:22:11.490 "process": { 00:22:11.490 "type": "rebuild", 00:22:11.490 "target": "spare", 00:22:11.490 "progress": { 00:22:11.490 "blocks": 28672, 00:22:11.490 "percent": 45 00:22:11.490 } 00:22:11.490 }, 00:22:11.490 "base_bdevs_list": [ 00:22:11.490 { 00:22:11.490 "name": "spare", 00:22:11.490 "uuid": "731445f2-409d-5b12-814c-59504afeb93b", 00:22:11.490 "is_configured": true, 00:22:11.490 "data_offset": 2048, 00:22:11.490 "data_size": 63488 00:22:11.490 }, 00:22:11.490 { 00:22:11.490 "name": "BaseBdev2", 00:22:11.490 "uuid": "45750116-56f6-5593-9a17-0e98f4b2faa6", 00:22:11.491 "is_configured": true, 00:22:11.491 "data_offset": 2048, 00:22:11.491 "data_size": 63488 00:22:11.491 } 00:22:11.491 ] 00:22:11.491 }' 00:22:11.491 08:35:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:22:11.491 08:35:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:11.491 08:35:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:11.491 08:35:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:11.491 08:35:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:12.864 08:35:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:12.864 08:35:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:12.864 08:35:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:12.864 08:35:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:12.864 08:35:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:12.864 08:35:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:12.864 08:35:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:12.864 08:35:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:12.864 08:35:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:12.864 "name": "raid_bdev1", 00:22:12.864 "uuid": "5ef5eea1-f5ea-400c-98cd-a13b52d07fed", 00:22:12.864 "strip_size_kb": 0, 00:22:12.864 "state": "online", 00:22:12.864 "raid_level": "raid1", 00:22:12.864 "superblock": true, 00:22:12.864 "num_base_bdevs": 2, 00:22:12.864 "num_base_bdevs_discovered": 2, 00:22:12.864 "num_base_bdevs_operational": 2, 00:22:12.864 "process": { 00:22:12.864 "type": "rebuild", 00:22:12.864 "target": "spare", 
00:22:12.864 "progress": { 00:22:12.864 "blocks": 53248, 00:22:12.864 "percent": 83 00:22:12.864 } 00:22:12.864 }, 00:22:12.864 "base_bdevs_list": [ 00:22:12.864 { 00:22:12.864 "name": "spare", 00:22:12.864 "uuid": "731445f2-409d-5b12-814c-59504afeb93b", 00:22:12.864 "is_configured": true, 00:22:12.864 "data_offset": 2048, 00:22:12.864 "data_size": 63488 00:22:12.864 }, 00:22:12.864 { 00:22:12.864 "name": "BaseBdev2", 00:22:12.864 "uuid": "45750116-56f6-5593-9a17-0e98f4b2faa6", 00:22:12.864 "is_configured": true, 00:22:12.864 "data_offset": 2048, 00:22:12.864 "data_size": 63488 00:22:12.864 } 00:22:12.864 ] 00:22:12.864 }' 00:22:12.864 08:35:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:12.864 08:35:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:12.864 08:35:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:12.864 08:35:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:12.864 08:35:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:13.122 [2024-07-23 08:35:25.576309] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:22:13.122 [2024-07-23 08:35:25.576385] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:22:13.122 [2024-07-23 08:35:25.576489] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:14.056 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:14.056 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:14.056 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:14.056 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # 
local process_type=rebuild 00:22:14.056 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:14.056 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:14.056 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:14.056 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:14.056 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:14.056 "name": "raid_bdev1", 00:22:14.056 "uuid": "5ef5eea1-f5ea-400c-98cd-a13b52d07fed", 00:22:14.056 "strip_size_kb": 0, 00:22:14.056 "state": "online", 00:22:14.056 "raid_level": "raid1", 00:22:14.056 "superblock": true, 00:22:14.056 "num_base_bdevs": 2, 00:22:14.056 "num_base_bdevs_discovered": 2, 00:22:14.056 "num_base_bdevs_operational": 2, 00:22:14.056 "base_bdevs_list": [ 00:22:14.056 { 00:22:14.056 "name": "spare", 00:22:14.056 "uuid": "731445f2-409d-5b12-814c-59504afeb93b", 00:22:14.056 "is_configured": true, 00:22:14.056 "data_offset": 2048, 00:22:14.056 "data_size": 63488 00:22:14.056 }, 00:22:14.056 { 00:22:14.056 "name": "BaseBdev2", 00:22:14.056 "uuid": "45750116-56f6-5593-9a17-0e98f4b2faa6", 00:22:14.056 "is_configured": true, 00:22:14.056 "data_offset": 2048, 00:22:14.056 "data_size": 63488 00:22:14.056 } 00:22:14.056 ] 00:22:14.056 }' 00:22:14.056 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:14.056 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:22:14.056 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:14.056 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:22:14.056 
08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:22:14.056 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:14.056 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:14.056 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:14.056 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:14.056 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:14.056 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:14.056 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:14.314 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:14.314 "name": "raid_bdev1", 00:22:14.314 "uuid": "5ef5eea1-f5ea-400c-98cd-a13b52d07fed", 00:22:14.314 "strip_size_kb": 0, 00:22:14.314 "state": "online", 00:22:14.314 "raid_level": "raid1", 00:22:14.314 "superblock": true, 00:22:14.314 "num_base_bdevs": 2, 00:22:14.314 "num_base_bdevs_discovered": 2, 00:22:14.314 "num_base_bdevs_operational": 2, 00:22:14.314 "base_bdevs_list": [ 00:22:14.314 { 00:22:14.314 "name": "spare", 00:22:14.314 "uuid": "731445f2-409d-5b12-814c-59504afeb93b", 00:22:14.314 "is_configured": true, 00:22:14.314 "data_offset": 2048, 00:22:14.314 "data_size": 63488 00:22:14.314 }, 00:22:14.314 { 00:22:14.314 "name": "BaseBdev2", 00:22:14.314 "uuid": "45750116-56f6-5593-9a17-0e98f4b2faa6", 00:22:14.314 "is_configured": true, 00:22:14.314 "data_offset": 2048, 00:22:14.314 "data_size": 63488 00:22:14.314 } 00:22:14.314 ] 00:22:14.314 }' 00:22:14.314 08:35:26 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:14.314 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:14.314 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:14.314 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:14.314 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:14.314 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:14.314 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:14.314 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:14.314 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:14.314 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:14.314 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:14.314 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:14.314 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:14.314 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:14.314 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:14.314 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:14.572 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:14.572 "name": "raid_bdev1", 00:22:14.572 "uuid": 
"5ef5eea1-f5ea-400c-98cd-a13b52d07fed", 00:22:14.572 "strip_size_kb": 0, 00:22:14.572 "state": "online", 00:22:14.572 "raid_level": "raid1", 00:22:14.572 "superblock": true, 00:22:14.572 "num_base_bdevs": 2, 00:22:14.572 "num_base_bdevs_discovered": 2, 00:22:14.572 "num_base_bdevs_operational": 2, 00:22:14.572 "base_bdevs_list": [ 00:22:14.572 { 00:22:14.572 "name": "spare", 00:22:14.572 "uuid": "731445f2-409d-5b12-814c-59504afeb93b", 00:22:14.572 "is_configured": true, 00:22:14.572 "data_offset": 2048, 00:22:14.572 "data_size": 63488 00:22:14.572 }, 00:22:14.572 { 00:22:14.572 "name": "BaseBdev2", 00:22:14.572 "uuid": "45750116-56f6-5593-9a17-0e98f4b2faa6", 00:22:14.572 "is_configured": true, 00:22:14.572 "data_offset": 2048, 00:22:14.572 "data_size": 63488 00:22:14.572 } 00:22:14.572 ] 00:22:14.572 }' 00:22:14.572 08:35:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:14.572 08:35:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:15.138 08:35:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:15.138 [2024-07-23 08:35:27.540312] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:15.138 [2024-07-23 08:35:27.540343] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:15.138 [2024-07-23 08:35:27.540412] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:15.138 [2024-07-23 08:35:27.540470] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:15.138 [2024-07-23 08:35:27.540481] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036c80 name raid_bdev1, state offline 00:22:15.138 08:35:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:15.138 08:35:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:22:15.396 08:35:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:22:15.396 08:35:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:22:15.396 08:35:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:22:15.396 08:35:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:22:15.396 08:35:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:15.396 08:35:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:22:15.396 08:35:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:15.396 08:35:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:15.396 08:35:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:15.396 08:35:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:22:15.397 08:35:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:15.397 08:35:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:15.397 08:35:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:22:15.655 /dev/nbd0 00:22:15.655 08:35:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:15.655 08:35:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:15.655 08:35:27 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:22:15.655 08:35:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:22:15.655 08:35:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:15.655 08:35:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:15.655 08:35:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:22:15.655 08:35:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:22:15.655 08:35:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:15.655 08:35:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:15.655 08:35:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:15.655 1+0 records in 00:22:15.655 1+0 records out 00:22:15.655 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000229299 s, 17.9 MB/s 00:22:15.655 08:35:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:15.655 08:35:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:22:15.656 08:35:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:15.656 08:35:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:15.656 08:35:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:22:15.656 08:35:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:15.656 08:35:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:15.656 08:35:27 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:22:15.656 /dev/nbd1 00:22:15.656 08:35:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:22:15.656 08:35:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:22:15.656 08:35:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:22:15.656 08:35:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:22:15.656 08:35:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:15.656 08:35:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:15.656 08:35:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:22:15.656 08:35:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:22:15.656 08:35:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:15.656 08:35:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:15.656 08:35:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:15.656 1+0 records in 00:22:15.656 1+0 records out 00:22:15.656 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000199425 s, 20.5 MB/s 00:22:15.914 08:35:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:15.914 08:35:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:22:15.914 08:35:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:15.914 08:35:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:15.914 08:35:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:22:15.914 08:35:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:15.914 08:35:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:15.914 08:35:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:22:15.914 08:35:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:22:15.914 08:35:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:15.914 08:35:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:15.914 08:35:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:15.914 08:35:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:22:15.914 08:35:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:15.914 08:35:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:16.172 08:35:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:16.172 08:35:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:16.172 08:35:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:16.172 08:35:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:16.172 08:35:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:16.172 08:35:28 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:16.172 08:35:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:22:16.172 08:35:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:22:16.172 08:35:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:16.172 08:35:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:22:16.430 08:35:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:22:16.430 08:35:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:22:16.430 08:35:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:22:16.430 08:35:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:16.430 08:35:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:16.430 08:35:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:22:16.430 08:35:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:22:16.430 08:35:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:22:16.430 08:35:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:22:16.430 08:35:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:16.430 08:35:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:16.687 [2024-07-23 08:35:29.068831] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: 
Match on spare_delay 00:22:16.687 [2024-07-23 08:35:29.068887] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:16.687 [2024-07-23 08:35:29.068908] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000038480 00:22:16.687 [2024-07-23 08:35:29.068918] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:16.687 [2024-07-23 08:35:29.070849] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:16.687 [2024-07-23 08:35:29.070875] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:16.687 [2024-07-23 08:35:29.070963] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:22:16.687 [2024-07-23 08:35:29.071017] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:16.688 [2024-07-23 08:35:29.071183] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:16.688 spare 00:22:16.688 08:35:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:16.688 08:35:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:16.688 08:35:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:16.688 08:35:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:16.688 08:35:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:16.688 08:35:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:16.688 08:35:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:16.688 08:35:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:16.688 08:35:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- 
# local num_base_bdevs_discovered 00:22:16.688 08:35:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:16.688 08:35:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:16.688 08:35:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:16.688 [2024-07-23 08:35:29.171516] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000038a80 00:22:16.688 [2024-07-23 08:35:29.171547] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:16.688 [2024-07-23 08:35:29.171846] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000cc7b90 00:22:16.688 [2024-07-23 08:35:29.172054] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000038a80 00:22:16.688 [2024-07-23 08:35:29.172065] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000038a80 00:22:16.688 [2024-07-23 08:35:29.172231] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:16.945 08:35:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:16.945 "name": "raid_bdev1", 00:22:16.945 "uuid": "5ef5eea1-f5ea-400c-98cd-a13b52d07fed", 00:22:16.945 "strip_size_kb": 0, 00:22:16.945 "state": "online", 00:22:16.945 "raid_level": "raid1", 00:22:16.945 "superblock": true, 00:22:16.945 "num_base_bdevs": 2, 00:22:16.945 "num_base_bdevs_discovered": 2, 00:22:16.945 "num_base_bdevs_operational": 2, 00:22:16.945 "base_bdevs_list": [ 00:22:16.945 { 00:22:16.945 "name": "spare", 00:22:16.945 "uuid": "731445f2-409d-5b12-814c-59504afeb93b", 00:22:16.945 "is_configured": true, 00:22:16.945 "data_offset": 2048, 00:22:16.945 "data_size": 63488 00:22:16.945 }, 00:22:16.945 { 00:22:16.945 "name": "BaseBdev2", 
00:22:16.945 "uuid": "45750116-56f6-5593-9a17-0e98f4b2faa6", 00:22:16.945 "is_configured": true, 00:22:16.945 "data_offset": 2048, 00:22:16.945 "data_size": 63488 00:22:16.945 } 00:22:16.945 ] 00:22:16.945 }' 00:22:16.945 08:35:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:16.945 08:35:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:17.513 08:35:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:17.513 08:35:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:17.513 08:35:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:17.513 08:35:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:17.513 08:35:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:17.513 08:35:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:17.513 08:35:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:17.513 08:35:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:17.513 "name": "raid_bdev1", 00:22:17.513 "uuid": "5ef5eea1-f5ea-400c-98cd-a13b52d07fed", 00:22:17.513 "strip_size_kb": 0, 00:22:17.513 "state": "online", 00:22:17.513 "raid_level": "raid1", 00:22:17.513 "superblock": true, 00:22:17.513 "num_base_bdevs": 2, 00:22:17.513 "num_base_bdevs_discovered": 2, 00:22:17.513 "num_base_bdevs_operational": 2, 00:22:17.513 "base_bdevs_list": [ 00:22:17.513 { 00:22:17.513 "name": "spare", 00:22:17.513 "uuid": "731445f2-409d-5b12-814c-59504afeb93b", 00:22:17.513 "is_configured": true, 00:22:17.513 "data_offset": 2048, 00:22:17.513 "data_size": 63488 00:22:17.513 }, 
00:22:17.513 { 00:22:17.513 "name": "BaseBdev2", 00:22:17.513 "uuid": "45750116-56f6-5593-9a17-0e98f4b2faa6", 00:22:17.513 "is_configured": true, 00:22:17.513 "data_offset": 2048, 00:22:17.513 "data_size": 63488 00:22:17.513 } 00:22:17.513 ] 00:22:17.513 }' 00:22:17.513 08:35:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:17.513 08:35:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:17.513 08:35:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:17.806 08:35:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:17.806 08:35:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:17.806 08:35:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:22:17.806 08:35:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:22:17.806 08:35:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:18.074 [2024-07-23 08:35:30.380328] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:18.074 08:35:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:18.074 08:35:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:18.074 08:35:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:18.074 08:35:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:18.074 08:35:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local 
strip_size=0 00:22:18.074 08:35:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:18.074 08:35:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:18.074 08:35:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:18.074 08:35:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:18.074 08:35:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:18.074 08:35:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:18.074 08:35:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:18.074 08:35:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:18.074 "name": "raid_bdev1", 00:22:18.074 "uuid": "5ef5eea1-f5ea-400c-98cd-a13b52d07fed", 00:22:18.074 "strip_size_kb": 0, 00:22:18.074 "state": "online", 00:22:18.074 "raid_level": "raid1", 00:22:18.074 "superblock": true, 00:22:18.074 "num_base_bdevs": 2, 00:22:18.074 "num_base_bdevs_discovered": 1, 00:22:18.074 "num_base_bdevs_operational": 1, 00:22:18.074 "base_bdevs_list": [ 00:22:18.074 { 00:22:18.074 "name": null, 00:22:18.074 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:18.074 "is_configured": false, 00:22:18.074 "data_offset": 2048, 00:22:18.074 "data_size": 63488 00:22:18.074 }, 00:22:18.074 { 00:22:18.074 "name": "BaseBdev2", 00:22:18.074 "uuid": "45750116-56f6-5593-9a17-0e98f4b2faa6", 00:22:18.074 "is_configured": true, 00:22:18.074 "data_offset": 2048, 00:22:18.074 "data_size": 63488 00:22:18.074 } 00:22:18.074 ] 00:22:18.074 }' 00:22:18.074 08:35:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:18.074 08:35:30 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:22:18.640 08:35:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:18.898 [2024-07-23 08:35:31.194497] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:18.898 [2024-07-23 08:35:31.194687] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:22:18.898 [2024-07-23 08:35:31.194706] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:22:18.898 [2024-07-23 08:35:31.194733] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:18.898 [2024-07-23 08:35:31.213534] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000cc7c60 00:22:18.898 [2024-07-23 08:35:31.215134] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:18.898 08:35:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:22:19.834 08:35:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:19.834 08:35:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:19.834 08:35:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:19.834 08:35:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:19.834 08:35:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:19.834 08:35:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:19.834 08:35:32 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:20.092 08:35:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:20.092 "name": "raid_bdev1", 00:22:20.092 "uuid": "5ef5eea1-f5ea-400c-98cd-a13b52d07fed", 00:22:20.092 "strip_size_kb": 0, 00:22:20.092 "state": "online", 00:22:20.092 "raid_level": "raid1", 00:22:20.092 "superblock": true, 00:22:20.092 "num_base_bdevs": 2, 00:22:20.093 "num_base_bdevs_discovered": 2, 00:22:20.093 "num_base_bdevs_operational": 2, 00:22:20.093 "process": { 00:22:20.093 "type": "rebuild", 00:22:20.093 "target": "spare", 00:22:20.093 "progress": { 00:22:20.093 "blocks": 22528, 00:22:20.093 "percent": 35 00:22:20.093 } 00:22:20.093 }, 00:22:20.093 "base_bdevs_list": [ 00:22:20.093 { 00:22:20.093 "name": "spare", 00:22:20.093 "uuid": "731445f2-409d-5b12-814c-59504afeb93b", 00:22:20.093 "is_configured": true, 00:22:20.093 "data_offset": 2048, 00:22:20.093 "data_size": 63488 00:22:20.093 }, 00:22:20.093 { 00:22:20.093 "name": "BaseBdev2", 00:22:20.093 "uuid": "45750116-56f6-5593-9a17-0e98f4b2faa6", 00:22:20.093 "is_configured": true, 00:22:20.093 "data_offset": 2048, 00:22:20.093 "data_size": 63488 00:22:20.093 } 00:22:20.093 ] 00:22:20.093 }' 00:22:20.093 08:35:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:20.093 08:35:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:20.093 08:35:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:20.093 08:35:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:20.093 08:35:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:20.351 [2024-07-23 08:35:32.640687] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: 
*DEBUG*: spare 00:22:20.351 [2024-07-23 08:35:32.727015] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:20.351 [2024-07-23 08:35:32.727067] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:20.351 [2024-07-23 08:35:32.727081] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:20.351 [2024-07-23 08:35:32.727090] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:20.351 08:35:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:20.351 08:35:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:20.351 08:35:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:20.351 08:35:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:20.351 08:35:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:20.351 08:35:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:20.351 08:35:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:20.351 08:35:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:20.351 08:35:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:20.351 08:35:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:20.351 08:35:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:20.351 08:35:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:20.610 08:35:32 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:20.610 "name": "raid_bdev1", 00:22:20.610 "uuid": "5ef5eea1-f5ea-400c-98cd-a13b52d07fed", 00:22:20.610 "strip_size_kb": 0, 00:22:20.610 "state": "online", 00:22:20.610 "raid_level": "raid1", 00:22:20.610 "superblock": true, 00:22:20.610 "num_base_bdevs": 2, 00:22:20.610 "num_base_bdevs_discovered": 1, 00:22:20.610 "num_base_bdevs_operational": 1, 00:22:20.610 "base_bdevs_list": [ 00:22:20.610 { 00:22:20.610 "name": null, 00:22:20.610 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:20.610 "is_configured": false, 00:22:20.610 "data_offset": 2048, 00:22:20.610 "data_size": 63488 00:22:20.610 }, 00:22:20.610 { 00:22:20.610 "name": "BaseBdev2", 00:22:20.610 "uuid": "45750116-56f6-5593-9a17-0e98f4b2faa6", 00:22:20.610 "is_configured": true, 00:22:20.610 "data_offset": 2048, 00:22:20.610 "data_size": 63488 00:22:20.610 } 00:22:20.610 ] 00:22:20.610 }' 00:22:20.610 08:35:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:20.610 08:35:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:21.176 08:35:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:21.176 [2024-07-23 08:35:33.576239] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:21.176 [2024-07-23 08:35:33.576301] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:21.176 [2024-07-23 08:35:33.576323] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000039080 00:22:21.176 [2024-07-23 08:35:33.576335] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:21.176 [2024-07-23 08:35:33.576810] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:21.176 [2024-07-23 
08:35:33.576835] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:21.176 [2024-07-23 08:35:33.576919] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:22:21.176 [2024-07-23 08:35:33.576934] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:22:21.176 [2024-07-23 08:35:33.576947] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:22:21.176 [2024-07-23 08:35:33.576969] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:21.176 [2024-07-23 08:35:33.593657] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000cc7d30 00:22:21.176 spare 00:22:21.176 [2024-07-23 08:35:33.595296] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:21.176 08:35:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:22:22.110 08:35:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:22.110 08:35:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:22.110 08:35:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:22.110 08:35:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:22.110 08:35:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:22.110 08:35:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:22.110 08:35:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:22.368 08:35:34 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:22.368 "name": "raid_bdev1", 00:22:22.368 "uuid": "5ef5eea1-f5ea-400c-98cd-a13b52d07fed", 00:22:22.368 "strip_size_kb": 0, 00:22:22.368 "state": "online", 00:22:22.368 "raid_level": "raid1", 00:22:22.368 "superblock": true, 00:22:22.368 "num_base_bdevs": 2, 00:22:22.368 "num_base_bdevs_discovered": 2, 00:22:22.368 "num_base_bdevs_operational": 2, 00:22:22.368 "process": { 00:22:22.368 "type": "rebuild", 00:22:22.368 "target": "spare", 00:22:22.368 "progress": { 00:22:22.368 "blocks": 22528, 00:22:22.368 "percent": 35 00:22:22.368 } 00:22:22.368 }, 00:22:22.368 "base_bdevs_list": [ 00:22:22.368 { 00:22:22.368 "name": "spare", 00:22:22.368 "uuid": "731445f2-409d-5b12-814c-59504afeb93b", 00:22:22.368 "is_configured": true, 00:22:22.368 "data_offset": 2048, 00:22:22.369 "data_size": 63488 00:22:22.369 }, 00:22:22.369 { 00:22:22.369 "name": "BaseBdev2", 00:22:22.369 "uuid": "45750116-56f6-5593-9a17-0e98f4b2faa6", 00:22:22.369 "is_configured": true, 00:22:22.369 "data_offset": 2048, 00:22:22.369 "data_size": 63488 00:22:22.369 } 00:22:22.369 ] 00:22:22.369 }' 00:22:22.369 08:35:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:22.369 08:35:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:22.369 08:35:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:22.369 08:35:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:22.369 08:35:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:22.627 [2024-07-23 08:35:35.033023] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:22.627 [2024-07-23 08:35:35.107274] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild 
on raid bdev raid_bdev1: No such device 00:22:22.627 [2024-07-23 08:35:35.107322] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:22.627 [2024-07-23 08:35:35.107338] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:22.627 [2024-07-23 08:35:35.107347] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:22.885 08:35:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:22.885 08:35:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:22.885 08:35:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:22.885 08:35:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:22.885 08:35:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:22.885 08:35:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:22.885 08:35:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:22.885 08:35:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:22.885 08:35:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:22.885 08:35:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:22.885 08:35:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:22.885 08:35:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:22.885 08:35:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:22.885 "name": "raid_bdev1", 00:22:22.885 "uuid": 
"5ef5eea1-f5ea-400c-98cd-a13b52d07fed", 00:22:22.885 "strip_size_kb": 0, 00:22:22.885 "state": "online", 00:22:22.885 "raid_level": "raid1", 00:22:22.885 "superblock": true, 00:22:22.885 "num_base_bdevs": 2, 00:22:22.885 "num_base_bdevs_discovered": 1, 00:22:22.885 "num_base_bdevs_operational": 1, 00:22:22.885 "base_bdevs_list": [ 00:22:22.885 { 00:22:22.885 "name": null, 00:22:22.885 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:22.885 "is_configured": false, 00:22:22.885 "data_offset": 2048, 00:22:22.885 "data_size": 63488 00:22:22.885 }, 00:22:22.885 { 00:22:22.885 "name": "BaseBdev2", 00:22:22.885 "uuid": "45750116-56f6-5593-9a17-0e98f4b2faa6", 00:22:22.885 "is_configured": true, 00:22:22.885 "data_offset": 2048, 00:22:22.885 "data_size": 63488 00:22:22.885 } 00:22:22.885 ] 00:22:22.885 }' 00:22:22.885 08:35:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:22.885 08:35:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:23.451 08:35:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:23.451 08:35:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:23.451 08:35:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:23.451 08:35:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:23.451 08:35:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:23.451 08:35:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:23.451 08:35:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:23.451 08:35:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 
00:22:23.451 "name": "raid_bdev1", 00:22:23.451 "uuid": "5ef5eea1-f5ea-400c-98cd-a13b52d07fed", 00:22:23.451 "strip_size_kb": 0, 00:22:23.451 "state": "online", 00:22:23.451 "raid_level": "raid1", 00:22:23.451 "superblock": true, 00:22:23.451 "num_base_bdevs": 2, 00:22:23.451 "num_base_bdevs_discovered": 1, 00:22:23.452 "num_base_bdevs_operational": 1, 00:22:23.452 "base_bdevs_list": [ 00:22:23.452 { 00:22:23.452 "name": null, 00:22:23.452 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:23.452 "is_configured": false, 00:22:23.452 "data_offset": 2048, 00:22:23.452 "data_size": 63488 00:22:23.452 }, 00:22:23.452 { 00:22:23.452 "name": "BaseBdev2", 00:22:23.452 "uuid": "45750116-56f6-5593-9a17-0e98f4b2faa6", 00:22:23.452 "is_configured": true, 00:22:23.452 "data_offset": 2048, 00:22:23.452 "data_size": 63488 00:22:23.452 } 00:22:23.452 ] 00:22:23.452 }' 00:22:23.452 08:35:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:23.710 08:35:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:23.710 08:35:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:23.710 08:35:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:23.710 08:35:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:22:23.710 08:35:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:23.968 [2024-07-23 08:35:36.370593] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:23.968 [2024-07-23 08:35:36.370655] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:23.968 
[2024-07-23 08:35:36.370678] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000039680 00:22:23.968 [2024-07-23 08:35:36.370688] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:23.968 [2024-07-23 08:35:36.371143] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:23.968 [2024-07-23 08:35:36.371162] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:23.968 [2024-07-23 08:35:36.371241] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:22:23.968 [2024-07-23 08:35:36.371254] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:22:23.968 [2024-07-23 08:35:36.371266] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:22:23.968 BaseBdev1 00:22:23.968 08:35:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:22:24.902 08:35:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:24.902 08:35:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:24.902 08:35:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:24.902 08:35:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:24.902 08:35:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:24.902 08:35:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:24.902 08:35:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:24.902 08:35:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:24.902 08:35:37 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:24.902 08:35:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:24.902 08:35:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:24.902 08:35:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:25.160 08:35:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:25.160 "name": "raid_bdev1", 00:22:25.160 "uuid": "5ef5eea1-f5ea-400c-98cd-a13b52d07fed", 00:22:25.160 "strip_size_kb": 0, 00:22:25.160 "state": "online", 00:22:25.160 "raid_level": "raid1", 00:22:25.160 "superblock": true, 00:22:25.160 "num_base_bdevs": 2, 00:22:25.160 "num_base_bdevs_discovered": 1, 00:22:25.160 "num_base_bdevs_operational": 1, 00:22:25.160 "base_bdevs_list": [ 00:22:25.160 { 00:22:25.160 "name": null, 00:22:25.160 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:25.160 "is_configured": false, 00:22:25.160 "data_offset": 2048, 00:22:25.160 "data_size": 63488 00:22:25.160 }, 00:22:25.160 { 00:22:25.160 "name": "BaseBdev2", 00:22:25.160 "uuid": "45750116-56f6-5593-9a17-0e98f4b2faa6", 00:22:25.160 "is_configured": true, 00:22:25.160 "data_offset": 2048, 00:22:25.160 "data_size": 63488 00:22:25.160 } 00:22:25.160 ] 00:22:25.160 }' 00:22:25.160 08:35:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:25.160 08:35:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:25.726 08:35:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:25.726 08:35:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:25.726 08:35:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- 
# local process_type=none 00:22:25.726 08:35:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:25.726 08:35:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:25.726 08:35:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:25.726 08:35:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:25.726 08:35:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:25.726 "name": "raid_bdev1", 00:22:25.726 "uuid": "5ef5eea1-f5ea-400c-98cd-a13b52d07fed", 00:22:25.726 "strip_size_kb": 0, 00:22:25.726 "state": "online", 00:22:25.726 "raid_level": "raid1", 00:22:25.726 "superblock": true, 00:22:25.726 "num_base_bdevs": 2, 00:22:25.726 "num_base_bdevs_discovered": 1, 00:22:25.726 "num_base_bdevs_operational": 1, 00:22:25.726 "base_bdevs_list": [ 00:22:25.726 { 00:22:25.726 "name": null, 00:22:25.726 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:25.726 "is_configured": false, 00:22:25.726 "data_offset": 2048, 00:22:25.726 "data_size": 63488 00:22:25.726 }, 00:22:25.726 { 00:22:25.726 "name": "BaseBdev2", 00:22:25.726 "uuid": "45750116-56f6-5593-9a17-0e98f4b2faa6", 00:22:25.726 "is_configured": true, 00:22:25.726 "data_offset": 2048, 00:22:25.726 "data_size": 63488 00:22:25.726 } 00:22:25.726 ] 00:22:25.726 }' 00:22:25.726 08:35:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:25.984 08:35:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:25.984 08:35:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:25.984 08:35:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:25.984 08:35:38 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:25.984 08:35:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:22:25.984 08:35:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:25.984 08:35:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:25.984 08:35:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:25.984 08:35:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:25.984 08:35:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:25.984 08:35:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:25.984 08:35:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:25.984 08:35:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:25.984 08:35:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:22:25.984 08:35:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:25.984 [2024-07-23 08:35:38.476322] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:25.984 [2024-07-23 08:35:38.476479] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:22:25.985 [2024-07-23 08:35:38.476493] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:22:25.985 request: 00:22:25.985 { 00:22:25.985 "base_bdev": "BaseBdev1", 00:22:25.985 "raid_bdev": "raid_bdev1", 00:22:25.985 "method": "bdev_raid_add_base_bdev", 00:22:25.985 "req_id": 1 00:22:25.985 } 00:22:25.985 Got JSON-RPC error response 00:22:25.985 response: 00:22:25.985 { 00:22:25.985 "code": -22, 00:22:25.985 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:22:25.985 } 00:22:25.985 08:35:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:22:25.985 08:35:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:22:25.985 08:35:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:22:25.985 08:35:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:22:25.985 08:35:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:22:27.359 08:35:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:27.359 08:35:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:27.359 08:35:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:27.359 08:35:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:27.359 08:35:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:27.359 08:35:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=1 00:22:27.359 08:35:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:27.359 08:35:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:27.359 08:35:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:27.359 08:35:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:27.359 08:35:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:27.359 08:35:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:27.359 08:35:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:27.359 "name": "raid_bdev1", 00:22:27.359 "uuid": "5ef5eea1-f5ea-400c-98cd-a13b52d07fed", 00:22:27.359 "strip_size_kb": 0, 00:22:27.359 "state": "online", 00:22:27.359 "raid_level": "raid1", 00:22:27.359 "superblock": true, 00:22:27.359 "num_base_bdevs": 2, 00:22:27.359 "num_base_bdevs_discovered": 1, 00:22:27.359 "num_base_bdevs_operational": 1, 00:22:27.359 "base_bdevs_list": [ 00:22:27.359 { 00:22:27.359 "name": null, 00:22:27.359 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:27.359 "is_configured": false, 00:22:27.359 "data_offset": 2048, 00:22:27.359 "data_size": 63488 00:22:27.359 }, 00:22:27.359 { 00:22:27.359 "name": "BaseBdev2", 00:22:27.359 "uuid": "45750116-56f6-5593-9a17-0e98f4b2faa6", 00:22:27.359 "is_configured": true, 00:22:27.359 "data_offset": 2048, 00:22:27.359 "data_size": 63488 00:22:27.359 } 00:22:27.359 ] 00:22:27.359 }' 00:22:27.359 08:35:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:27.359 08:35:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:27.617 08:35:40 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:27.617 08:35:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:27.617 08:35:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:27.617 08:35:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:27.617 08:35:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:27.617 08:35:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:27.617 08:35:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:27.875 08:35:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:27.875 "name": "raid_bdev1", 00:22:27.875 "uuid": "5ef5eea1-f5ea-400c-98cd-a13b52d07fed", 00:22:27.875 "strip_size_kb": 0, 00:22:27.875 "state": "online", 00:22:27.875 "raid_level": "raid1", 00:22:27.875 "superblock": true, 00:22:27.875 "num_base_bdevs": 2, 00:22:27.875 "num_base_bdevs_discovered": 1, 00:22:27.875 "num_base_bdevs_operational": 1, 00:22:27.875 "base_bdevs_list": [ 00:22:27.875 { 00:22:27.875 "name": null, 00:22:27.875 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:27.875 "is_configured": false, 00:22:27.875 "data_offset": 2048, 00:22:27.875 "data_size": 63488 00:22:27.875 }, 00:22:27.875 { 00:22:27.875 "name": "BaseBdev2", 00:22:27.875 "uuid": "45750116-56f6-5593-9a17-0e98f4b2faa6", 00:22:27.875 "is_configured": true, 00:22:27.875 "data_offset": 2048, 00:22:27.875 "data_size": 63488 00:22:27.875 } 00:22:27.875 ] 00:22:27.875 }' 00:22:27.875 08:35:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:27.875 08:35:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none 
== \n\o\n\e ]] 00:22:27.875 08:35:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:27.875 08:35:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:27.875 08:35:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 1526451 00:22:27.875 08:35:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1526451 ']' 00:22:27.875 08:35:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 1526451 00:22:27.875 08:35:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:22:27.875 08:35:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:27.875 08:35:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1526451 00:22:28.134 08:35:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:28.134 08:35:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:28.134 08:35:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1526451' 00:22:28.134 killing process with pid 1526451 00:22:28.134 08:35:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 1526451 00:22:28.134 Received shutdown signal, test time was about 60.000000 seconds 00:22:28.134 00:22:28.134 Latency(us) 00:22:28.134 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:22:28.134 =================================================================================================================== 00:22:28.134 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:22:28.134 [2024-07-23 08:35:40.422889] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:28.134 [2024-07-23 08:35:40.423002] bdev_raid.c: 486:_raid_bdev_destruct: 
*DEBUG*: raid_bdev_destruct 00:22:28.134 08:35:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 1526451 00:22:28.134 [2024-07-23 08:35:40.423054] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:28.134 [2024-07-23 08:35:40.423065] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000038a80 name raid_bdev1, state offline 00:22:28.392 [2024-07-23 08:35:40.664031] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:29.767 08:35:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:22:29.767 00:22:29.767 real 0m30.390s 00:22:29.767 user 0m43.501s 00:22:29.767 sys 0m4.112s 00:22:29.767 08:35:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:29.767 08:35:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:29.767 ************************************ 00:22:29.767 END TEST raid_rebuild_test_sb 00:22:29.767 ************************************ 00:22:29.767 08:35:41 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:29.767 08:35:41 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 2 false true true 00:22:29.767 08:35:41 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:22:29.767 08:35:41 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:29.767 08:35:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:29.767 ************************************ 00:22:29.767 START TEST raid_rebuild_test_io 00:22:29.767 ************************************ 00:22:29.767 08:35:42 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false true true 00:22:29.767 08:35:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:22:29.767 08:35:42 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:22:29.767 08:35:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:22:29.767 08:35:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:22:29.767 08:35:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:22:29.767 08:35:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:22:29.767 08:35:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:29.767 08:35:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:22:29.767 08:35:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:29.767 08:35:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:29.767 08:35:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:22:29.767 08:35:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:29.767 08:35:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:29.767 08:35:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:22:29.767 08:35:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:22:29.767 08:35:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:22:29.767 08:35:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:22:29.767 08:35:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:22:29.767 08:35:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:22:29.767 08:35:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:22:29.767 08:35:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' 
raid1 '!=' raid1 ']' 00:22:29.767 08:35:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:22:29.767 08:35:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:22:29.767 08:35:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=1532444 00:22:29.767 08:35:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 1532444 /var/tmp/spdk-raid.sock 00:22:29.767 08:35:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:22:29.767 08:35:42 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 1532444 ']' 00:22:29.767 08:35:42 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:29.767 08:35:42 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:29.767 08:35:42 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:29.767 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:29.767 08:35:42 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:29.767 08:35:42 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:22:29.767 [2024-07-23 08:35:42.102685] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:22:29.767 [2024-07-23 08:35:42.102773] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1532444 ] 00:22:29.767 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:29.767 Zero copy mechanism will not be used. 00:22:29.767 [2024-07-23 08:35:42.225773] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:30.025 [2024-07-23 08:35:42.438788] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:30.284 [2024-07-23 08:35:42.723500] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:30.284 [2024-07-23 08:35:42.723531] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:30.542 08:35:42 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:30.542 08:35:42 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:22:30.542 08:35:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:30.542 08:35:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:30.542 BaseBdev1_malloc 00:22:30.800 08:35:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:30.800 [2024-07-23 08:35:43.223400] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:30.800 [2024-07-23 08:35:43.223456] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:30.800 [2024-07-23 08:35:43.223478] vbdev_passthru.c: 681:vbdev_passthru_register: 
*NOTICE*: io_device created at: 0x0x616000034880 00:22:30.800 [2024-07-23 08:35:43.223491] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:30.800 [2024-07-23 08:35:43.225347] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:30.800 [2024-07-23 08:35:43.225379] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:30.800 BaseBdev1 00:22:30.800 08:35:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:30.800 08:35:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:31.058 BaseBdev2_malloc 00:22:31.058 08:35:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:22:31.316 [2024-07-23 08:35:43.611196] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:22:31.316 [2024-07-23 08:35:43.611251] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:31.316 [2024-07-23 08:35:43.611269] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035480 00:22:31.316 [2024-07-23 08:35:43.611282] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:31.316 [2024-07-23 08:35:43.613228] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:31.316 [2024-07-23 08:35:43.613259] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:31.316 BaseBdev2 00:22:31.316 08:35:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 
00:22:31.316 spare_malloc 00:22:31.316 08:35:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:22:31.576 spare_delay 00:22:31.576 08:35:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:31.852 [2024-07-23 08:35:44.162697] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:31.852 [2024-07-23 08:35:44.162750] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:31.852 [2024-07-23 08:35:44.162770] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036680 00:22:31.852 [2024-07-23 08:35:44.162780] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:31.852 [2024-07-23 08:35:44.164746] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:31.852 [2024-07-23 08:35:44.164776] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:31.852 spare 00:22:31.852 08:35:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:22:31.852 [2024-07-23 08:35:44.335182] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:31.852 [2024-07-23 08:35:44.336790] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:31.852 [2024-07-23 08:35:44.336885] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000036c80 00:22:31.852 [2024-07-23 08:35:44.336906] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, 
blocklen 512 00:22:31.852 [2024-07-23 08:35:44.337171] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bec0 00:22:31.852 [2024-07-23 08:35:44.337373] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000036c80 00:22:31.852 [2024-07-23 08:35:44.337384] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000036c80 00:22:31.852 [2024-07-23 08:35:44.337572] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:31.853 08:35:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:31.853 08:35:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:31.853 08:35:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:31.853 08:35:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:31.853 08:35:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:31.853 08:35:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:31.853 08:35:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:31.853 08:35:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:31.853 08:35:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:31.853 08:35:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:31.853 08:35:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:31.853 08:35:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:32.125 08:35:44 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:32.126 "name": "raid_bdev1", 00:22:32.126 "uuid": "02cf7048-c1b7-4762-bd0f-cba91bfb336b", 00:22:32.126 "strip_size_kb": 0, 00:22:32.126 "state": "online", 00:22:32.126 "raid_level": "raid1", 00:22:32.126 "superblock": false, 00:22:32.126 "num_base_bdevs": 2, 00:22:32.126 "num_base_bdevs_discovered": 2, 00:22:32.126 "num_base_bdevs_operational": 2, 00:22:32.126 "base_bdevs_list": [ 00:22:32.126 { 00:22:32.126 "name": "BaseBdev1", 00:22:32.126 "uuid": "796e8ed1-bd7c-5072-9487-f765a7530615", 00:22:32.126 "is_configured": true, 00:22:32.126 "data_offset": 0, 00:22:32.126 "data_size": 65536 00:22:32.126 }, 00:22:32.126 { 00:22:32.126 "name": "BaseBdev2", 00:22:32.126 "uuid": "acacc850-11fa-5805-902c-cbe6a65a84e6", 00:22:32.126 "is_configured": true, 00:22:32.126 "data_offset": 0, 00:22:32.126 "data_size": 65536 00:22:32.126 } 00:22:32.126 ] 00:22:32.126 }' 00:22:32.126 08:35:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:32.126 08:35:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:22:32.692 08:35:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:32.692 08:35:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:22:32.692 [2024-07-23 08:35:45.153563] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:32.692 08:35:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:22:32.692 08:35:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:32.692 08:35:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 
00:22:32.949 08:35:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:22:32.949 08:35:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:22:32.949 08:35:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:22:32.949 08:35:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:22:32.949 [2024-07-23 08:35:45.423489] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c130 00:22:32.949 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:32.949 Zero copy mechanism will not be used. 00:22:32.949 Running I/O for 60 seconds... 00:22:33.207 [2024-07-23 08:35:45.499070] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:33.207 [2024-07-23 08:35:45.499259] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x60d00000c130 00:22:33.207 08:35:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:33.207 08:35:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:33.207 08:35:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:33.207 08:35:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:33.207 08:35:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:33.207 08:35:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:33.207 08:35:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:33.207 08:35:45 bdev_raid.raid_rebuild_test_io 
-- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:33.207 08:35:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:33.207 08:35:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:33.207 08:35:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:33.207 08:35:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:33.207 08:35:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:33.207 "name": "raid_bdev1", 00:22:33.207 "uuid": "02cf7048-c1b7-4762-bd0f-cba91bfb336b", 00:22:33.207 "strip_size_kb": 0, 00:22:33.207 "state": "online", 00:22:33.207 "raid_level": "raid1", 00:22:33.207 "superblock": false, 00:22:33.207 "num_base_bdevs": 2, 00:22:33.207 "num_base_bdevs_discovered": 1, 00:22:33.207 "num_base_bdevs_operational": 1, 00:22:33.207 "base_bdevs_list": [ 00:22:33.207 { 00:22:33.207 "name": null, 00:22:33.207 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:33.207 "is_configured": false, 00:22:33.207 "data_offset": 0, 00:22:33.207 "data_size": 65536 00:22:33.207 }, 00:22:33.207 { 00:22:33.207 "name": "BaseBdev2", 00:22:33.207 "uuid": "acacc850-11fa-5805-902c-cbe6a65a84e6", 00:22:33.207 "is_configured": true, 00:22:33.207 "data_offset": 0, 00:22:33.207 "data_size": 65536 00:22:33.207 } 00:22:33.207 ] 00:22:33.207 }' 00:22:33.207 08:35:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:33.207 08:35:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:22:33.773 08:35:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:34.030 [2024-07-23 08:35:46.304743] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:34.030 [2024-07-23 08:35:46.341539] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c200 00:22:34.030 [2024-07-23 08:35:46.343178] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:34.030 08:35:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:22:34.030 [2024-07-23 08:35:46.467840] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:34.030 [2024-07-23 08:35:46.468118] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:34.287 [2024-07-23 08:35:46.678526] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:34.287 [2024-07-23 08:35:46.678788] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:34.544 [2024-07-23 08:35:47.000571] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:22:34.801 [2024-07-23 08:35:47.131988] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:22:35.058 08:35:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:35.058 08:35:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:35.058 08:35:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:35.058 08:35:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:35.058 08:35:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:35.058 08:35:47 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:35.058 08:35:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:35.058 [2024-07-23 08:35:47.375785] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:22:35.058 08:35:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:35.058 "name": "raid_bdev1", 00:22:35.058 "uuid": "02cf7048-c1b7-4762-bd0f-cba91bfb336b", 00:22:35.058 "strip_size_kb": 0, 00:22:35.058 "state": "online", 00:22:35.058 "raid_level": "raid1", 00:22:35.058 "superblock": false, 00:22:35.058 "num_base_bdevs": 2, 00:22:35.058 "num_base_bdevs_discovered": 2, 00:22:35.058 "num_base_bdevs_operational": 2, 00:22:35.058 "process": { 00:22:35.058 "type": "rebuild", 00:22:35.058 "target": "spare", 00:22:35.058 "progress": { 00:22:35.058 "blocks": 14336, 00:22:35.058 "percent": 21 00:22:35.058 } 00:22:35.058 }, 00:22:35.058 "base_bdevs_list": [ 00:22:35.058 { 00:22:35.058 "name": "spare", 00:22:35.058 "uuid": "cbf5c7c7-349b-51ba-ad49-ee7e73c03f79", 00:22:35.058 "is_configured": true, 00:22:35.058 "data_offset": 0, 00:22:35.058 "data_size": 65536 00:22:35.058 }, 00:22:35.058 { 00:22:35.058 "name": "BaseBdev2", 00:22:35.058 "uuid": "acacc850-11fa-5805-902c-cbe6a65a84e6", 00:22:35.058 "is_configured": true, 00:22:35.058 "data_offset": 0, 00:22:35.058 "data_size": 65536 00:22:35.058 } 00:22:35.058 ] 00:22:35.058 }' 00:22:35.058 08:35:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:35.058 08:35:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:35.058 08:35:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:35.316 
[2024-07-23 08:35:47.591299] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:22:35.316 [2024-07-23 08:35:47.591570] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:22:35.316 08:35:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:35.316 08:35:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:35.316 [2024-07-23 08:35:47.765650] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:35.574 [2024-07-23 08:35:47.908235] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:35.574 [2024-07-23 08:35:47.916552] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:35.574 [2024-07-23 08:35:47.916585] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:35.574 [2024-07-23 08:35:47.916597] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:35.574 [2024-07-23 08:35:47.968242] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x60d00000c130 00:22:35.574 08:35:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:35.574 08:35:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:35.574 08:35:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:35.574 08:35:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:35.574 08:35:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:35.574 08:35:47 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:35.574 08:35:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:35.574 08:35:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:35.574 08:35:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:35.574 08:35:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:35.574 08:35:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:35.574 08:35:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:35.832 08:35:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:35.832 "name": "raid_bdev1", 00:22:35.832 "uuid": "02cf7048-c1b7-4762-bd0f-cba91bfb336b", 00:22:35.832 "strip_size_kb": 0, 00:22:35.832 "state": "online", 00:22:35.832 "raid_level": "raid1", 00:22:35.832 "superblock": false, 00:22:35.832 "num_base_bdevs": 2, 00:22:35.832 "num_base_bdevs_discovered": 1, 00:22:35.832 "num_base_bdevs_operational": 1, 00:22:35.832 "base_bdevs_list": [ 00:22:35.832 { 00:22:35.832 "name": null, 00:22:35.832 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:35.832 "is_configured": false, 00:22:35.832 "data_offset": 0, 00:22:35.832 "data_size": 65536 00:22:35.832 }, 00:22:35.832 { 00:22:35.832 "name": "BaseBdev2", 00:22:35.832 "uuid": "acacc850-11fa-5805-902c-cbe6a65a84e6", 00:22:35.832 "is_configured": true, 00:22:35.832 "data_offset": 0, 00:22:35.832 "data_size": 65536 00:22:35.832 } 00:22:35.832 ] 00:22:35.832 }' 00:22:35.832 08:35:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:35.832 08:35:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 
00:22:36.398 08:35:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:36.398 08:35:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:36.398 08:35:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:36.398 08:35:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:36.398 08:35:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:36.398 08:35:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:36.398 08:35:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:36.398 08:35:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:36.398 "name": "raid_bdev1", 00:22:36.398 "uuid": "02cf7048-c1b7-4762-bd0f-cba91bfb336b", 00:22:36.398 "strip_size_kb": 0, 00:22:36.398 "state": "online", 00:22:36.398 "raid_level": "raid1", 00:22:36.398 "superblock": false, 00:22:36.398 "num_base_bdevs": 2, 00:22:36.398 "num_base_bdevs_discovered": 1, 00:22:36.398 "num_base_bdevs_operational": 1, 00:22:36.398 "base_bdevs_list": [ 00:22:36.398 { 00:22:36.398 "name": null, 00:22:36.398 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:36.398 "is_configured": false, 00:22:36.398 "data_offset": 0, 00:22:36.398 "data_size": 65536 00:22:36.398 }, 00:22:36.398 { 00:22:36.398 "name": "BaseBdev2", 00:22:36.398 "uuid": "acacc850-11fa-5805-902c-cbe6a65a84e6", 00:22:36.398 "is_configured": true, 00:22:36.398 "data_offset": 0, 00:22:36.398 "data_size": 65536 00:22:36.398 } 00:22:36.398 ] 00:22:36.398 }' 00:22:36.398 08:35:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:36.398 08:35:48 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:36.398 08:35:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:36.656 08:35:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:36.656 08:35:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:36.656 [2024-07-23 08:35:49.090645] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:36.656 [2024-07-23 08:35:49.141599] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c2d0 00:22:36.656 [2024-07-23 08:35:49.143242] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:36.656 08:35:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:22:36.914 [2024-07-23 08:35:49.262451] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:36.914 [2024-07-23 08:35:49.262864] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:37.177 [2024-07-23 08:35:49.487599] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:37.177 [2024-07-23 08:35:49.487895] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:37.436 [2024-07-23 08:35:49.723137] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:22:37.436 [2024-07-23 08:35:49.723538] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:22:37.436 [2024-07-23 08:35:49.931551] 
bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:22:37.436 [2024-07-23 08:35:49.931731] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:22:37.694 08:35:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:37.694 08:35:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:37.694 08:35:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:37.694 08:35:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:37.694 08:35:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:37.694 08:35:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:37.694 08:35:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:37.952 [2024-07-23 08:35:50.261094] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:22:37.952 08:35:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:37.952 "name": "raid_bdev1", 00:22:37.952 "uuid": "02cf7048-c1b7-4762-bd0f-cba91bfb336b", 00:22:37.952 "strip_size_kb": 0, 00:22:37.952 "state": "online", 00:22:37.952 "raid_level": "raid1", 00:22:37.952 "superblock": false, 00:22:37.952 "num_base_bdevs": 2, 00:22:37.952 "num_base_bdevs_discovered": 2, 00:22:37.952 "num_base_bdevs_operational": 2, 00:22:37.952 "process": { 00:22:37.952 "type": "rebuild", 00:22:37.952 "target": "spare", 00:22:37.952 "progress": { 00:22:37.952 "blocks": 14336, 00:22:37.952 "percent": 21 00:22:37.952 } 00:22:37.952 }, 
00:22:37.952 "base_bdevs_list": [
00:22:37.952 {
00:22:37.952 "name": "spare",
00:22:37.952 "uuid": "cbf5c7c7-349b-51ba-ad49-ee7e73c03f79",
00:22:37.952 "is_configured": true,
00:22:37.952 "data_offset": 0,
00:22:37.952 "data_size": 65536
00:22:37.952 },
00:22:37.952 {
00:22:37.952 "name": "BaseBdev2",
00:22:37.952 "uuid": "acacc850-11fa-5805-902c-cbe6a65a84e6",
00:22:37.952 "is_configured": true,
00:22:37.952 "data_offset": 0,
00:22:37.952 "data_size": 65536
00:22:37.952 }
00:22:37.952 ]
00:22:37.952 }'
00:22:37.952 08:35:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:22:37.952 08:35:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:22:37.952 08:35:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:22:37.953 08:35:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:22:37.953 08:35:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']'
00:22:37.953 08:35:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2
00:22:37.953 08:35:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']'
00:22:37.953 08:35:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']'
00:22:37.953 08:35:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=703
00:22:37.953 08:35:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout ))
00:22:37.953 08:35:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:22:37.953 08:35:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:22:37.953 08:35:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:22:37.953 08:35:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare
00:22:37.953 08:35:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:22:37.953 08:35:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:22:37.953 08:35:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:22:38.211 [2024-07-23 08:35:50.476347] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432
00:22:38.211 [2024-07-23 08:35:50.476624] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432
00:22:38.211 08:35:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:22:38.211 "name": "raid_bdev1",
00:22:38.211 "uuid": "02cf7048-c1b7-4762-bd0f-cba91bfb336b",
00:22:38.211 "strip_size_kb": 0,
00:22:38.211 "state": "online",
00:22:38.211 "raid_level": "raid1",
00:22:38.211 "superblock": false,
00:22:38.211 "num_base_bdevs": 2,
00:22:38.211 "num_base_bdevs_discovered": 2,
00:22:38.211 "num_base_bdevs_operational": 2,
00:22:38.211 "process": {
00:22:38.211 "type": "rebuild",
00:22:38.211 "target": "spare",
00:22:38.211 "progress": {
00:22:38.211 "blocks": 16384,
00:22:38.211 "percent": 25
00:22:38.211 }
00:22:38.211 },
00:22:38.211 "base_bdevs_list": [
00:22:38.211 {
00:22:38.211 "name": "spare",
00:22:38.211 "uuid": "cbf5c7c7-349b-51ba-ad49-ee7e73c03f79",
00:22:38.211 "is_configured": true,
00:22:38.211 "data_offset": 0,
00:22:38.211 "data_size": 65536
00:22:38.211 },
00:22:38.211 {
00:22:38.211 "name": "BaseBdev2",
00:22:38.211 "uuid": "acacc850-11fa-5805-902c-cbe6a65a84e6",
00:22:38.211 "is_configured": true,
00:22:38.211 "data_offset": 0,
00:22:38.211 "data_size": 65536
00:22:38.211 }
00:22:38.211 ]
00:22:38.211 }'
00:22:38.211 08:35:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:22:38.211 08:35:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:22:38.211 08:35:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:22:38.211 08:35:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:22:38.211 08:35:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1
00:22:38.470 [2024-07-23 08:35:50.817031] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576
00:22:38.470 [2024-07-23 08:35:50.947912] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576
00:22:39.036 [2024-07-23 08:35:51.379649] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720
00:22:39.294 08:35:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout ))
00:22:39.294 08:35:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:22:39.294 08:35:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:22:39.294 08:35:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:22:39.294 08:35:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare
00:22:39.294 08:35:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:22:39.294 08:35:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:22:39.294 08:35:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:22:39.294 [2024-07-23 08:35:51.797004] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864
00:22:39.553 08:35:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:22:39.553 "name": "raid_bdev1",
00:22:39.553 "uuid": "02cf7048-c1b7-4762-bd0f-cba91bfb336b",
00:22:39.553 "strip_size_kb": 0,
00:22:39.553 "state": "online",
00:22:39.553 "raid_level": "raid1",
00:22:39.553 "superblock": false,
00:22:39.553 "num_base_bdevs": 2,
00:22:39.553 "num_base_bdevs_discovered": 2,
00:22:39.553 "num_base_bdevs_operational": 2,
00:22:39.553 "process": {
00:22:39.553 "type": "rebuild",
00:22:39.553 "target": "spare",
00:22:39.553 "progress": {
00:22:39.553 "blocks": 34816,
00:22:39.553 "percent": 53
00:22:39.553 }
00:22:39.553 },
00:22:39.553 "base_bdevs_list": [
00:22:39.553 {
00:22:39.553 "name": "spare",
00:22:39.553 "uuid": "cbf5c7c7-349b-51ba-ad49-ee7e73c03f79",
00:22:39.553 "is_configured": true,
00:22:39.553 "data_offset": 0,
00:22:39.553 "data_size": 65536
00:22:39.553 },
00:22:39.553 {
00:22:39.553 "name": "BaseBdev2",
00:22:39.553 "uuid": "acacc850-11fa-5805-902c-cbe6a65a84e6",
00:22:39.553 "is_configured": true,
00:22:39.553 "data_offset": 0,
00:22:39.553 "data_size": 65536
00:22:39.553 }
00:22:39.553 ]
00:22:39.553 }'
00:22:39.553 08:35:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:22:39.553 08:35:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:22:39.553 08:35:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:22:39.553 08:35:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:22:39.553 08:35:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1
00:22:39.553 [2024-07-23 08:35:52.020127] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008
00:22:39.812 [2024-07-23 08:35:52.128779] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008
00:22:40.071 [2024-07-23 08:35:52.341024] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152
00:22:40.329 [2024-07-23 08:35:52.768784] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296
00:22:40.588 08:35:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout ))
00:22:40.588 08:35:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:22:40.588 08:35:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:22:40.588 08:35:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:22:40.588 08:35:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare
00:22:40.588 08:35:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:22:40.588 08:35:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:22:40.588 08:35:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:22:40.588 [2024-07-23 08:35:52.983380] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296
00:22:40.588 08:35:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:22:40.588 "name": "raid_bdev1",
00:22:40.588 "uuid": "02cf7048-c1b7-4762-bd0f-cba91bfb336b",
00:22:40.588 "strip_size_kb": 0,
00:22:40.588 "state": "online",
00:22:40.588 "raid_level": "raid1",
00:22:40.588 "superblock": false,
00:22:40.588 "num_base_bdevs": 2,
00:22:40.588 "num_base_bdevs_discovered": 2,
00:22:40.588 "num_base_bdevs_operational": 2,
00:22:40.588 "process": {
00:22:40.588 "type": "rebuild",
00:22:40.588 "target": "spare",
00:22:40.588 "progress": {
00:22:40.588 "blocks": 53248,
00:22:40.588 "percent": 81
00:22:40.588 }
00:22:40.588 },
00:22:40.588 "base_bdevs_list": [
00:22:40.588 {
00:22:40.588 "name": "spare",
00:22:40.588 "uuid": "cbf5c7c7-349b-51ba-ad49-ee7e73c03f79",
00:22:40.588 "is_configured": true,
00:22:40.588 "data_offset": 0,
00:22:40.588 "data_size": 65536
00:22:40.588 },
00:22:40.588 {
00:22:40.588 "name": "BaseBdev2",
00:22:40.588 "uuid": "acacc850-11fa-5805-902c-cbe6a65a84e6",
00:22:40.588 "is_configured": true,
00:22:40.588 "data_offset": 0,
00:22:40.588 "data_size": 65536
00:22:40.588 }
00:22:40.588 ]
00:22:40.588 }'
00:22:40.588 08:35:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:22:40.846 08:35:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:22:40.846 08:35:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:22:40.846 08:35:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:22:40.846 08:35:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1
00:22:40.846 [2024-07-23 08:35:53.201146] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440
00:22:41.410 [2024-07-23 08:35:53.638851] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1
00:22:41.410 [2024-07-23 08:35:53.739153] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1
00:22:41.410 [2024-07-23 08:35:53.741019] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:22:41.669 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout ))
00:22:41.669 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:22:41.669 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:22:41.669 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:22:41.669 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare
00:22:41.928 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:22:41.928 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:22:41.928 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:22:41.928 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:22:41.928 "name": "raid_bdev1",
00:22:41.928 "uuid": "02cf7048-c1b7-4762-bd0f-cba91bfb336b",
00:22:41.928 "strip_size_kb": 0,
00:22:41.928 "state": "online",
00:22:41.928 "raid_level": "raid1",
00:22:41.928 "superblock": false,
00:22:41.928 "num_base_bdevs": 2,
00:22:41.928 "num_base_bdevs_discovered": 2,
00:22:41.928 "num_base_bdevs_operational": 2,
00:22:41.928 "base_bdevs_list": [
00:22:41.928 {
00:22:41.928 "name": "spare",
00:22:41.928 "uuid": "cbf5c7c7-349b-51ba-ad49-ee7e73c03f79",
00:22:41.928 "is_configured": true,
00:22:41.928 "data_offset": 0,
00:22:41.928 "data_size": 65536
00:22:41.928 },
00:22:41.928 {
00:22:41.928 "name": "BaseBdev2",
00:22:41.928 "uuid": "acacc850-11fa-5805-902c-cbe6a65a84e6",
00:22:41.928 "is_configured": true,
00:22:41.928 "data_offset": 0,
00:22:41.928 "data_size": 65536
00:22:41.928 }
00:22:41.928 ]
00:22:41.928 }'
00:22:41.928 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:22:41.928 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]]
00:22:41.928 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:22:42.187 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]]
00:22:42.187 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break
00:22:42.187 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none
00:22:42.187 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:22:42.187 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none
00:22:42.187 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none
00:22:42.187 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:22:42.187 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:22:42.187 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:22:42.187 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:22:42.187 "name": "raid_bdev1",
00:22:42.187 "uuid": "02cf7048-c1b7-4762-bd0f-cba91bfb336b",
00:22:42.187 "strip_size_kb": 0,
00:22:42.187 "state": "online",
00:22:42.187 "raid_level": "raid1",
00:22:42.187 "superblock": false,
00:22:42.187 "num_base_bdevs": 2,
00:22:42.187 "num_base_bdevs_discovered": 2,
00:22:42.187 "num_base_bdevs_operational": 2,
00:22:42.187 "base_bdevs_list": [
00:22:42.187 {
00:22:42.187 "name": "spare",
00:22:42.187 "uuid": "cbf5c7c7-349b-51ba-ad49-ee7e73c03f79",
00:22:42.187 "is_configured": true,
00:22:42.187 "data_offset": 0,
00:22:42.187 "data_size": 65536
00:22:42.187 },
00:22:42.187 {
00:22:42.187 "name": "BaseBdev2",
00:22:42.187 "uuid": "acacc850-11fa-5805-902c-cbe6a65a84e6",
00:22:42.187 "is_configured": true,
00:22:42.187 "data_offset": 0,
00:22:42.187 "data_size": 65536
00:22:42.187 }
00:22:42.187 ]
00:22:42.187 }'
00:22:42.187 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:22:42.187 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]]
00:22:42.187 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:22:42.446 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]]
00:22:42.446 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2
00:22:42.446 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:22:42.446 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:22:42.446 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:22:42.446 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:22:42.446 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:22:42.446 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:22:42.446 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:22:42.446 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:22:42.446 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp
00:22:42.446 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:22:42.446 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:22:42.446 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:22:42.446 "name": "raid_bdev1",
00:22:42.446 "uuid": "02cf7048-c1b7-4762-bd0f-cba91bfb336b",
00:22:42.446 "strip_size_kb": 0,
00:22:42.446 "state": "online",
00:22:42.446 "raid_level": "raid1",
00:22:42.446 "superblock": false,
00:22:42.446 "num_base_bdevs": 2,
00:22:42.446 "num_base_bdevs_discovered": 2,
00:22:42.446 "num_base_bdevs_operational": 2,
00:22:42.446 "base_bdevs_list": [
00:22:42.446 {
00:22:42.446 "name": "spare",
00:22:42.446 "uuid": "cbf5c7c7-349b-51ba-ad49-ee7e73c03f79",
00:22:42.446 "is_configured": true,
00:22:42.446 "data_offset": 0,
00:22:42.446 "data_size": 65536
00:22:42.446 },
00:22:42.446 {
00:22:42.446 "name": "BaseBdev2",
00:22:42.446 "uuid": "acacc850-11fa-5805-902c-cbe6a65a84e6",
00:22:42.446 "is_configured": true,
00:22:42.446 "data_offset": 0,
00:22:42.446 "data_size": 65536
00:22:42.446 }
00:22:42.446 ]
00:22:42.446 }'
00:22:42.446 08:35:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:22:42.446 08:35:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x
00:22:43.013 08:35:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
00:22:43.013 [2024-07-23 08:35:55.492072] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:22:43.013 [2024-07-23 08:35:55.492105] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:22:43.271
00:22:43.271 Latency(us)
00:22:43.271 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:22:43.271 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728)
00:22:43.271 raid_bdev1 : 10.12 108.02 324.07 0.00 0.00 12530.86 282.82 108352.85
00:22:43.271 ===================================================================================================================
00:22:43.271 Total : 108.02 324.07 0.00 0.00 12530.86 282.82 108352.85
00:22:43.271 [2024-07-23 08:35:55.588632] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:22:43.271 [2024-07-23 08:35:55.588671] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:22:43.271 [2024-07-23 08:35:55.588740] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:22:43.271 [2024-07-23 08:35:55.588753] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036c80 name raid_bdev1, state offline
00:22:43.271 0
00:22:43.271 08:35:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:22:43.271 08:35:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length
00:22:43.271 08:35:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]]
00:22:43.271 08:35:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']'
00:22:43.271 08:35:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']'
00:22:43.271 08:35:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0
00:22:43.271 08:35:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:22:43.271 08:35:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare')
00:22:43.271 08:35:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list
00:22:43.271 08:35:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0')
00:22:43.271 08:35:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list
00:22:43.271 08:35:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i
00:22:43.271 08:35:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:22:43.271 08:35:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 ))
00:22:43.271 08:35:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0
00:22:43.530 /dev/nbd0
00:22:43.530 08:35:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0
00:22:43.530 08:35:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0
00:22:43.530 08:35:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0
00:22:43.530 08:35:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i
00:22:43.530 08:35:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:22:43.530 08:35:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:22:43.530 08:35:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions
00:22:43.530 08:35:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break
00:22:43.530 08:35:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:22:43.530 08:35:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:22:43.530 08:35:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:22:43.530 1+0 records in
00:22:43.530 1+0 records out
00:22:43.530 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000215453 s, 19.0 MB/s
00:22:43.530 08:35:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:22:43.530 08:35:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096
00:22:43.530 08:35:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:22:43.530 08:35:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:22:43.530 08:35:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0
00:22:43.530 08:35:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:22:43.530 08:35:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 ))
00:22:43.530 08:35:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}"
00:22:43.530 08:35:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']'
00:22:43.530 08:35:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1
00:22:43.530 08:35:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:22:43.530 08:35:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2')
00:22:43.530 08:35:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list
00:22:43.530 08:35:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1')
00:22:43.530 08:35:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list
00:22:43.530 08:35:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i
00:22:43.530 08:35:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:22:43.530 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 ))
00:22:43.530 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1
00:22:43.789 /dev/nbd1
00:22:43.789 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1
00:22:43.789 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1
00:22:43.789 08:35:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1
00:22:43.789 08:35:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i
00:22:43.789 08:35:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:22:43.789 08:35:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:22:43.789 08:35:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions
00:22:43.789 08:35:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break
00:22:43.789 08:35:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:22:43.789 08:35:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:22:43.789 08:35:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:22:43.789 1+0 records in
00:22:43.789 1+0 records out
00:22:43.789 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000240841 s, 17.0 MB/s
00:22:43.789 08:35:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:22:43.789 08:35:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096
00:22:43.789 08:35:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:22:43.790 08:35:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:22:43.790 08:35:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0
00:22:43.790 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:22:43.790 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 ))
00:22:43.790 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1
00:22:44.049 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1
00:22:44.049 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:22:44.049 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1')
00:22:44.049 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list
00:22:44.049 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i
00:22:44.049 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:22:44.049 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1
00:22:44.308 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1
00:22:44.308 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1
00:22:44.308 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1
00:22:44.308 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:22:44.308 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:22:44.308 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions
00:22:44.308 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break
00:22:44.308 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0
00:22:44.308 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0
00:22:44.308 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:22:44.308 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0')
00:22:44.308 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list
00:22:44.308 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i
00:22:44.308 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:22:44.308 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0
00:22:44.308 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:22:44.308 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:22:44.308 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:22:44.308 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:22:44.308 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:22:44.308 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:22:44.308 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break
00:22:44.308 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0
00:22:44.308 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']'
00:22:44.308 08:35:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 1532444
00:22:44.308 08:35:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 1532444 ']'
00:22:44.308 08:35:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 1532444
00:22:44.308 08:35:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # uname
00:22:44.308 08:35:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:22:44.308 08:35:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1532444
00:22:44.308 08:35:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:22:44.308 08:35:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:22:44.308 08:35:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1532444'
00:22:44.308 killing process with pid 1532444
00:22:44.308 08:35:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 1532444
00:22:44.308 Received shutdown signal, test time was about 11.359917 seconds
00:22:44.308
00:22:44.308 Latency(us)
00:22:44.308 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:22:44.308 ===================================================================================================================
00:22:44.308 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00
00:22:44.308 [2024-07-23 08:35:56.812840] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start
00:22:44.308 08:35:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 1532444
00:22:44.598 [2024-07-23 08:35:56.979537] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit
00:22:45.974 08:35:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0
00:22:45.974
00:22:45.974 real 0m16.303s
00:22:45.974 user 0m23.276s
00:22:45.974 sys 0m1.973s
00:22:45.974 08:35:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:22:45.974 08:35:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x
00:22:45.974 ************************************
00:22:45.974 END TEST raid_rebuild_test_io
00:22:45.974 ************************************
00:22:45.974 08:35:58 bdev_raid -- common/autotest_common.sh@1142 -- # return 0
00:22:45.974 08:35:58 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 2 true true true
00:22:45.974 08:35:58 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']'
00:22:45.974 08:35:58 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable
00:22:45.974 08:35:58 bdev_raid -- common/autotest_common.sh@10 -- # set +x
00:22:45.974 ************************************
00:22:45.974 START TEST raid_rebuild_test_sb_io
00:22:45.974 ************************************
00:22:45.974 08:35:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true true true
00:22:45.974 08:35:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1
00:22:45.974 08:35:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2
00:22:45.974 08:35:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true
00:22:45.974 08:35:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true
00:22:45.974 08:35:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true
00:22:45.974 08:35:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 ))
00:22:45.974 08:35:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs ))
00:22:45.974 08:35:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1
00:22:45.975 08:35:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ ))
00:22:45.975 08:35:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs ))
00:22:45.975 08:35:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2
00:22:45.975 08:35:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ ))
00:22:45.975 08:35:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs ))
00:22:45.975 08:35:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2')
00:22:45.975 08:35:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs
00:22:45.975 08:35:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1
00:22:45.975 08:35:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size
00:22:45.975 08:35:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg
00:22:45.975 08:35:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size
00:22:45.975 08:35:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset
00:22:45.975 08:35:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']'
00:22:45.975 08:35:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0
00:22:45.975 08:35:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']'
00:22:45.975 08:35:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s'
00:22:45.975 08:35:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=1535690
00:22:45.975 08:35:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 1535690 /var/tmp/spdk-raid.sock
00:22:45.975 08:35:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 1535690 ']'
00:22:45.975 08:35:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock
00:22:45.975 08:35:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local max_retries=100
00:22:45.975 08:35:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
00:22:45.975 08:35:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable
00:22:45.975 08:35:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x
00:22:45.975 08:35:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid
[2024-07-23 08:35:58.461847] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization...
[2024-07-23 08:35:58.461940] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1535690 ]
I/O size of 3145728 is greater than zero copy threshold (65536).
Zero copy mechanism will not be used.
00:22:46.233 [2024-07-23 08:35:58.586272] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:22:46.491 [2024-07-23 08:35:58.809685] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:22:46.749 [2024-07-23 08:35:59.110237] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:22:46.749 [2024-07-23 08:35:59.110267] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:22:46.749 08:35:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:22:46.749 08:35:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0
00:22:46.749 08:35:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}"
00:22:46.749 08:35:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc
00:22:47.008 BaseBdev1_malloc
00:22:47.008 08:35:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1
00:22:47.267 [2024-07-23 08:35:59.604409] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc
00:22:47.267 [2024-07-23 08:35:59.604465] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:22:47.267 [2024-07-23 08:35:59.604486] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034880
00:22:47.267 [2024-07-23 08:35:59.604500] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed
00:22:47.267 [2024-07-23 08:35:59.606360] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:22:47.267 [2024-07-23 08:35:59.606392] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1
00:22:47.267 BaseBdev1 00:22:47.267 08:35:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:47.267 08:35:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:47.527 BaseBdev2_malloc 00:22:47.527 08:35:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:22:47.527 [2024-07-23 08:35:59.965094] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:22:47.527 [2024-07-23 08:35:59.965143] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:47.527 [2024-07-23 08:35:59.965161] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035480 00:22:47.527 [2024-07-23 08:35:59.965174] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:47.527 [2024-07-23 08:35:59.966967] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:47.527 [2024-07-23 08:35:59.966998] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:47.527 BaseBdev2 00:22:47.527 08:35:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:22:47.786 spare_malloc 00:22:47.786 08:36:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:22:48.045 spare_delay 00:22:48.045 08:36:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:48.045 [2024-07-23 08:36:00.495905] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:48.045 [2024-07-23 08:36:00.495957] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:48.045 [2024-07-23 08:36:00.495978] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036680 00:22:48.045 [2024-07-23 08:36:00.495989] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:48.045 [2024-07-23 08:36:00.497943] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:48.045 [2024-07-23 08:36:00.497974] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:48.045 spare 00:22:48.045 08:36:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:22:48.304 [2024-07-23 08:36:00.652336] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:48.304 [2024-07-23 08:36:00.653948] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:48.304 [2024-07-23 08:36:00.654136] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000036c80 00:22:48.304 [2024-07-23 08:36:00.654151] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:48.304 [2024-07-23 08:36:00.654401] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bec0 00:22:48.304 [2024-07-23 08:36:00.654619] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000036c80 00:22:48.304 [2024-07-23 08:36:00.654631] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created 
with name raid_bdev1, raid_bdev 0x616000036c80 00:22:48.304 [2024-07-23 08:36:00.654792] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:48.304 08:36:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:48.304 08:36:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:48.304 08:36:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:48.304 08:36:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:48.304 08:36:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:48.304 08:36:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:48.304 08:36:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:48.304 08:36:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:48.304 08:36:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:48.304 08:36:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:48.304 08:36:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:48.304 08:36:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:48.563 08:36:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:48.563 "name": "raid_bdev1", 00:22:48.563 "uuid": "6d655b6b-b4ea-42d5-a1a6-818e2ea545c8", 00:22:48.563 "strip_size_kb": 0, 00:22:48.563 "state": "online", 00:22:48.563 "raid_level": "raid1", 00:22:48.563 "superblock": true, 00:22:48.563 "num_base_bdevs": 2, 
00:22:48.563 "num_base_bdevs_discovered": 2, 00:22:48.563 "num_base_bdevs_operational": 2, 00:22:48.563 "base_bdevs_list": [ 00:22:48.563 { 00:22:48.563 "name": "BaseBdev1", 00:22:48.563 "uuid": "4f5b9cd4-0232-55ac-b2f0-2052bd74baba", 00:22:48.563 "is_configured": true, 00:22:48.563 "data_offset": 2048, 00:22:48.563 "data_size": 63488 00:22:48.563 }, 00:22:48.563 { 00:22:48.563 "name": "BaseBdev2", 00:22:48.563 "uuid": "a511d801-8809-5ee5-8c01-ba88d0ba70c3", 00:22:48.563 "is_configured": true, 00:22:48.563 "data_offset": 2048, 00:22:48.563 "data_size": 63488 00:22:48.563 } 00:22:48.563 ] 00:22:48.563 }' 00:22:48.563 08:36:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:48.563 08:36:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:48.821 08:36:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:48.821 08:36:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:22:49.080 [2024-07-23 08:36:01.450662] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:49.080 08:36:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:22:49.080 08:36:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:49.080 08:36:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:22:49.338 08:36:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:22:49.338 08:36:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:22:49.338 08:36:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:22:49.338 08:36:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:22:49.338 [2024-07-23 08:36:01.742336] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c130 00:22:49.338 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:49.338 Zero copy mechanism will not be used. 00:22:49.338 Running I/O for 60 seconds... 00:22:49.338 [2024-07-23 08:36:01.803646] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:49.338 [2024-07-23 08:36:01.809415] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x60d00000c130 00:22:49.338 08:36:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:49.338 08:36:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:49.338 08:36:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:49.338 08:36:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:49.338 08:36:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:49.338 08:36:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:49.338 08:36:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:49.338 08:36:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:49.338 08:36:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:49.338 08:36:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # 
local tmp 00:22:49.338 08:36:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:49.338 08:36:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:49.598 08:36:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:49.598 "name": "raid_bdev1", 00:22:49.598 "uuid": "6d655b6b-b4ea-42d5-a1a6-818e2ea545c8", 00:22:49.598 "strip_size_kb": 0, 00:22:49.598 "state": "online", 00:22:49.598 "raid_level": "raid1", 00:22:49.598 "superblock": true, 00:22:49.598 "num_base_bdevs": 2, 00:22:49.598 "num_base_bdevs_discovered": 1, 00:22:49.598 "num_base_bdevs_operational": 1, 00:22:49.598 "base_bdevs_list": [ 00:22:49.598 { 00:22:49.598 "name": null, 00:22:49.598 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:49.598 "is_configured": false, 00:22:49.598 "data_offset": 2048, 00:22:49.598 "data_size": 63488 00:22:49.598 }, 00:22:49.598 { 00:22:49.598 "name": "BaseBdev2", 00:22:49.598 "uuid": "a511d801-8809-5ee5-8c01-ba88d0ba70c3", 00:22:49.598 "is_configured": true, 00:22:49.598 "data_offset": 2048, 00:22:49.598 "data_size": 63488 00:22:49.598 } 00:22:49.598 ] 00:22:49.598 }' 00:22:49.598 08:36:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:49.598 08:36:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:50.166 08:36:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:50.166 [2024-07-23 08:36:02.629446] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:50.166 08:36:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:22:50.425 [2024-07-23 08:36:02.693933] 
bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c200 00:22:50.425 [2024-07-23 08:36:02.695622] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:50.425 [2024-07-23 08:36:02.819340] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:50.425 [2024-07-23 08:36:02.819865] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:50.684 [2024-07-23 08:36:03.043559] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:50.684 [2024-07-23 08:36:03.043863] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:51.253 [2024-07-23 08:36:03.528310] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:22:51.253 08:36:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:51.253 08:36:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:51.253 08:36:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:51.253 08:36:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:51.253 08:36:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:51.253 08:36:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:51.253 08:36:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:51.512 [2024-07-23 08:36:03.777080] bdev_raid.c: 
851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:22:51.512 08:36:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:51.512 "name": "raid_bdev1", 00:22:51.512 "uuid": "6d655b6b-b4ea-42d5-a1a6-818e2ea545c8", 00:22:51.512 "strip_size_kb": 0, 00:22:51.512 "state": "online", 00:22:51.512 "raid_level": "raid1", 00:22:51.512 "superblock": true, 00:22:51.512 "num_base_bdevs": 2, 00:22:51.512 "num_base_bdevs_discovered": 2, 00:22:51.512 "num_base_bdevs_operational": 2, 00:22:51.512 "process": { 00:22:51.512 "type": "rebuild", 00:22:51.512 "target": "spare", 00:22:51.512 "progress": { 00:22:51.512 "blocks": 14336, 00:22:51.512 "percent": 22 00:22:51.512 } 00:22:51.512 }, 00:22:51.512 "base_bdevs_list": [ 00:22:51.512 { 00:22:51.512 "name": "spare", 00:22:51.512 "uuid": "ab6210c7-5c73-5e5d-b6d4-af1baa91f7ea", 00:22:51.512 "is_configured": true, 00:22:51.512 "data_offset": 2048, 00:22:51.512 "data_size": 63488 00:22:51.512 }, 00:22:51.512 { 00:22:51.512 "name": "BaseBdev2", 00:22:51.512 "uuid": "a511d801-8809-5ee5-8c01-ba88d0ba70c3", 00:22:51.512 "is_configured": true, 00:22:51.512 "data_offset": 2048, 00:22:51.512 "data_size": 63488 00:22:51.512 } 00:22:51.512 ] 00:22:51.512 }' 00:22:51.512 08:36:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:51.512 08:36:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:51.512 08:36:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:51.512 08:36:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:51.512 08:36:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:51.770 [2024-07-23 
08:36:04.091514] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:51.771 [2024-07-23 08:36:04.116756] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:22:51.771 [2024-07-23 08:36:04.225080] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:51.771 [2024-07-23 08:36:04.226894] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:51.771 [2024-07-23 08:36:04.226926] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:51.771 [2024-07-23 08:36:04.226939] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:51.771 [2024-07-23 08:36:04.276368] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x60d00000c130 00:22:52.030 08:36:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:52.030 08:36:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:52.030 08:36:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:52.030 08:36:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:52.030 08:36:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:52.030 08:36:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:52.030 08:36:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:52.030 08:36:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:52.030 08:36:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:52.030 08:36:04 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:22:52.030 08:36:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:52.030 08:36:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:52.030 08:36:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:52.030 "name": "raid_bdev1", 00:22:52.030 "uuid": "6d655b6b-b4ea-42d5-a1a6-818e2ea545c8", 00:22:52.030 "strip_size_kb": 0, 00:22:52.030 "state": "online", 00:22:52.030 "raid_level": "raid1", 00:22:52.030 "superblock": true, 00:22:52.030 "num_base_bdevs": 2, 00:22:52.030 "num_base_bdevs_discovered": 1, 00:22:52.030 "num_base_bdevs_operational": 1, 00:22:52.030 "base_bdevs_list": [ 00:22:52.030 { 00:22:52.030 "name": null, 00:22:52.030 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:52.030 "is_configured": false, 00:22:52.030 "data_offset": 2048, 00:22:52.030 "data_size": 63488 00:22:52.030 }, 00:22:52.030 { 00:22:52.030 "name": "BaseBdev2", 00:22:52.030 "uuid": "a511d801-8809-5ee5-8c01-ba88d0ba70c3", 00:22:52.030 "is_configured": true, 00:22:52.030 "data_offset": 2048, 00:22:52.030 "data_size": 63488 00:22:52.030 } 00:22:52.030 ] 00:22:52.030 }' 00:22:52.030 08:36:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:52.030 08:36:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:52.596 08:36:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:52.596 08:36:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:52.596 08:36:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:52.596 08:36:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local 
target=none 00:22:52.596 08:36:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:52.597 08:36:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:52.597 08:36:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:52.855 08:36:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:52.855 "name": "raid_bdev1", 00:22:52.855 "uuid": "6d655b6b-b4ea-42d5-a1a6-818e2ea545c8", 00:22:52.855 "strip_size_kb": 0, 00:22:52.855 "state": "online", 00:22:52.855 "raid_level": "raid1", 00:22:52.855 "superblock": true, 00:22:52.855 "num_base_bdevs": 2, 00:22:52.855 "num_base_bdevs_discovered": 1, 00:22:52.855 "num_base_bdevs_operational": 1, 00:22:52.855 "base_bdevs_list": [ 00:22:52.855 { 00:22:52.855 "name": null, 00:22:52.855 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:52.855 "is_configured": false, 00:22:52.855 "data_offset": 2048, 00:22:52.855 "data_size": 63488 00:22:52.855 }, 00:22:52.855 { 00:22:52.855 "name": "BaseBdev2", 00:22:52.855 "uuid": "a511d801-8809-5ee5-8c01-ba88d0ba70c3", 00:22:52.855 "is_configured": true, 00:22:52.855 "data_offset": 2048, 00:22:52.855 "data_size": 63488 00:22:52.855 } 00:22:52.855 ] 00:22:52.855 }' 00:22:52.855 08:36:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:52.855 08:36:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:52.855 08:36:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:52.855 08:36:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:52.855 08:36:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:53.114 [2024-07-23 08:36:05.442380] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:53.114 08:36:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:22:53.114 [2024-07-23 08:36:05.509681] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c2d0 00:22:53.114 [2024-07-23 08:36:05.511310] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:53.373 [2024-07-23 08:36:05.636720] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:53.373 [2024-07-23 08:36:05.637212] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:53.373 [2024-07-23 08:36:05.866887] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:53.373 [2024-07-23 08:36:05.867176] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:53.941 [2024-07-23 08:36:06.223057] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:22:53.941 [2024-07-23 08:36:06.448879] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:22:54.199 08:36:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:54.199 08:36:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:54.199 08:36:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:54.199 08:36:06 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@184 -- # local target=spare 00:22:54.199 08:36:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:54.199 08:36:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:54.199 08:36:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:54.199 08:36:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:54.199 "name": "raid_bdev1", 00:22:54.199 "uuid": "6d655b6b-b4ea-42d5-a1a6-818e2ea545c8", 00:22:54.199 "strip_size_kb": 0, 00:22:54.199 "state": "online", 00:22:54.199 "raid_level": "raid1", 00:22:54.199 "superblock": true, 00:22:54.199 "num_base_bdevs": 2, 00:22:54.199 "num_base_bdevs_discovered": 2, 00:22:54.199 "num_base_bdevs_operational": 2, 00:22:54.199 "process": { 00:22:54.199 "type": "rebuild", 00:22:54.199 "target": "spare", 00:22:54.199 "progress": { 00:22:54.199 "blocks": 10240, 00:22:54.199 "percent": 16 00:22:54.199 } 00:22:54.199 }, 00:22:54.199 "base_bdevs_list": [ 00:22:54.199 { 00:22:54.199 "name": "spare", 00:22:54.199 "uuid": "ab6210c7-5c73-5e5d-b6d4-af1baa91f7ea", 00:22:54.199 "is_configured": true, 00:22:54.199 "data_offset": 2048, 00:22:54.199 "data_size": 63488 00:22:54.199 }, 00:22:54.199 { 00:22:54.199 "name": "BaseBdev2", 00:22:54.199 "uuid": "a511d801-8809-5ee5-8c01-ba88d0ba70c3", 00:22:54.199 "is_configured": true, 00:22:54.199 "data_offset": 2048, 00:22:54.199 "data_size": 63488 00:22:54.199 } 00:22:54.199 ] 00:22:54.199 }' 00:22:54.199 08:36:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:54.199 08:36:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:54.199 08:36:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r 
'.process.target // "none"' 00:22:54.458 08:36:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:54.458 08:36:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:22:54.458 08:36:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:22:54.458 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:22:54.458 08:36:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:22:54.458 08:36:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:22:54.458 08:36:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:22:54.458 08:36:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=719 00:22:54.458 08:36:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:54.458 08:36:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:54.458 08:36:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:54.458 08:36:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:54.458 08:36:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:54.458 08:36:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:54.458 08:36:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:54.458 08:36:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:54.458 [2024-07-23 08:36:06.779576] bdev_raid.c: 
851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:22:54.458 [2024-07-23 08:36:06.779931] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:22:54.458 08:36:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:54.458 "name": "raid_bdev1", 00:22:54.458 "uuid": "6d655b6b-b4ea-42d5-a1a6-818e2ea545c8", 00:22:54.458 "strip_size_kb": 0, 00:22:54.458 "state": "online", 00:22:54.458 "raid_level": "raid1", 00:22:54.458 "superblock": true, 00:22:54.458 "num_base_bdevs": 2, 00:22:54.458 "num_base_bdevs_discovered": 2, 00:22:54.458 "num_base_bdevs_operational": 2, 00:22:54.458 "process": { 00:22:54.458 "type": "rebuild", 00:22:54.458 "target": "spare", 00:22:54.458 "progress": { 00:22:54.458 "blocks": 14336, 00:22:54.458 "percent": 22 00:22:54.458 } 00:22:54.458 }, 00:22:54.458 "base_bdevs_list": [ 00:22:54.458 { 00:22:54.458 "name": "spare", 00:22:54.458 "uuid": "ab6210c7-5c73-5e5d-b6d4-af1baa91f7ea", 00:22:54.458 "is_configured": true, 00:22:54.458 "data_offset": 2048, 00:22:54.458 "data_size": 63488 00:22:54.458 }, 00:22:54.458 { 00:22:54.458 "name": "BaseBdev2", 00:22:54.458 "uuid": "a511d801-8809-5ee5-8c01-ba88d0ba70c3", 00:22:54.458 "is_configured": true, 00:22:54.458 "data_offset": 2048, 00:22:54.458 "data_size": 63488 00:22:54.458 } 00:22:54.458 ] 00:22:54.458 }' 00:22:54.458 08:36:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:54.716 08:36:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:54.716 08:36:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:54.716 [2024-07-23 08:36:06.983852] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:22:54.716 08:36:07 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:54.716 08:36:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:55.652 08:36:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:55.652 08:36:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:55.652 08:36:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:55.652 08:36:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:55.652 08:36:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:55.652 08:36:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:55.652 08:36:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:55.652 08:36:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:55.911 08:36:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:55.911 "name": "raid_bdev1", 00:22:55.911 "uuid": "6d655b6b-b4ea-42d5-a1a6-818e2ea545c8", 00:22:55.911 "strip_size_kb": 0, 00:22:55.911 "state": "online", 00:22:55.911 "raid_level": "raid1", 00:22:55.911 "superblock": true, 00:22:55.911 "num_base_bdevs": 2, 00:22:55.911 "num_base_bdevs_discovered": 2, 00:22:55.911 "num_base_bdevs_operational": 2, 00:22:55.911 "process": { 00:22:55.911 "type": "rebuild", 00:22:55.911 "target": "spare", 00:22:55.911 "progress": { 00:22:55.911 "blocks": 34816, 00:22:55.911 "percent": 54 00:22:55.911 } 00:22:55.911 }, 00:22:55.911 "base_bdevs_list": [ 00:22:55.911 { 00:22:55.911 "name": "spare", 00:22:55.911 "uuid": 
"ab6210c7-5c73-5e5d-b6d4-af1baa91f7ea", 00:22:55.911 "is_configured": true, 00:22:55.911 "data_offset": 2048, 00:22:55.911 "data_size": 63488 00:22:55.911 }, 00:22:55.911 { 00:22:55.911 "name": "BaseBdev2", 00:22:55.911 "uuid": "a511d801-8809-5ee5-8c01-ba88d0ba70c3", 00:22:55.911 "is_configured": true, 00:22:55.911 "data_offset": 2048, 00:22:55.911 "data_size": 63488 00:22:55.911 } 00:22:55.911 ] 00:22:55.911 }' 00:22:55.911 08:36:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:55.911 08:36:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:55.911 08:36:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:55.911 08:36:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:55.911 08:36:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:56.847 [2024-07-23 08:36:09.000263] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:22:56.847 [2024-07-23 08:36:09.225652] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:22:56.847 08:36:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:56.847 08:36:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:56.847 08:36:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:56.847 08:36:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:56.847 08:36:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:56.847 08:36:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 
00:22:56.847 08:36:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:56.847 08:36:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:57.106 08:36:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:57.106 "name": "raid_bdev1", 00:22:57.106 "uuid": "6d655b6b-b4ea-42d5-a1a6-818e2ea545c8", 00:22:57.106 "strip_size_kb": 0, 00:22:57.106 "state": "online", 00:22:57.106 "raid_level": "raid1", 00:22:57.106 "superblock": true, 00:22:57.106 "num_base_bdevs": 2, 00:22:57.106 "num_base_bdevs_discovered": 2, 00:22:57.106 "num_base_bdevs_operational": 2, 00:22:57.106 "process": { 00:22:57.106 "type": "rebuild", 00:22:57.106 "target": "spare", 00:22:57.106 "progress": { 00:22:57.106 "blocks": 53248, 00:22:57.106 "percent": 83 00:22:57.106 } 00:22:57.106 }, 00:22:57.106 "base_bdevs_list": [ 00:22:57.106 { 00:22:57.106 "name": "spare", 00:22:57.106 "uuid": "ab6210c7-5c73-5e5d-b6d4-af1baa91f7ea", 00:22:57.106 "is_configured": true, 00:22:57.106 "data_offset": 2048, 00:22:57.106 "data_size": 63488 00:22:57.106 }, 00:22:57.106 { 00:22:57.106 "name": "BaseBdev2", 00:22:57.106 "uuid": "a511d801-8809-5ee5-8c01-ba88d0ba70c3", 00:22:57.106 "is_configured": true, 00:22:57.106 "data_offset": 2048, 00:22:57.106 "data_size": 63488 00:22:57.106 } 00:22:57.106 ] 00:22:57.106 }' 00:22:57.106 08:36:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:57.106 08:36:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:57.106 08:36:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:57.106 08:36:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:57.106 08:36:09 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:57.364 [2024-07-23 08:36:09.878093] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:22:57.622 [2024-07-23 08:36:09.985501] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:22:57.622 [2024-07-23 08:36:09.988056] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:58.220 08:36:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:58.220 08:36:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:58.220 08:36:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:58.220 08:36:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:58.220 08:36:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:58.220 08:36:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:58.220 08:36:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:58.220 08:36:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:58.220 08:36:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:58.220 "name": "raid_bdev1", 00:22:58.220 "uuid": "6d655b6b-b4ea-42d5-a1a6-818e2ea545c8", 00:22:58.220 "strip_size_kb": 0, 00:22:58.220 "state": "online", 00:22:58.220 "raid_level": "raid1", 00:22:58.220 "superblock": true, 00:22:58.220 "num_base_bdevs": 2, 00:22:58.220 "num_base_bdevs_discovered": 2, 00:22:58.220 "num_base_bdevs_operational": 2, 00:22:58.220 "base_bdevs_list": [ 00:22:58.220 { 00:22:58.220 
"name": "spare", 00:22:58.220 "uuid": "ab6210c7-5c73-5e5d-b6d4-af1baa91f7ea", 00:22:58.220 "is_configured": true, 00:22:58.220 "data_offset": 2048, 00:22:58.220 "data_size": 63488 00:22:58.220 }, 00:22:58.220 { 00:22:58.220 "name": "BaseBdev2", 00:22:58.220 "uuid": "a511d801-8809-5ee5-8c01-ba88d0ba70c3", 00:22:58.220 "is_configured": true, 00:22:58.220 "data_offset": 2048, 00:22:58.221 "data_size": 63488 00:22:58.221 } 00:22:58.221 ] 00:22:58.221 }' 00:22:58.481 08:36:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:58.481 08:36:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:22:58.481 08:36:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:58.481 08:36:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:22:58.481 08:36:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:22:58.481 08:36:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:58.481 08:36:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:58.481 08:36:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:58.481 08:36:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:58.481 08:36:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:58.481 08:36:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:58.481 08:36:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:58.481 08:36:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
raid_bdev_info='{ 00:22:58.481 "name": "raid_bdev1", 00:22:58.481 "uuid": "6d655b6b-b4ea-42d5-a1a6-818e2ea545c8", 00:22:58.481 "strip_size_kb": 0, 00:22:58.481 "state": "online", 00:22:58.481 "raid_level": "raid1", 00:22:58.481 "superblock": true, 00:22:58.481 "num_base_bdevs": 2, 00:22:58.481 "num_base_bdevs_discovered": 2, 00:22:58.481 "num_base_bdevs_operational": 2, 00:22:58.481 "base_bdevs_list": [ 00:22:58.481 { 00:22:58.481 "name": "spare", 00:22:58.481 "uuid": "ab6210c7-5c73-5e5d-b6d4-af1baa91f7ea", 00:22:58.481 "is_configured": true, 00:22:58.481 "data_offset": 2048, 00:22:58.481 "data_size": 63488 00:22:58.481 }, 00:22:58.481 { 00:22:58.481 "name": "BaseBdev2", 00:22:58.481 "uuid": "a511d801-8809-5ee5-8c01-ba88d0ba70c3", 00:22:58.481 "is_configured": true, 00:22:58.481 "data_offset": 2048, 00:22:58.481 "data_size": 63488 00:22:58.481 } 00:22:58.481 ] 00:22:58.481 }' 00:22:58.740 08:36:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:58.740 08:36:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:58.740 08:36:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:58.740 08:36:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:58.740 08:36:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:58.740 08:36:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:58.740 08:36:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:58.740 08:36:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:58.740 08:36:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:58.740 08:36:11 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:58.740 08:36:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:58.740 08:36:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:58.740 08:36:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:58.740 08:36:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:58.740 08:36:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:58.740 08:36:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:58.740 08:36:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:58.740 "name": "raid_bdev1", 00:22:58.740 "uuid": "6d655b6b-b4ea-42d5-a1a6-818e2ea545c8", 00:22:58.740 "strip_size_kb": 0, 00:22:58.740 "state": "online", 00:22:58.740 "raid_level": "raid1", 00:22:58.740 "superblock": true, 00:22:58.740 "num_base_bdevs": 2, 00:22:58.740 "num_base_bdevs_discovered": 2, 00:22:58.740 "num_base_bdevs_operational": 2, 00:22:58.740 "base_bdevs_list": [ 00:22:58.740 { 00:22:58.740 "name": "spare", 00:22:58.740 "uuid": "ab6210c7-5c73-5e5d-b6d4-af1baa91f7ea", 00:22:58.740 "is_configured": true, 00:22:58.740 "data_offset": 2048, 00:22:58.740 "data_size": 63488 00:22:58.740 }, 00:22:58.740 { 00:22:58.740 "name": "BaseBdev2", 00:22:58.740 "uuid": "a511d801-8809-5ee5-8c01-ba88d0ba70c3", 00:22:58.740 "is_configured": true, 00:22:58.740 "data_offset": 2048, 00:22:58.740 "data_size": 63488 00:22:58.740 } 00:22:58.740 ] 00:22:58.740 }' 00:22:58.740 08:36:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:58.740 08:36:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 
00:22:59.307 08:36:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:59.565 [2024-07-23 08:36:11.842870] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:59.565 [2024-07-23 08:36:11.842903] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:59.565 00:22:59.565 Latency(us) 00:22:59.565 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:22:59.565 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:22:59.565 raid_bdev1 : 10.17 107.77 323.31 0.00 0.00 13229.51 292.57 112347.43 00:22:59.565 =================================================================================================================== 00:22:59.565 Total : 107.77 323.31 0.00 0.00 13229.51 292.57 112347.43 00:22:59.566 [2024-07-23 08:36:11.959206] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:59.566 [2024-07-23 08:36:11.959246] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:59.566 [2024-07-23 08:36:11.959317] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:59.566 [2024-07-23 08:36:11.959333] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036c80 name raid_bdev1, state offline 00:22:59.566 0 00:22:59.566 08:36:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:59.566 08:36:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:22:59.825 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:22:59.825 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' 
true = true ']' 00:22:59.825 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:22:59.825 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:22:59.825 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:59.825 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:22:59.825 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:59.825 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:22:59.825 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:59.825 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:22:59.825 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:59.825 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:59.825 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:22:59.825 /dev/nbd0 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:00.084 08:36:12 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:00.084 1+0 records in 00:23:00.084 1+0 records out 00:23:00.084 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000251127 s, 16.3 MB/s 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:23:00.084 
08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:23:00.084 /dev/nbd1 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:00.084 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:00.084 1+0 records in 00:23:00.084 1+0 records out 00:23:00.084 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000237056 s, 17.3 MB/s 00:23:00.344 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:00.344 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:23:00.344 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:00.344 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:00.344 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:23:00.344 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:00.344 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:00.344 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:23:00.344 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:23:00.344 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:00.344 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:23:00.344 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:00.344 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@51 -- # local i 00:23:00.344 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:00.344 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:00.603 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:00.603 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:00.603 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:00.603 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:00.603 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:00.603 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:00.603 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:23:00.603 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:00.603 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:23:00.603 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:00.603 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:23:00.603 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:00.603 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:23:00.603 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:00.603 08:36:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:00.863 08:36:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:00.863 08:36:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:00.863 08:36:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:00.863 08:36:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:00.863 08:36:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:00.863 08:36:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:00.863 08:36:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:23:00.863 08:36:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:00.863 08:36:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:23:00.863 08:36:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:00.863 08:36:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:01.126 [2024-07-23 08:36:13.479744] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:01.126 [2024-07-23 08:36:13.479809] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:01.126 [2024-07-23 08:36:13.479845] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000038780 00:23:01.126 [2024-07-23 08:36:13.479858] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:01.126 [2024-07-23 08:36:13.481787] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev 
registered 00:23:01.126 [2024-07-23 08:36:13.481814] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:01.126 [2024-07-23 08:36:13.481896] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:01.126 [2024-07-23 08:36:13.481946] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:01.126 [2024-07-23 08:36:13.482088] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:01.126 spare 00:23:01.126 08:36:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:01.126 08:36:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:01.126 08:36:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:01.126 08:36:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:01.126 08:36:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:01.126 08:36:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:01.126 08:36:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:01.126 08:36:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:01.126 08:36:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:01.126 08:36:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:01.126 08:36:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:01.126 08:36:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:23:01.126 [2024-07-23 08:36:13.582437] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000038d80 00:23:01.126 [2024-07-23 08:36:13.582469] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:01.126 [2024-07-23 08:36:13.582744] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00001e1a0 00:23:01.126 [2024-07-23 08:36:13.582970] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000038d80 00:23:01.126 [2024-07-23 08:36:13.582984] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000038d80 00:23:01.126 [2024-07-23 08:36:13.583151] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:01.384 08:36:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:01.384 "name": "raid_bdev1", 00:23:01.384 "uuid": "6d655b6b-b4ea-42d5-a1a6-818e2ea545c8", 00:23:01.384 "strip_size_kb": 0, 00:23:01.384 "state": "online", 00:23:01.384 "raid_level": "raid1", 00:23:01.384 "superblock": true, 00:23:01.384 "num_base_bdevs": 2, 00:23:01.384 "num_base_bdevs_discovered": 2, 00:23:01.384 "num_base_bdevs_operational": 2, 00:23:01.384 "base_bdevs_list": [ 00:23:01.384 { 00:23:01.384 "name": "spare", 00:23:01.384 "uuid": "ab6210c7-5c73-5e5d-b6d4-af1baa91f7ea", 00:23:01.384 "is_configured": true, 00:23:01.384 "data_offset": 2048, 00:23:01.384 "data_size": 63488 00:23:01.384 }, 00:23:01.384 { 00:23:01.384 "name": "BaseBdev2", 00:23:01.384 "uuid": "a511d801-8809-5ee5-8c01-ba88d0ba70c3", 00:23:01.384 "is_configured": true, 00:23:01.384 "data_offset": 2048, 00:23:01.384 "data_size": 63488 00:23:01.384 } 00:23:01.384 ] 00:23:01.384 }' 00:23:01.384 08:36:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:01.384 08:36:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:01.951 08:36:14 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:01.951 08:36:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:01.951 08:36:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:01.951 08:36:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:01.951 08:36:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:01.951 08:36:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:01.951 08:36:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:01.951 08:36:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:01.951 "name": "raid_bdev1", 00:23:01.951 "uuid": "6d655b6b-b4ea-42d5-a1a6-818e2ea545c8", 00:23:01.951 "strip_size_kb": 0, 00:23:01.951 "state": "online", 00:23:01.951 "raid_level": "raid1", 00:23:01.951 "superblock": true, 00:23:01.951 "num_base_bdevs": 2, 00:23:01.951 "num_base_bdevs_discovered": 2, 00:23:01.951 "num_base_bdevs_operational": 2, 00:23:01.951 "base_bdevs_list": [ 00:23:01.951 { 00:23:01.951 "name": "spare", 00:23:01.951 "uuid": "ab6210c7-5c73-5e5d-b6d4-af1baa91f7ea", 00:23:01.951 "is_configured": true, 00:23:01.951 "data_offset": 2048, 00:23:01.951 "data_size": 63488 00:23:01.951 }, 00:23:01.951 { 00:23:01.951 "name": "BaseBdev2", 00:23:01.951 "uuid": "a511d801-8809-5ee5-8c01-ba88d0ba70c3", 00:23:01.951 "is_configured": true, 00:23:01.951 "data_offset": 2048, 00:23:01.951 "data_size": 63488 00:23:01.951 } 00:23:01.951 ] 00:23:01.951 }' 00:23:01.951 08:36:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:01.951 08:36:14 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:01.951 08:36:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:01.951 08:36:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:01.951 08:36:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:01.951 08:36:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:23:02.211 08:36:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:23:02.211 08:36:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:02.470 [2024-07-23 08:36:14.751366] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:02.470 08:36:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:02.470 08:36:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:02.470 08:36:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:02.470 08:36:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:02.470 08:36:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:02.470 08:36:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:02.470 08:36:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:02.470 08:36:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:02.470 
08:36:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:02.470 08:36:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:02.470 08:36:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:02.470 08:36:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:02.470 08:36:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:02.470 "name": "raid_bdev1", 00:23:02.470 "uuid": "6d655b6b-b4ea-42d5-a1a6-818e2ea545c8", 00:23:02.470 "strip_size_kb": 0, 00:23:02.470 "state": "online", 00:23:02.470 "raid_level": "raid1", 00:23:02.470 "superblock": true, 00:23:02.470 "num_base_bdevs": 2, 00:23:02.470 "num_base_bdevs_discovered": 1, 00:23:02.470 "num_base_bdevs_operational": 1, 00:23:02.470 "base_bdevs_list": [ 00:23:02.470 { 00:23:02.470 "name": null, 00:23:02.470 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:02.470 "is_configured": false, 00:23:02.470 "data_offset": 2048, 00:23:02.470 "data_size": 63488 00:23:02.470 }, 00:23:02.470 { 00:23:02.470 "name": "BaseBdev2", 00:23:02.470 "uuid": "a511d801-8809-5ee5-8c01-ba88d0ba70c3", 00:23:02.470 "is_configured": true, 00:23:02.470 "data_offset": 2048, 00:23:02.470 "data_size": 63488 00:23:02.470 } 00:23:02.470 ] 00:23:02.470 }' 00:23:02.470 08:36:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:02.470 08:36:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:03.037 08:36:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:03.296 [2024-07-23 08:36:15.593718] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:03.296 [2024-07-23 08:36:15.593896] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:23:03.296 [2024-07-23 08:36:15.593914] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:23:03.296 [2024-07-23 08:36:15.593946] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:03.296 [2024-07-23 08:36:15.613310] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00001e270 00:23:03.296 [2024-07-23 08:36:15.614868] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:03.296 08:36:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:23:04.232 08:36:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:04.232 08:36:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:04.232 08:36:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:04.232 08:36:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:04.232 08:36:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:04.232 08:36:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:04.232 08:36:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:04.491 08:36:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:04.491 "name": "raid_bdev1", 00:23:04.491 "uuid": "6d655b6b-b4ea-42d5-a1a6-818e2ea545c8", 00:23:04.491 
"strip_size_kb": 0, 00:23:04.491 "state": "online", 00:23:04.491 "raid_level": "raid1", 00:23:04.491 "superblock": true, 00:23:04.491 "num_base_bdevs": 2, 00:23:04.491 "num_base_bdevs_discovered": 2, 00:23:04.491 "num_base_bdevs_operational": 2, 00:23:04.491 "process": { 00:23:04.491 "type": "rebuild", 00:23:04.491 "target": "spare", 00:23:04.491 "progress": { 00:23:04.491 "blocks": 22528, 00:23:04.491 "percent": 35 00:23:04.491 } 00:23:04.491 }, 00:23:04.491 "base_bdevs_list": [ 00:23:04.491 { 00:23:04.491 "name": "spare", 00:23:04.491 "uuid": "ab6210c7-5c73-5e5d-b6d4-af1baa91f7ea", 00:23:04.491 "is_configured": true, 00:23:04.491 "data_offset": 2048, 00:23:04.491 "data_size": 63488 00:23:04.491 }, 00:23:04.491 { 00:23:04.491 "name": "BaseBdev2", 00:23:04.491 "uuid": "a511d801-8809-5ee5-8c01-ba88d0ba70c3", 00:23:04.491 "is_configured": true, 00:23:04.491 "data_offset": 2048, 00:23:04.491 "data_size": 63488 00:23:04.491 } 00:23:04.491 ] 00:23:04.491 }' 00:23:04.491 08:36:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:04.491 08:36:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:04.491 08:36:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:04.491 08:36:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:04.491 08:36:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:04.750 [2024-07-23 08:36:17.068824] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:04.750 [2024-07-23 08:36:17.126810] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:04.750 [2024-07-23 08:36:17.126854] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:23:04.750 [2024-07-23 08:36:17.126886] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:04.750 [2024-07-23 08:36:17.126894] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:04.750 08:36:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:04.750 08:36:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:04.750 08:36:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:04.750 08:36:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:04.750 08:36:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:04.750 08:36:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:04.750 08:36:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:04.750 08:36:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:04.750 08:36:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:04.750 08:36:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:04.750 08:36:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:04.750 08:36:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:05.009 08:36:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:05.009 "name": "raid_bdev1", 00:23:05.009 "uuid": "6d655b6b-b4ea-42d5-a1a6-818e2ea545c8", 00:23:05.009 "strip_size_kb": 0, 00:23:05.009 "state": 
"online", 00:23:05.009 "raid_level": "raid1", 00:23:05.009 "superblock": true, 00:23:05.009 "num_base_bdevs": 2, 00:23:05.009 "num_base_bdevs_discovered": 1, 00:23:05.009 "num_base_bdevs_operational": 1, 00:23:05.009 "base_bdevs_list": [ 00:23:05.009 { 00:23:05.009 "name": null, 00:23:05.009 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:05.009 "is_configured": false, 00:23:05.009 "data_offset": 2048, 00:23:05.009 "data_size": 63488 00:23:05.009 }, 00:23:05.009 { 00:23:05.009 "name": "BaseBdev2", 00:23:05.009 "uuid": "a511d801-8809-5ee5-8c01-ba88d0ba70c3", 00:23:05.009 "is_configured": true, 00:23:05.009 "data_offset": 2048, 00:23:05.009 "data_size": 63488 00:23:05.009 } 00:23:05.009 ] 00:23:05.009 }' 00:23:05.009 08:36:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:05.009 08:36:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:05.577 08:36:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:05.577 [2024-07-23 08:36:17.982872] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:05.577 [2024-07-23 08:36:17.982931] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:05.577 [2024-07-23 08:36:17.982956] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000039380 00:23:05.577 [2024-07-23 08:36:17.982967] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:05.577 [2024-07-23 08:36:17.983480] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:05.577 [2024-07-23 08:36:17.983501] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:05.577 [2024-07-23 08:36:17.983600] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 
00:23:05.577 [2024-07-23 08:36:17.983622] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:23:05.577 [2024-07-23 08:36:17.983638] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:23:05.577 [2024-07-23 08:36:17.983663] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:05.577 [2024-07-23 08:36:18.004193] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00001e340 00:23:05.577 spare 00:23:05.577 [2024-07-23 08:36:18.005853] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:05.577 08:36:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:23:06.513 08:36:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:06.513 08:36:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:06.513 08:36:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:06.513 08:36:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:06.513 08:36:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:06.513 08:36:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:06.513 08:36:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:06.772 08:36:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:06.772 "name": "raid_bdev1", 00:23:06.772 "uuid": "6d655b6b-b4ea-42d5-a1a6-818e2ea545c8", 00:23:06.772 "strip_size_kb": 0, 00:23:06.772 "state": "online", 
00:23:06.772 "raid_level": "raid1", 00:23:06.772 "superblock": true, 00:23:06.772 "num_base_bdevs": 2, 00:23:06.772 "num_base_bdevs_discovered": 2, 00:23:06.772 "num_base_bdevs_operational": 2, 00:23:06.772 "process": { 00:23:06.772 "type": "rebuild", 00:23:06.772 "target": "spare", 00:23:06.772 "progress": { 00:23:06.772 "blocks": 22528, 00:23:06.772 "percent": 35 00:23:06.772 } 00:23:06.772 }, 00:23:06.772 "base_bdevs_list": [ 00:23:06.772 { 00:23:06.772 "name": "spare", 00:23:06.772 "uuid": "ab6210c7-5c73-5e5d-b6d4-af1baa91f7ea", 00:23:06.772 "is_configured": true, 00:23:06.772 "data_offset": 2048, 00:23:06.772 "data_size": 63488 00:23:06.772 }, 00:23:06.772 { 00:23:06.772 "name": "BaseBdev2", 00:23:06.772 "uuid": "a511d801-8809-5ee5-8c01-ba88d0ba70c3", 00:23:06.772 "is_configured": true, 00:23:06.772 "data_offset": 2048, 00:23:06.772 "data_size": 63488 00:23:06.772 } 00:23:06.772 ] 00:23:06.772 }' 00:23:06.772 08:36:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:06.772 08:36:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:06.772 08:36:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:06.772 08:36:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:06.772 08:36:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:07.031 [2024-07-23 08:36:19.423346] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:07.031 [2024-07-23 08:36:19.518076] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:07.031 [2024-07-23 08:36:19.518158] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:07.031 [2024-07-23 08:36:19.518174] 
bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:07.031 [2024-07-23 08:36:19.518184] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:07.289 08:36:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:07.289 08:36:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:07.289 08:36:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:07.289 08:36:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:07.289 08:36:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:07.289 08:36:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:07.289 08:36:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:07.289 08:36:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:07.289 08:36:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:07.289 08:36:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:07.290 08:36:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:07.290 08:36:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:07.290 08:36:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:07.290 "name": "raid_bdev1", 00:23:07.290 "uuid": "6d655b6b-b4ea-42d5-a1a6-818e2ea545c8", 00:23:07.290 "strip_size_kb": 0, 00:23:07.290 "state": "online", 00:23:07.290 "raid_level": "raid1", 00:23:07.290 
"superblock": true, 00:23:07.290 "num_base_bdevs": 2, 00:23:07.290 "num_base_bdevs_discovered": 1, 00:23:07.290 "num_base_bdevs_operational": 1, 00:23:07.290 "base_bdevs_list": [ 00:23:07.290 { 00:23:07.290 "name": null, 00:23:07.290 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:07.290 "is_configured": false, 00:23:07.290 "data_offset": 2048, 00:23:07.290 "data_size": 63488 00:23:07.290 }, 00:23:07.290 { 00:23:07.290 "name": "BaseBdev2", 00:23:07.290 "uuid": "a511d801-8809-5ee5-8c01-ba88d0ba70c3", 00:23:07.290 "is_configured": true, 00:23:07.290 "data_offset": 2048, 00:23:07.290 "data_size": 63488 00:23:07.290 } 00:23:07.290 ] 00:23:07.290 }' 00:23:07.290 08:36:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:07.290 08:36:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:07.857 08:36:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:07.857 08:36:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:07.857 08:36:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:07.857 08:36:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:07.857 08:36:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:07.857 08:36:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:07.857 08:36:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:08.116 08:36:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:08.116 "name": "raid_bdev1", 00:23:08.116 "uuid": "6d655b6b-b4ea-42d5-a1a6-818e2ea545c8", 00:23:08.116 "strip_size_kb": 0, 
00:23:08.116 "state": "online", 00:23:08.116 "raid_level": "raid1", 00:23:08.116 "superblock": true, 00:23:08.116 "num_base_bdevs": 2, 00:23:08.116 "num_base_bdevs_discovered": 1, 00:23:08.116 "num_base_bdevs_operational": 1, 00:23:08.116 "base_bdevs_list": [ 00:23:08.116 { 00:23:08.116 "name": null, 00:23:08.116 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:08.116 "is_configured": false, 00:23:08.116 "data_offset": 2048, 00:23:08.116 "data_size": 63488 00:23:08.116 }, 00:23:08.116 { 00:23:08.116 "name": "BaseBdev2", 00:23:08.116 "uuid": "a511d801-8809-5ee5-8c01-ba88d0ba70c3", 00:23:08.116 "is_configured": true, 00:23:08.116 "data_offset": 2048, 00:23:08.116 "data_size": 63488 00:23:08.116 } 00:23:08.116 ] 00:23:08.116 }' 00:23:08.116 08:36:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:08.116 08:36:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:08.116 08:36:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:08.116 08:36:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:08.116 08:36:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:23:08.382 08:36:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:08.382 [2024-07-23 08:36:20.834454] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:08.382 [2024-07-23 08:36:20.834512] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:08.382 [2024-07-23 08:36:20.834533] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 
0x0x616000039980 00:23:08.382 [2024-07-23 08:36:20.834544] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:08.382 [2024-07-23 08:36:20.835008] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:08.382 [2024-07-23 08:36:20.835029] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:08.382 [2024-07-23 08:36:20.835107] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:23:08.382 [2024-07-23 08:36:20.835125] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:08.383 [2024-07-23 08:36:20.835133] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:08.383 BaseBdev1 00:23:08.383 08:36:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:23:09.759 08:36:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:09.759 08:36:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:09.759 08:36:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:09.759 08:36:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:09.759 08:36:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:09.759 08:36:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:09.759 08:36:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:09.759 08:36:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:09.759 08:36:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:23:09.759 08:36:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:09.759 08:36:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:09.759 08:36:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:09.759 08:36:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:09.759 "name": "raid_bdev1", 00:23:09.759 "uuid": "6d655b6b-b4ea-42d5-a1a6-818e2ea545c8", 00:23:09.759 "strip_size_kb": 0, 00:23:09.759 "state": "online", 00:23:09.759 "raid_level": "raid1", 00:23:09.759 "superblock": true, 00:23:09.759 "num_base_bdevs": 2, 00:23:09.759 "num_base_bdevs_discovered": 1, 00:23:09.760 "num_base_bdevs_operational": 1, 00:23:09.760 "base_bdevs_list": [ 00:23:09.760 { 00:23:09.760 "name": null, 00:23:09.760 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:09.760 "is_configured": false, 00:23:09.760 "data_offset": 2048, 00:23:09.760 "data_size": 63488 00:23:09.760 }, 00:23:09.760 { 00:23:09.760 "name": "BaseBdev2", 00:23:09.760 "uuid": "a511d801-8809-5ee5-8c01-ba88d0ba70c3", 00:23:09.760 "is_configured": true, 00:23:09.760 "data_offset": 2048, 00:23:09.760 "data_size": 63488 00:23:09.760 } 00:23:09.760 ] 00:23:09.760 }' 00:23:09.760 08:36:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:09.760 08:36:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:10.017 08:36:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:10.017 08:36:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:10.017 08:36:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:10.017 08:36:22 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:10.017 08:36:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:10.017 08:36:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:10.017 08:36:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:10.275 08:36:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:10.275 "name": "raid_bdev1", 00:23:10.275 "uuid": "6d655b6b-b4ea-42d5-a1a6-818e2ea545c8", 00:23:10.275 "strip_size_kb": 0, 00:23:10.275 "state": "online", 00:23:10.275 "raid_level": "raid1", 00:23:10.275 "superblock": true, 00:23:10.275 "num_base_bdevs": 2, 00:23:10.275 "num_base_bdevs_discovered": 1, 00:23:10.275 "num_base_bdevs_operational": 1, 00:23:10.275 "base_bdevs_list": [ 00:23:10.275 { 00:23:10.275 "name": null, 00:23:10.275 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:10.275 "is_configured": false, 00:23:10.275 "data_offset": 2048, 00:23:10.275 "data_size": 63488 00:23:10.275 }, 00:23:10.275 { 00:23:10.275 "name": "BaseBdev2", 00:23:10.275 "uuid": "a511d801-8809-5ee5-8c01-ba88d0ba70c3", 00:23:10.275 "is_configured": true, 00:23:10.275 "data_offset": 2048, 00:23:10.275 "data_size": 63488 00:23:10.275 } 00:23:10.275 ] 00:23:10.275 }' 00:23:10.275 08:36:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:10.275 08:36:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:10.275 08:36:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:10.275 08:36:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:10.275 08:36:22 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:10.275 08:36:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local es=0 00:23:10.275 08:36:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:10.275 08:36:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:10.275 08:36:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:10.275 08:36:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:10.275 08:36:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:10.275 08:36:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:10.275 08:36:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:10.275 08:36:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:10.275 08:36:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:23:10.275 08:36:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:10.533 [2024-07-23 
08:36:22.932237] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:10.533 [2024-07-23 08:36:22.932381] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:10.533 [2024-07-23 08:36:22.932400] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:10.533 request: 00:23:10.533 { 00:23:10.533 "base_bdev": "BaseBdev1", 00:23:10.533 "raid_bdev": "raid_bdev1", 00:23:10.533 "method": "bdev_raid_add_base_bdev", 00:23:10.533 "req_id": 1 00:23:10.533 } 00:23:10.533 Got JSON-RPC error response 00:23:10.533 response: 00:23:10.533 { 00:23:10.533 "code": -22, 00:23:10.533 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:23:10.533 } 00:23:10.533 08:36:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:23:10.533 08:36:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:23:10.533 08:36:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:23:10.533 08:36:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:23:10.533 08:36:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:23:11.465 08:36:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:11.465 08:36:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:11.465 08:36:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:11.465 08:36:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:11.465 08:36:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:11.465 08:36:23 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:11.465 08:36:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:11.465 08:36:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:11.465 08:36:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:11.465 08:36:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:11.465 08:36:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:11.465 08:36:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:11.726 08:36:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:11.727 "name": "raid_bdev1", 00:23:11.727 "uuid": "6d655b6b-b4ea-42d5-a1a6-818e2ea545c8", 00:23:11.727 "strip_size_kb": 0, 00:23:11.727 "state": "online", 00:23:11.727 "raid_level": "raid1", 00:23:11.727 "superblock": true, 00:23:11.727 "num_base_bdevs": 2, 00:23:11.727 "num_base_bdevs_discovered": 1, 00:23:11.727 "num_base_bdevs_operational": 1, 00:23:11.727 "base_bdevs_list": [ 00:23:11.727 { 00:23:11.727 "name": null, 00:23:11.727 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:11.727 "is_configured": false, 00:23:11.727 "data_offset": 2048, 00:23:11.727 "data_size": 63488 00:23:11.727 }, 00:23:11.727 { 00:23:11.727 "name": "BaseBdev2", 00:23:11.727 "uuid": "a511d801-8809-5ee5-8c01-ba88d0ba70c3", 00:23:11.727 "is_configured": true, 00:23:11.727 "data_offset": 2048, 00:23:11.727 "data_size": 63488 00:23:11.727 } 00:23:11.727 ] 00:23:11.727 }' 00:23:11.727 08:36:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:11.727 08:36:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 
00:23:12.345 08:36:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:12.345 08:36:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:12.345 08:36:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:12.346 08:36:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:12.346 08:36:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:12.346 08:36:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:12.346 08:36:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:12.346 08:36:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:12.346 "name": "raid_bdev1", 00:23:12.346 "uuid": "6d655b6b-b4ea-42d5-a1a6-818e2ea545c8", 00:23:12.346 "strip_size_kb": 0, 00:23:12.346 "state": "online", 00:23:12.346 "raid_level": "raid1", 00:23:12.346 "superblock": true, 00:23:12.346 "num_base_bdevs": 2, 00:23:12.346 "num_base_bdevs_discovered": 1, 00:23:12.346 "num_base_bdevs_operational": 1, 00:23:12.346 "base_bdevs_list": [ 00:23:12.346 { 00:23:12.346 "name": null, 00:23:12.346 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:12.346 "is_configured": false, 00:23:12.346 "data_offset": 2048, 00:23:12.346 "data_size": 63488 00:23:12.346 }, 00:23:12.346 { 00:23:12.346 "name": "BaseBdev2", 00:23:12.346 "uuid": "a511d801-8809-5ee5-8c01-ba88d0ba70c3", 00:23:12.346 "is_configured": true, 00:23:12.346 "data_offset": 2048, 00:23:12.346 "data_size": 63488 00:23:12.346 } 00:23:12.346 ] 00:23:12.346 }' 00:23:12.346 08:36:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 
00:23:12.346 08:36:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:12.346 08:36:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:12.604 08:36:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:12.604 08:36:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 1535690 00:23:12.604 08:36:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 1535690 ']' 00:23:12.604 08:36:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 1535690 00:23:12.604 08:36:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:23:12.604 08:36:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:12.604 08:36:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1535690 00:23:12.604 08:36:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:12.604 08:36:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:12.604 08:36:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1535690' 00:23:12.604 killing process with pid 1535690 00:23:12.604 08:36:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 1535690 00:23:12.604 Received shutdown signal, test time was about 23.116970 seconds 00:23:12.604 00:23:12.604 Latency(us) 00:23:12.604 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:12.604 =================================================================================================================== 00:23:12.604 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:23:12.604 [2024-07-23 08:36:24.917672] bdev_raid.c:1373:raid_bdev_fini_start: 
*DEBUG*: raid_bdev_fini_start 00:23:12.604 [2024-07-23 08:36:24.917795] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:12.604 08:36:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 1535690 00:23:12.604 [2024-07-23 08:36:24.917849] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:12.604 [2024-07-23 08:36:24.917865] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000038d80 name raid_bdev1, state offline 00:23:12.604 [2024-07-23 08:36:25.095175] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:13.982 08:36:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:23:13.982 00:23:13.982 real 0m28.044s 00:23:13.982 user 0m41.842s 00:23:13.982 sys 0m3.088s 00:23:13.982 08:36:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:13.982 08:36:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:13.982 ************************************ 00:23:13.982 END TEST raid_rebuild_test_sb_io 00:23:13.982 ************************************ 00:23:13.982 08:36:26 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:13.982 08:36:26 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:23:13.982 08:36:26 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 4 false false true 00:23:13.982 08:36:26 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:23:13.982 08:36:26 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:13.982 08:36:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:13.982 ************************************ 00:23:13.982 START TEST raid_rebuild_test 00:23:13.982 ************************************ 00:23:13.982 08:36:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 
false false true 00:23:13.982 08:36:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:23:13.982 08:36:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:23:13.982 08:36:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:23:13.982 08:36:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:23:13.982 08:36:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:23:13.982 08:36:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:23:13.982 08:36:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:13.982 08:36:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:23:13.982 08:36:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:13.982 08:36:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:13.982 08:36:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:23:13.982 08:36:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:13.982 08:36:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:13.982 08:36:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:23:13.982 08:36:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:13.982 08:36:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:13.982 08:36:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:23:13.982 08:36:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:14.241 08:36:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:14.241 08:36:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 
'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:23:14.241 08:36:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:23:14.241 08:36:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:23:14.241 08:36:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:23:14.241 08:36:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:23:14.241 08:36:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:23:14.241 08:36:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:23:14.241 08:36:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:23:14.241 08:36:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:23:14.241 08:36:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:23:14.241 08:36:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:14.241 08:36:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=1541387 00:23:14.241 08:36:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 1541387 /var/tmp/spdk-raid.sock 00:23:14.241 08:36:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 1541387 ']' 00:23:14.241 08:36:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:14.241 08:36:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:14.241 08:36:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:23:14.241 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:14.241 08:36:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:14.241 08:36:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:14.241 [2024-07-23 08:36:26.571286] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:23:14.241 [2024-07-23 08:36:26.571373] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1541387 ] 00:23:14.241 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:14.241 Zero copy mechanism will not be used. 00:23:14.241 [2024-07-23 08:36:26.690485] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:14.500 [2024-07-23 08:36:26.906282] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:14.759 [2024-07-23 08:36:27.151183] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:14.759 [2024-07-23 08:36:27.151218] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:15.018 08:36:27 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:15.018 08:36:27 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:23:15.018 08:36:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:15.018 08:36:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:15.018 BaseBdev1_malloc 00:23:15.277 08:36:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:15.277 [2024-07-23 08:36:27.682920] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:15.277 [2024-07-23 08:36:27.682975] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:15.277 [2024-07-23 08:36:27.682998] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034880 00:23:15.277 [2024-07-23 08:36:27.683011] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:15.277 [2024-07-23 08:36:27.684951] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:15.277 [2024-07-23 08:36:27.684982] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:15.277 BaseBdev1 00:23:15.277 08:36:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:15.277 08:36:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:15.536 BaseBdev2_malloc 00:23:15.536 08:36:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:15.795 [2024-07-23 08:36:28.063839] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:15.795 [2024-07-23 08:36:28.063894] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:15.795 [2024-07-23 08:36:28.063913] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035480 00:23:15.795 [2024-07-23 08:36:28.063926] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:15.795 [2024-07-23 08:36:28.065879] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:15.795 [2024-07-23 08:36:28.065908] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:15.796 BaseBdev2 00:23:15.796 08:36:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:15.796 08:36:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:23:15.796 BaseBdev3_malloc 00:23:15.796 08:36:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:23:16.054 [2024-07-23 08:36:28.439897] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:23:16.054 [2024-07-23 08:36:28.439952] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:16.054 [2024-07-23 08:36:28.439974] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036080 00:23:16.054 [2024-07-23 08:36:28.439985] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:16.054 [2024-07-23 08:36:28.441928] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:16.054 [2024-07-23 08:36:28.441956] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:23:16.054 BaseBdev3 00:23:16.054 08:36:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:16.054 08:36:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:23:16.313 BaseBdev4_malloc 00:23:16.313 08:36:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:23:16.313 [2024-07-23 08:36:28.811703] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:23:16.313 [2024-07-23 08:36:28.811761] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:16.313 [2024-07-23 08:36:28.811782] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036c80 00:23:16.313 [2024-07-23 08:36:28.811793] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:16.313 [2024-07-23 08:36:28.813824] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:16.313 [2024-07-23 08:36:28.813855] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:23:16.313 BaseBdev4 00:23:16.313 08:36:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:23:16.572 spare_malloc 00:23:16.572 08:36:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:16.831 spare_delay 00:23:16.831 08:36:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:17.091 [2024-07-23 08:36:29.352705] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:17.091 [2024-07-23 08:36:29.352761] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:17.091 [2024-07-23 08:36:29.352781] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 
0x0x616000037e80 00:23:17.091 [2024-07-23 08:36:29.352792] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:17.091 [2024-07-23 08:36:29.354765] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:17.091 [2024-07-23 08:36:29.354797] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:17.091 spare 00:23:17.091 08:36:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:23:17.091 [2024-07-23 08:36:29.509143] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:17.091 [2024-07-23 08:36:29.510741] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:17.091 [2024-07-23 08:36:29.510797] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:17.091 [2024-07-23 08:36:29.510847] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:17.091 [2024-07-23 08:36:29.510932] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000038480 00:23:17.091 [2024-07-23 08:36:29.510944] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:23:17.091 [2024-07-23 08:36:29.511196] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c060 00:23:17.091 [2024-07-23 08:36:29.511401] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000038480 00:23:17.091 [2024-07-23 08:36:29.511412] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000038480 00:23:17.091 [2024-07-23 08:36:29.511581] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:17.091 08:36:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 
-- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:23:17.091 08:36:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:17.091 08:36:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:17.091 08:36:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:17.091 08:36:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:17.091 08:36:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:17.091 08:36:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:17.091 08:36:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:17.091 08:36:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:17.091 08:36:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:17.091 08:36:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:17.091 08:36:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:17.350 08:36:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:17.350 "name": "raid_bdev1", 00:23:17.350 "uuid": "ca230b98-84a3-468b-934b-1eae1fb5d7ac", 00:23:17.350 "strip_size_kb": 0, 00:23:17.350 "state": "online", 00:23:17.350 "raid_level": "raid1", 00:23:17.350 "superblock": false, 00:23:17.350 "num_base_bdevs": 4, 00:23:17.350 "num_base_bdevs_discovered": 4, 00:23:17.351 "num_base_bdevs_operational": 4, 00:23:17.351 "base_bdevs_list": [ 00:23:17.351 { 00:23:17.351 "name": "BaseBdev1", 00:23:17.351 "uuid": "d83c4967-e097-5ac2-8f62-df69302dfcb7", 00:23:17.351 "is_configured": true, 00:23:17.351 "data_offset": 0, 00:23:17.351 
"data_size": 65536 00:23:17.351 }, 00:23:17.351 { 00:23:17.351 "name": "BaseBdev2", 00:23:17.351 "uuid": "d5c0116c-9eb1-5a16-972a-d3ee11abba86", 00:23:17.351 "is_configured": true, 00:23:17.351 "data_offset": 0, 00:23:17.351 "data_size": 65536 00:23:17.351 }, 00:23:17.351 { 00:23:17.351 "name": "BaseBdev3", 00:23:17.351 "uuid": "be554454-0012-544e-ae92-e8225cd42e28", 00:23:17.351 "is_configured": true, 00:23:17.351 "data_offset": 0, 00:23:17.351 "data_size": 65536 00:23:17.351 }, 00:23:17.351 { 00:23:17.351 "name": "BaseBdev4", 00:23:17.351 "uuid": "325229c5-b69d-5f9a-a60d-e7b8e7cdb017", 00:23:17.351 "is_configured": true, 00:23:17.351 "data_offset": 0, 00:23:17.351 "data_size": 65536 00:23:17.351 } 00:23:17.351 ] 00:23:17.351 }' 00:23:17.351 08:36:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:17.351 08:36:29 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:17.918 08:36:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:23:17.918 08:36:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:17.918 [2024-07-23 08:36:30.339582] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:17.918 08:36:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:23:17.919 08:36:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:17.919 08:36:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:18.178 08:36:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:23:18.178 08:36:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:23:18.178 08:36:30 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:23:18.178 08:36:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:23:18.178 08:36:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:23:18.178 08:36:30 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:18.178 08:36:30 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:23:18.178 08:36:30 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:18.178 08:36:30 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:18.178 08:36:30 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:18.178 08:36:30 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:23:18.178 08:36:30 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:18.178 08:36:30 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:18.178 08:36:30 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:23:18.178 [2024-07-23 08:36:30.692316] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c200 00:23:18.437 /dev/nbd0 00:23:18.437 08:36:30 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:18.437 08:36:30 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:18.437 08:36:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:23:18.437 08:36:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:23:18.437 08:36:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:18.437 08:36:30 
bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:18.437 08:36:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:18.437 08:36:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:23:18.437 08:36:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:18.437 08:36:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:18.437 08:36:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:18.437 1+0 records in 00:23:18.437 1+0 records out 00:23:18.437 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00021387 s, 19.2 MB/s 00:23:18.437 08:36:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:18.437 08:36:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:23:18.437 08:36:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:18.437 08:36:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:18.437 08:36:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:23:18.437 08:36:30 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:18.437 08:36:30 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:18.437 08:36:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:23:18.437 08:36:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:23:18.437 08:36:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:23:23.704 
65536+0 records in 00:23:23.704 65536+0 records out 00:23:23.704 33554432 bytes (34 MB, 32 MiB) copied, 5.33069 s, 6.3 MB/s 00:23:23.704 08:36:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:23:23.704 08:36:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:23.704 08:36:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:23:23.704 08:36:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:23.704 08:36:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:23:23.704 08:36:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:23.704 08:36:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:23.962 08:36:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:23.962 08:36:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:23.962 08:36:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:23.962 08:36:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:23.962 08:36:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:23.962 08:36:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:23.962 [2024-07-23 08:36:36.292814] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:23.962 08:36:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:23:23.962 08:36:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:23:23.962 08:36:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:23:23.962 [2024-07-23 08:36:36.451074] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:23.962 08:36:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:23.962 08:36:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:23.962 08:36:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:23.962 08:36:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:23.962 08:36:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:23.962 08:36:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:23.962 08:36:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:23.962 08:36:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:23.962 08:36:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:23.962 08:36:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:23.962 08:36:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:23.962 08:36:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:24.220 08:36:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:24.220 "name": "raid_bdev1", 00:23:24.220 "uuid": "ca230b98-84a3-468b-934b-1eae1fb5d7ac", 00:23:24.220 "strip_size_kb": 0, 00:23:24.220 "state": "online", 00:23:24.220 "raid_level": "raid1", 00:23:24.220 "superblock": false, 00:23:24.220 "num_base_bdevs": 4, 00:23:24.220 "num_base_bdevs_discovered": 3, 00:23:24.220 
"num_base_bdevs_operational": 3, 00:23:24.220 "base_bdevs_list": [ 00:23:24.220 { 00:23:24.220 "name": null, 00:23:24.220 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:24.220 "is_configured": false, 00:23:24.220 "data_offset": 0, 00:23:24.220 "data_size": 65536 00:23:24.220 }, 00:23:24.220 { 00:23:24.220 "name": "BaseBdev2", 00:23:24.220 "uuid": "d5c0116c-9eb1-5a16-972a-d3ee11abba86", 00:23:24.220 "is_configured": true, 00:23:24.220 "data_offset": 0, 00:23:24.220 "data_size": 65536 00:23:24.220 }, 00:23:24.220 { 00:23:24.220 "name": "BaseBdev3", 00:23:24.220 "uuid": "be554454-0012-544e-ae92-e8225cd42e28", 00:23:24.220 "is_configured": true, 00:23:24.220 "data_offset": 0, 00:23:24.220 "data_size": 65536 00:23:24.220 }, 00:23:24.220 { 00:23:24.220 "name": "BaseBdev4", 00:23:24.220 "uuid": "325229c5-b69d-5f9a-a60d-e7b8e7cdb017", 00:23:24.220 "is_configured": true, 00:23:24.220 "data_offset": 0, 00:23:24.220 "data_size": 65536 00:23:24.220 } 00:23:24.220 ] 00:23:24.220 }' 00:23:24.220 08:36:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:24.220 08:36:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:24.786 08:36:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:24.786 [2024-07-23 08:36:37.269280] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:24.786 [2024-07-23 08:36:37.290662] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000d12e10 00:23:24.786 [2024-07-23 08:36:37.292565] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:25.043 08:36:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:23:25.977 08:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:23:25.977 08:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:25.977 08:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:25.977 08:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:25.977 08:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:25.977 08:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:25.977 08:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:25.977 08:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:25.977 "name": "raid_bdev1", 00:23:25.977 "uuid": "ca230b98-84a3-468b-934b-1eae1fb5d7ac", 00:23:25.977 "strip_size_kb": 0, 00:23:25.977 "state": "online", 00:23:25.977 "raid_level": "raid1", 00:23:25.977 "superblock": false, 00:23:25.977 "num_base_bdevs": 4, 00:23:25.977 "num_base_bdevs_discovered": 4, 00:23:25.977 "num_base_bdevs_operational": 4, 00:23:25.977 "process": { 00:23:25.977 "type": "rebuild", 00:23:25.977 "target": "spare", 00:23:25.977 "progress": { 00:23:25.977 "blocks": 22528, 00:23:25.977 "percent": 34 00:23:25.977 } 00:23:25.977 }, 00:23:25.977 "base_bdevs_list": [ 00:23:25.977 { 00:23:25.977 "name": "spare", 00:23:25.977 "uuid": "9294808c-e581-538d-a33a-2905cb1c2174", 00:23:25.977 "is_configured": true, 00:23:25.977 "data_offset": 0, 00:23:25.977 "data_size": 65536 00:23:25.977 }, 00:23:25.977 { 00:23:25.977 "name": "BaseBdev2", 00:23:25.977 "uuid": "d5c0116c-9eb1-5a16-972a-d3ee11abba86", 00:23:25.977 "is_configured": true, 00:23:25.977 "data_offset": 0, 00:23:25.977 "data_size": 65536 00:23:25.977 }, 00:23:25.977 { 00:23:25.977 "name": "BaseBdev3", 00:23:25.977 "uuid": "be554454-0012-544e-ae92-e8225cd42e28", 00:23:25.977 
"is_configured": true, 00:23:25.977 "data_offset": 0, 00:23:25.977 "data_size": 65536 00:23:25.977 }, 00:23:25.977 { 00:23:25.977 "name": "BaseBdev4", 00:23:25.977 "uuid": "325229c5-b69d-5f9a-a60d-e7b8e7cdb017", 00:23:25.977 "is_configured": true, 00:23:25.977 "data_offset": 0, 00:23:25.977 "data_size": 65536 00:23:25.977 } 00:23:25.977 ] 00:23:25.977 }' 00:23:25.977 08:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:26.235 08:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:26.235 08:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:26.235 08:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:26.235 08:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:26.235 [2024-07-23 08:36:38.722240] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:26.494 [2024-07-23 08:36:38.804505] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:26.494 [2024-07-23 08:36:38.804576] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:26.494 [2024-07-23 08:36:38.804597] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:26.494 [2024-07-23 08:36:38.804617] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:26.494 08:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:26.494 08:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:26.494 08:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:26.494 
08:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:26.494 08:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:26.494 08:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:26.494 08:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:26.494 08:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:26.495 08:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:26.495 08:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:26.495 08:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:26.495 08:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:26.754 08:36:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:26.754 "name": "raid_bdev1", 00:23:26.754 "uuid": "ca230b98-84a3-468b-934b-1eae1fb5d7ac", 00:23:26.754 "strip_size_kb": 0, 00:23:26.754 "state": "online", 00:23:26.754 "raid_level": "raid1", 00:23:26.754 "superblock": false, 00:23:26.754 "num_base_bdevs": 4, 00:23:26.754 "num_base_bdevs_discovered": 3, 00:23:26.754 "num_base_bdevs_operational": 3, 00:23:26.754 "base_bdevs_list": [ 00:23:26.754 { 00:23:26.754 "name": null, 00:23:26.754 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:26.754 "is_configured": false, 00:23:26.754 "data_offset": 0, 00:23:26.754 "data_size": 65536 00:23:26.754 }, 00:23:26.754 { 00:23:26.754 "name": "BaseBdev2", 00:23:26.754 "uuid": "d5c0116c-9eb1-5a16-972a-d3ee11abba86", 00:23:26.754 "is_configured": true, 00:23:26.754 "data_offset": 0, 00:23:26.754 "data_size": 65536 00:23:26.754 }, 00:23:26.754 { 00:23:26.754 "name": 
"BaseBdev3", 00:23:26.754 "uuid": "be554454-0012-544e-ae92-e8225cd42e28", 00:23:26.754 "is_configured": true, 00:23:26.754 "data_offset": 0, 00:23:26.754 "data_size": 65536 00:23:26.754 }, 00:23:26.754 { 00:23:26.754 "name": "BaseBdev4", 00:23:26.754 "uuid": "325229c5-b69d-5f9a-a60d-e7b8e7cdb017", 00:23:26.754 "is_configured": true, 00:23:26.754 "data_offset": 0, 00:23:26.754 "data_size": 65536 00:23:26.754 } 00:23:26.754 ] 00:23:26.754 }' 00:23:26.754 08:36:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:26.754 08:36:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:27.012 08:36:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:27.012 08:36:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:27.012 08:36:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:27.012 08:36:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:27.012 08:36:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:27.012 08:36:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:27.012 08:36:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:27.271 08:36:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:27.271 "name": "raid_bdev1", 00:23:27.271 "uuid": "ca230b98-84a3-468b-934b-1eae1fb5d7ac", 00:23:27.271 "strip_size_kb": 0, 00:23:27.271 "state": "online", 00:23:27.271 "raid_level": "raid1", 00:23:27.271 "superblock": false, 00:23:27.271 "num_base_bdevs": 4, 00:23:27.271 "num_base_bdevs_discovered": 3, 00:23:27.271 "num_base_bdevs_operational": 3, 00:23:27.271 "base_bdevs_list": [ 00:23:27.271 { 
00:23:27.271 "name": null, 00:23:27.271 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:27.271 "is_configured": false, 00:23:27.271 "data_offset": 0, 00:23:27.271 "data_size": 65536 00:23:27.271 }, 00:23:27.271 { 00:23:27.271 "name": "BaseBdev2", 00:23:27.271 "uuid": "d5c0116c-9eb1-5a16-972a-d3ee11abba86", 00:23:27.271 "is_configured": true, 00:23:27.271 "data_offset": 0, 00:23:27.271 "data_size": 65536 00:23:27.271 }, 00:23:27.271 { 00:23:27.271 "name": "BaseBdev3", 00:23:27.271 "uuid": "be554454-0012-544e-ae92-e8225cd42e28", 00:23:27.271 "is_configured": true, 00:23:27.271 "data_offset": 0, 00:23:27.271 "data_size": 65536 00:23:27.271 }, 00:23:27.271 { 00:23:27.271 "name": "BaseBdev4", 00:23:27.271 "uuid": "325229c5-b69d-5f9a-a60d-e7b8e7cdb017", 00:23:27.271 "is_configured": true, 00:23:27.271 "data_offset": 0, 00:23:27.271 "data_size": 65536 00:23:27.271 } 00:23:27.271 ] 00:23:27.271 }' 00:23:27.271 08:36:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:27.271 08:36:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:27.271 08:36:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:27.271 08:36:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:27.271 08:36:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:27.529 [2024-07-23 08:36:39.910401] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:27.529 [2024-07-23 08:36:39.926342] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000d12ee0 00:23:27.529 [2024-07-23 08:36:39.927987] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:27.529 08:36:39 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@662 -- # sleep 1 00:23:28.494 08:36:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:28.494 08:36:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:28.494 08:36:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:28.494 08:36:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:28.494 08:36:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:28.494 08:36:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:28.494 08:36:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:28.763 08:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:28.763 "name": "raid_bdev1", 00:23:28.763 "uuid": "ca230b98-84a3-468b-934b-1eae1fb5d7ac", 00:23:28.763 "strip_size_kb": 0, 00:23:28.763 "state": "online", 00:23:28.763 "raid_level": "raid1", 00:23:28.763 "superblock": false, 00:23:28.763 "num_base_bdevs": 4, 00:23:28.763 "num_base_bdevs_discovered": 4, 00:23:28.763 "num_base_bdevs_operational": 4, 00:23:28.763 "process": { 00:23:28.763 "type": "rebuild", 00:23:28.763 "target": "spare", 00:23:28.763 "progress": { 00:23:28.763 "blocks": 22528, 00:23:28.763 "percent": 34 00:23:28.763 } 00:23:28.763 }, 00:23:28.763 "base_bdevs_list": [ 00:23:28.763 { 00:23:28.763 "name": "spare", 00:23:28.763 "uuid": "9294808c-e581-538d-a33a-2905cb1c2174", 00:23:28.763 "is_configured": true, 00:23:28.763 "data_offset": 0, 00:23:28.763 "data_size": 65536 00:23:28.763 }, 00:23:28.763 { 00:23:28.763 "name": "BaseBdev2", 00:23:28.763 "uuid": "d5c0116c-9eb1-5a16-972a-d3ee11abba86", 00:23:28.763 "is_configured": true, 00:23:28.763 "data_offset": 0, 
00:23:28.763 "data_size": 65536 00:23:28.763 }, 00:23:28.763 { 00:23:28.763 "name": "BaseBdev3", 00:23:28.763 "uuid": "be554454-0012-544e-ae92-e8225cd42e28", 00:23:28.763 "is_configured": true, 00:23:28.763 "data_offset": 0, 00:23:28.763 "data_size": 65536 00:23:28.763 }, 00:23:28.763 { 00:23:28.763 "name": "BaseBdev4", 00:23:28.763 "uuid": "325229c5-b69d-5f9a-a60d-e7b8e7cdb017", 00:23:28.763 "is_configured": true, 00:23:28.763 "data_offset": 0, 00:23:28.763 "data_size": 65536 00:23:28.763 } 00:23:28.763 ] 00:23:28.763 }' 00:23:28.763 08:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:28.763 08:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:28.763 08:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:28.763 08:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:28.763 08:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:23:28.763 08:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:23:28.763 08:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:23:28.763 08:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:23:28.763 08:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:23:29.022 [2024-07-23 08:36:41.369884] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:23:29.022 [2024-07-23 08:36:41.440049] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x60d000d12ee0 00:23:29.022 08:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:23:29.022 08:36:41 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:23:29.022 08:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:29.022 08:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:29.022 08:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:29.022 08:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:29.022 08:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:29.022 08:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:29.022 08:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:29.281 08:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:29.281 "name": "raid_bdev1", 00:23:29.281 "uuid": "ca230b98-84a3-468b-934b-1eae1fb5d7ac", 00:23:29.281 "strip_size_kb": 0, 00:23:29.281 "state": "online", 00:23:29.281 "raid_level": "raid1", 00:23:29.281 "superblock": false, 00:23:29.281 "num_base_bdevs": 4, 00:23:29.281 "num_base_bdevs_discovered": 3, 00:23:29.281 "num_base_bdevs_operational": 3, 00:23:29.281 "process": { 00:23:29.281 "type": "rebuild", 00:23:29.281 "target": "spare", 00:23:29.281 "progress": { 00:23:29.281 "blocks": 32768, 00:23:29.281 "percent": 50 00:23:29.281 } 00:23:29.281 }, 00:23:29.281 "base_bdevs_list": [ 00:23:29.281 { 00:23:29.281 "name": "spare", 00:23:29.281 "uuid": "9294808c-e581-538d-a33a-2905cb1c2174", 00:23:29.281 "is_configured": true, 00:23:29.281 "data_offset": 0, 00:23:29.281 "data_size": 65536 00:23:29.281 }, 00:23:29.281 { 00:23:29.281 "name": null, 00:23:29.281 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:29.281 "is_configured": false, 00:23:29.281 
"data_offset": 0, 00:23:29.281 "data_size": 65536 00:23:29.281 }, 00:23:29.281 { 00:23:29.281 "name": "BaseBdev3", 00:23:29.281 "uuid": "be554454-0012-544e-ae92-e8225cd42e28", 00:23:29.281 "is_configured": true, 00:23:29.281 "data_offset": 0, 00:23:29.281 "data_size": 65536 00:23:29.281 }, 00:23:29.281 { 00:23:29.281 "name": "BaseBdev4", 00:23:29.281 "uuid": "325229c5-b69d-5f9a-a60d-e7b8e7cdb017", 00:23:29.281 "is_configured": true, 00:23:29.281 "data_offset": 0, 00:23:29.281 "data_size": 65536 00:23:29.281 } 00:23:29.281 ] 00:23:29.281 }' 00:23:29.281 08:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:29.281 08:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:29.281 08:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:29.281 08:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:29.281 08:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=754 00:23:29.281 08:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:29.281 08:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:29.281 08:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:29.281 08:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:29.281 08:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:29.281 08:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:29.281 08:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:29.281 08:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:29.540 08:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:29.540 "name": "raid_bdev1", 00:23:29.540 "uuid": "ca230b98-84a3-468b-934b-1eae1fb5d7ac", 00:23:29.540 "strip_size_kb": 0, 00:23:29.540 "state": "online", 00:23:29.540 "raid_level": "raid1", 00:23:29.540 "superblock": false, 00:23:29.540 "num_base_bdevs": 4, 00:23:29.540 "num_base_bdevs_discovered": 3, 00:23:29.540 "num_base_bdevs_operational": 3, 00:23:29.540 "process": { 00:23:29.540 "type": "rebuild", 00:23:29.540 "target": "spare", 00:23:29.540 "progress": { 00:23:29.540 "blocks": 38912, 00:23:29.540 "percent": 59 00:23:29.540 } 00:23:29.540 }, 00:23:29.540 "base_bdevs_list": [ 00:23:29.540 { 00:23:29.540 "name": "spare", 00:23:29.540 "uuid": "9294808c-e581-538d-a33a-2905cb1c2174", 00:23:29.540 "is_configured": true, 00:23:29.540 "data_offset": 0, 00:23:29.540 "data_size": 65536 00:23:29.540 }, 00:23:29.540 { 00:23:29.540 "name": null, 00:23:29.540 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:29.540 "is_configured": false, 00:23:29.540 "data_offset": 0, 00:23:29.540 "data_size": 65536 00:23:29.540 }, 00:23:29.540 { 00:23:29.540 "name": "BaseBdev3", 00:23:29.540 "uuid": "be554454-0012-544e-ae92-e8225cd42e28", 00:23:29.540 "is_configured": true, 00:23:29.540 "data_offset": 0, 00:23:29.540 "data_size": 65536 00:23:29.540 }, 00:23:29.540 { 00:23:29.540 "name": "BaseBdev4", 00:23:29.540 "uuid": "325229c5-b69d-5f9a-a60d-e7b8e7cdb017", 00:23:29.540 "is_configured": true, 00:23:29.540 "data_offset": 0, 00:23:29.540 "data_size": 65536 00:23:29.540 } 00:23:29.540 ] 00:23:29.540 }' 00:23:29.540 08:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:29.540 08:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:29.540 08:36:41 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:29.540 08:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:29.540 08:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:30.475 08:36:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:30.475 08:36:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:30.475 08:36:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:30.475 08:36:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:30.475 08:36:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:30.475 08:36:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:30.475 08:36:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:30.475 08:36:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:30.734 [2024-07-23 08:36:43.153650] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:30.734 [2024-07-23 08:36:43.153714] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:30.734 [2024-07-23 08:36:43.153754] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:30.734 08:36:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:30.734 "name": "raid_bdev1", 00:23:30.734 "uuid": "ca230b98-84a3-468b-934b-1eae1fb5d7ac", 00:23:30.734 "strip_size_kb": 0, 00:23:30.734 "state": "online", 00:23:30.734 "raid_level": "raid1", 00:23:30.734 "superblock": false, 00:23:30.734 "num_base_bdevs": 
4, 00:23:30.734 "num_base_bdevs_discovered": 3, 00:23:30.734 "num_base_bdevs_operational": 3, 00:23:30.734 "process": { 00:23:30.734 "type": "rebuild", 00:23:30.734 "target": "spare", 00:23:30.734 "progress": { 00:23:30.734 "blocks": 63488, 00:23:30.734 "percent": 96 00:23:30.734 } 00:23:30.734 }, 00:23:30.734 "base_bdevs_list": [ 00:23:30.734 { 00:23:30.734 "name": "spare", 00:23:30.734 "uuid": "9294808c-e581-538d-a33a-2905cb1c2174", 00:23:30.734 "is_configured": true, 00:23:30.734 "data_offset": 0, 00:23:30.734 "data_size": 65536 00:23:30.734 }, 00:23:30.734 { 00:23:30.734 "name": null, 00:23:30.734 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:30.734 "is_configured": false, 00:23:30.734 "data_offset": 0, 00:23:30.734 "data_size": 65536 00:23:30.734 }, 00:23:30.734 { 00:23:30.734 "name": "BaseBdev3", 00:23:30.734 "uuid": "be554454-0012-544e-ae92-e8225cd42e28", 00:23:30.734 "is_configured": true, 00:23:30.734 "data_offset": 0, 00:23:30.734 "data_size": 65536 00:23:30.734 }, 00:23:30.734 { 00:23:30.734 "name": "BaseBdev4", 00:23:30.734 "uuid": "325229c5-b69d-5f9a-a60d-e7b8e7cdb017", 00:23:30.734 "is_configured": true, 00:23:30.734 "data_offset": 0, 00:23:30.734 "data_size": 65536 00:23:30.734 } 00:23:30.734 ] 00:23:30.734 }' 00:23:30.734 08:36:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:30.734 08:36:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:30.734 08:36:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:30.734 08:36:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:30.734 08:36:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:32.110 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:32.110 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process 
raid_bdev1 rebuild spare 00:23:32.110 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:32.110 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:32.110 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:32.110 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:32.110 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:32.110 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:32.110 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:32.110 "name": "raid_bdev1", 00:23:32.110 "uuid": "ca230b98-84a3-468b-934b-1eae1fb5d7ac", 00:23:32.110 "strip_size_kb": 0, 00:23:32.110 "state": "online", 00:23:32.110 "raid_level": "raid1", 00:23:32.110 "superblock": false, 00:23:32.110 "num_base_bdevs": 4, 00:23:32.110 "num_base_bdevs_discovered": 3, 00:23:32.110 "num_base_bdevs_operational": 3, 00:23:32.110 "base_bdevs_list": [ 00:23:32.110 { 00:23:32.110 "name": "spare", 00:23:32.110 "uuid": "9294808c-e581-538d-a33a-2905cb1c2174", 00:23:32.110 "is_configured": true, 00:23:32.110 "data_offset": 0, 00:23:32.110 "data_size": 65536 00:23:32.110 }, 00:23:32.110 { 00:23:32.110 "name": null, 00:23:32.110 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:32.110 "is_configured": false, 00:23:32.110 "data_offset": 0, 00:23:32.110 "data_size": 65536 00:23:32.110 }, 00:23:32.110 { 00:23:32.110 "name": "BaseBdev3", 00:23:32.110 "uuid": "be554454-0012-544e-ae92-e8225cd42e28", 00:23:32.110 "is_configured": true, 00:23:32.110 "data_offset": 0, 00:23:32.110 "data_size": 65536 00:23:32.110 }, 00:23:32.110 { 00:23:32.110 "name": "BaseBdev4", 00:23:32.110 "uuid": 
"325229c5-b69d-5f9a-a60d-e7b8e7cdb017", 00:23:32.110 "is_configured": true, 00:23:32.110 "data_offset": 0, 00:23:32.110 "data_size": 65536 00:23:32.110 } 00:23:32.110 ] 00:23:32.110 }' 00:23:32.110 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:32.110 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:32.110 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:32.110 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:23:32.110 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:23:32.110 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:32.110 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:32.110 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:32.110 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:32.110 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:32.110 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:32.110 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:32.369 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:32.369 "name": "raid_bdev1", 00:23:32.369 "uuid": "ca230b98-84a3-468b-934b-1eae1fb5d7ac", 00:23:32.369 "strip_size_kb": 0, 00:23:32.369 "state": "online", 00:23:32.369 "raid_level": "raid1", 00:23:32.369 "superblock": false, 00:23:32.369 "num_base_bdevs": 4, 00:23:32.369 "num_base_bdevs_discovered": 3, 00:23:32.369 
"num_base_bdevs_operational": 3, 00:23:32.369 "base_bdevs_list": [ 00:23:32.369 { 00:23:32.369 "name": "spare", 00:23:32.369 "uuid": "9294808c-e581-538d-a33a-2905cb1c2174", 00:23:32.369 "is_configured": true, 00:23:32.369 "data_offset": 0, 00:23:32.369 "data_size": 65536 00:23:32.369 }, 00:23:32.369 { 00:23:32.369 "name": null, 00:23:32.369 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:32.369 "is_configured": false, 00:23:32.369 "data_offset": 0, 00:23:32.369 "data_size": 65536 00:23:32.369 }, 00:23:32.369 { 00:23:32.369 "name": "BaseBdev3", 00:23:32.369 "uuid": "be554454-0012-544e-ae92-e8225cd42e28", 00:23:32.369 "is_configured": true, 00:23:32.369 "data_offset": 0, 00:23:32.369 "data_size": 65536 00:23:32.369 }, 00:23:32.369 { 00:23:32.369 "name": "BaseBdev4", 00:23:32.369 "uuid": "325229c5-b69d-5f9a-a60d-e7b8e7cdb017", 00:23:32.369 "is_configured": true, 00:23:32.369 "data_offset": 0, 00:23:32.369 "data_size": 65536 00:23:32.369 } 00:23:32.369 ] 00:23:32.369 }' 00:23:32.369 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:32.369 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:32.369 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:32.369 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:32.369 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:32.369 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:32.369 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:32.369 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:32.369 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:32.369 08:36:44 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:32.369 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:32.369 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:32.369 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:32.369 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:32.369 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:32.369 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:32.628 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:32.628 "name": "raid_bdev1", 00:23:32.628 "uuid": "ca230b98-84a3-468b-934b-1eae1fb5d7ac", 00:23:32.628 "strip_size_kb": 0, 00:23:32.628 "state": "online", 00:23:32.628 "raid_level": "raid1", 00:23:32.628 "superblock": false, 00:23:32.628 "num_base_bdevs": 4, 00:23:32.628 "num_base_bdevs_discovered": 3, 00:23:32.628 "num_base_bdevs_operational": 3, 00:23:32.628 "base_bdevs_list": [ 00:23:32.628 { 00:23:32.628 "name": "spare", 00:23:32.628 "uuid": "9294808c-e581-538d-a33a-2905cb1c2174", 00:23:32.628 "is_configured": true, 00:23:32.628 "data_offset": 0, 00:23:32.628 "data_size": 65536 00:23:32.628 }, 00:23:32.628 { 00:23:32.628 "name": null, 00:23:32.628 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:32.628 "is_configured": false, 00:23:32.628 "data_offset": 0, 00:23:32.628 "data_size": 65536 00:23:32.628 }, 00:23:32.628 { 00:23:32.628 "name": "BaseBdev3", 00:23:32.628 "uuid": "be554454-0012-544e-ae92-e8225cd42e28", 00:23:32.628 "is_configured": true, 00:23:32.628 "data_offset": 0, 00:23:32.628 "data_size": 65536 00:23:32.628 }, 00:23:32.628 { 00:23:32.628 
"name": "BaseBdev4", 00:23:32.628 "uuid": "325229c5-b69d-5f9a-a60d-e7b8e7cdb017", 00:23:32.628 "is_configured": true, 00:23:32.628 "data_offset": 0, 00:23:32.628 "data_size": 65536 00:23:32.628 } 00:23:32.628 ] 00:23:32.628 }' 00:23:32.628 08:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:32.628 08:36:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:33.196 08:36:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:33.196 [2024-07-23 08:36:45.585061] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:33.196 [2024-07-23 08:36:45.585091] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:33.196 [2024-07-23 08:36:45.585162] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:33.196 [2024-07-23 08:36:45.585237] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:33.196 [2024-07-23 08:36:45.585247] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000038480 name raid_bdev1, state offline 00:23:33.196 08:36:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:33.196 08:36:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:23:33.456 08:36:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:23:33.456 08:36:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:23:33.456 08:36:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:23:33.456 08:36:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 
spare' '/dev/nbd0 /dev/nbd1' 00:23:33.456 08:36:45 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:33.456 08:36:45 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:23:33.456 08:36:45 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:33.456 08:36:45 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:33.456 08:36:45 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:33.456 08:36:45 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:23:33.456 08:36:45 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:33.456 08:36:45 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:33.456 08:36:45 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:23:33.456 /dev/nbd0 00:23:33.456 08:36:45 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:33.456 08:36:45 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:33.456 08:36:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:23:33.456 08:36:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:23:33.456 08:36:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:33.456 08:36:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:33.456 08:36:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:33.456 08:36:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:23:33.456 08:36:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:33.456 
08:36:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:33.456 08:36:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:33.456 1+0 records in 00:23:33.456 1+0 records out 00:23:33.456 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000212869 s, 19.2 MB/s 00:23:33.456 08:36:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:33.715 08:36:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:23:33.715 08:36:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:33.715 08:36:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:33.715 08:36:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:23:33.715 08:36:45 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:33.715 08:36:45 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:33.715 08:36:45 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:23:33.715 /dev/nbd1 00:23:33.715 08:36:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:33.715 08:36:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:33.715 08:36:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:23:33.715 08:36:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:23:33.715 08:36:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:33.715 08:36:46 
bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:33.715 08:36:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:23:33.715 08:36:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:23:33.715 08:36:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:33.715 08:36:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:33.715 08:36:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:33.715 1+0 records in 00:23:33.715 1+0 records out 00:23:33.715 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000235714 s, 17.4 MB/s 00:23:33.715 08:36:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:33.715 08:36:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:23:33.715 08:36:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:33.715 08:36:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:33.715 08:36:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:23:33.715 08:36:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:33.715 08:36:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:33.715 08:36:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:23:33.974 08:36:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:23:33.974 08:36:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:23:33.974 08:36:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:33.974 08:36:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:33.974 08:36:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:23:33.974 08:36:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:33.974 08:36:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:34.233 08:36:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:34.233 08:36:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:34.233 08:36:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:34.233 08:36:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:34.233 08:36:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:34.233 08:36:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:34.233 08:36:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:23:34.233 08:36:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:23:34.233 08:36:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:34.233 08:36:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:34.233 08:36:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:34.233 08:36:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:34.233 08:36:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local 
nbd_name=nbd1 00:23:34.233 08:36:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:34.233 08:36:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:34.233 08:36:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:34.233 08:36:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:23:34.233 08:36:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:23:34.233 08:36:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:23:34.233 08:36:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 1541387 00:23:34.233 08:36:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 1541387 ']' 00:23:34.233 08:36:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 1541387 00:23:34.233 08:36:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:23:34.233 08:36:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:34.233 08:36:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1541387 00:23:34.492 08:36:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:34.492 08:36:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:34.492 08:36:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1541387' 00:23:34.492 killing process with pid 1541387 00:23:34.492 08:36:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 1541387 00:23:34.492 Received shutdown signal, test time was about 60.000000 seconds 00:23:34.492 00:23:34.492 Latency(us) 00:23:34.492 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:34.492 
=================================================================================================================== 00:23:34.492 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:23:34.492 [2024-07-23 08:36:46.779816] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:34.492 08:36:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 1541387 00:23:34.750 [2024-07-23 08:36:47.244999] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:36.127 08:36:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:23:36.127 00:23:36.127 real 0m22.012s 00:23:36.127 user 0m28.810s 00:23:36.127 sys 0m3.452s 00:23:36.127 08:36:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:36.127 08:36:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:36.127 ************************************ 00:23:36.127 END TEST raid_rebuild_test 00:23:36.128 ************************************ 00:23:36.128 08:36:48 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:36.128 08:36:48 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 4 true false true 00:23:36.128 08:36:48 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:23:36.128 08:36:48 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:36.128 08:36:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:36.128 ************************************ 00:23:36.128 START TEST raid_rebuild_test_sb 00:23:36.128 ************************************ 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true false true 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:23:36.128 08:36:48 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=1545644 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 1545644 /var/tmp/spdk-raid.sock 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1545644 ']' 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:36.128 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:36.128 08:36:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:36.386 [2024-07-23 08:36:48.672123] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:23:36.386 [2024-07-23 08:36:48.672227] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1545644 ] 00:23:36.386 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:36.386 Zero copy mechanism will not be used. 00:23:36.386 [2024-07-23 08:36:48.799372] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:36.644 [2024-07-23 08:36:49.019450] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:36.902 [2024-07-23 08:36:49.259745] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:36.902 [2024-07-23 08:36:49.259782] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:37.160 08:36:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:37.160 08:36:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:23:37.160 08:36:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:37.160 08:36:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:37.160 BaseBdev1_malloc 00:23:37.161 08:36:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:37.419 
[2024-07-23 08:36:49.802250] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:37.419 [2024-07-23 08:36:49.802317] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:37.419 [2024-07-23 08:36:49.802345] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034880 00:23:37.419 [2024-07-23 08:36:49.802362] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:37.419 [2024-07-23 08:36:49.804679] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:37.419 [2024-07-23 08:36:49.804713] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:37.419 BaseBdev1 00:23:37.419 08:36:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:37.419 08:36:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:37.677 BaseBdev2_malloc 00:23:37.677 08:36:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:37.936 [2024-07-23 08:36:50.199478] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:37.936 [2024-07-23 08:36:50.199542] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:37.936 [2024-07-23 08:36:50.199565] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035480 00:23:37.936 [2024-07-23 08:36:50.199579] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:37.936 [2024-07-23 08:36:50.201542] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:37.936 [2024-07-23 08:36:50.201571] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:37.936 BaseBdev2 00:23:37.936 08:36:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:37.936 08:36:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:23:37.936 BaseBdev3_malloc 00:23:37.936 08:36:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:23:38.194 [2024-07-23 08:36:50.580925] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:23:38.194 [2024-07-23 08:36:50.580975] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:38.195 [2024-07-23 08:36:50.581010] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036080 00:23:38.195 [2024-07-23 08:36:50.581022] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:38.195 [2024-07-23 08:36:50.582899] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:38.195 [2024-07-23 08:36:50.582926] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:23:38.195 BaseBdev3 00:23:38.195 08:36:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:38.195 08:36:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:23:38.452 BaseBdev4_malloc 00:23:38.452 08:36:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:23:38.710 [2024-07-23 08:36:50.981275] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:23:38.710 [2024-07-23 08:36:50.981323] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:38.710 [2024-07-23 08:36:50.981345] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036c80 00:23:38.710 [2024-07-23 08:36:50.981357] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:38.710 [2024-07-23 08:36:50.983212] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:38.710 [2024-07-23 08:36:50.983239] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:23:38.710 BaseBdev4 00:23:38.710 08:36:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:23:38.710 spare_malloc 00:23:38.968 08:36:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:38.968 spare_delay 00:23:38.968 08:36:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:39.228 [2024-07-23 08:36:51.541310] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:39.228 [2024-07-23 08:36:51.541383] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:39.228 [2024-07-23 08:36:51.541408] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000037e80 00:23:39.228 [2024-07-23 08:36:51.541420] vbdev_passthru.c: 
696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:39.228 [2024-07-23 08:36:51.543475] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:39.228 [2024-07-23 08:36:51.543509] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:39.228 spare 00:23:39.228 08:36:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:23:39.228 [2024-07-23 08:36:51.701807] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:39.228 [2024-07-23 08:36:51.703457] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:39.228 [2024-07-23 08:36:51.703516] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:39.228 [2024-07-23 08:36:51.703567] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:39.228 [2024-07-23 08:36:51.703791] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000038480 00:23:39.228 [2024-07-23 08:36:51.703808] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:39.228 [2024-07-23 08:36:51.704083] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c060 00:23:39.228 [2024-07-23 08:36:51.704299] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000038480 00:23:39.228 [2024-07-23 08:36:51.704310] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000038480 00:23:39.228 [2024-07-23 08:36:51.704480] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:39.228 08:36:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:23:39.228 
08:36:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:39.228 08:36:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:39.228 08:36:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:39.228 08:36:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:39.228 08:36:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:39.228 08:36:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:39.228 08:36:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:39.228 08:36:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:39.228 08:36:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:39.228 08:36:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:39.228 08:36:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:39.487 08:36:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:39.487 "name": "raid_bdev1", 00:23:39.487 "uuid": "28145a08-8e51-4363-85c6-42db38ffa799", 00:23:39.487 "strip_size_kb": 0, 00:23:39.487 "state": "online", 00:23:39.487 "raid_level": "raid1", 00:23:39.487 "superblock": true, 00:23:39.487 "num_base_bdevs": 4, 00:23:39.487 "num_base_bdevs_discovered": 4, 00:23:39.487 "num_base_bdevs_operational": 4, 00:23:39.487 "base_bdevs_list": [ 00:23:39.487 { 00:23:39.487 "name": "BaseBdev1", 00:23:39.487 "uuid": "d46ac624-810d-5a5d-96e7-03a0dbcc0897", 00:23:39.487 "is_configured": true, 00:23:39.487 "data_offset": 2048, 00:23:39.487 "data_size": 63488 00:23:39.487 }, 
00:23:39.487 { 00:23:39.487 "name": "BaseBdev2", 00:23:39.487 "uuid": "900e56f0-891a-5341-a6e7-3ac3fb08c227", 00:23:39.487 "is_configured": true, 00:23:39.487 "data_offset": 2048, 00:23:39.487 "data_size": 63488 00:23:39.487 }, 00:23:39.487 { 00:23:39.487 "name": "BaseBdev3", 00:23:39.487 "uuid": "05df138b-24cf-5798-91e1-c31718c794f5", 00:23:39.487 "is_configured": true, 00:23:39.487 "data_offset": 2048, 00:23:39.487 "data_size": 63488 00:23:39.487 }, 00:23:39.487 { 00:23:39.487 "name": "BaseBdev4", 00:23:39.487 "uuid": "a8efae45-04fc-5cac-84eb-b46eaa858b8c", 00:23:39.487 "is_configured": true, 00:23:39.487 "data_offset": 2048, 00:23:39.487 "data_size": 63488 00:23:39.487 } 00:23:39.487 ] 00:23:39.487 }' 00:23:39.487 08:36:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:39.487 08:36:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:40.053 08:36:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:40.053 08:36:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:23:40.053 [2024-07-23 08:36:52.564462] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:40.312 08:36:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:23:40.312 08:36:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:40.312 08:36:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:40.312 08:36:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:23:40.312 08:36:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:23:40.312 08:36:52 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:23:40.312 08:36:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:23:40.312 08:36:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:23:40.312 08:36:52 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:40.312 08:36:52 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:23:40.312 08:36:52 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:40.312 08:36:52 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:40.312 08:36:52 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:40.312 08:36:52 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:23:40.312 08:36:52 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:40.312 08:36:52 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:40.312 08:36:52 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:23:40.603 [2024-07-23 08:36:52.913085] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c200 00:23:40.603 /dev/nbd0 00:23:40.603 08:36:52 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:40.603 08:36:52 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:40.603 08:36:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:23:40.603 08:36:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:23:40.603 08:36:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 
-- # (( i = 1 )) 00:23:40.603 08:36:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:40.603 08:36:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:40.603 08:36:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:23:40.603 08:36:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:40.603 08:36:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:40.603 08:36:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:40.603 1+0 records in 00:23:40.603 1+0 records out 00:23:40.603 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000182026 s, 22.5 MB/s 00:23:40.604 08:36:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:40.604 08:36:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:23:40.604 08:36:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:40.604 08:36:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:40.604 08:36:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:23:40.604 08:36:52 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:40.604 08:36:52 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:40.604 08:36:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:23:40.604 08:36:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:23:40.604 08:36:52 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:23:45.899 63488+0 records in 00:23:45.899 63488+0 records out 00:23:45.899 32505856 bytes (33 MB, 31 MiB) copied, 5.01015 s, 6.5 MB/s 00:23:45.899 08:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:23:45.899 08:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:45.899 08:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:23:45.899 08:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:45.899 08:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:23:45.899 08:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:45.899 08:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:45.899 08:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:45.899 [2024-07-23 08:36:58.196438] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:45.899 08:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:45.899 08:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:45.899 08:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:45.899 08:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:45.899 08:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:45.899 08:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:23:45.899 08:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 
00:23:45.899 08:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:23:45.899 [2024-07-23 08:36:58.356976] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:45.899 08:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:45.899 08:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:45.899 08:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:45.899 08:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:45.899 08:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:45.899 08:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:45.899 08:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:45.899 08:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:45.899 08:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:45.899 08:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:45.899 08:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:45.899 08:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:46.158 08:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:46.158 "name": "raid_bdev1", 00:23:46.158 "uuid": "28145a08-8e51-4363-85c6-42db38ffa799", 00:23:46.158 "strip_size_kb": 0, 00:23:46.158 "state": 
"online", 00:23:46.158 "raid_level": "raid1", 00:23:46.158 "superblock": true, 00:23:46.158 "num_base_bdevs": 4, 00:23:46.158 "num_base_bdevs_discovered": 3, 00:23:46.158 "num_base_bdevs_operational": 3, 00:23:46.158 "base_bdevs_list": [ 00:23:46.158 { 00:23:46.158 "name": null, 00:23:46.158 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:46.158 "is_configured": false, 00:23:46.158 "data_offset": 2048, 00:23:46.158 "data_size": 63488 00:23:46.158 }, 00:23:46.158 { 00:23:46.158 "name": "BaseBdev2", 00:23:46.158 "uuid": "900e56f0-891a-5341-a6e7-3ac3fb08c227", 00:23:46.158 "is_configured": true, 00:23:46.158 "data_offset": 2048, 00:23:46.158 "data_size": 63488 00:23:46.158 }, 00:23:46.158 { 00:23:46.158 "name": "BaseBdev3", 00:23:46.158 "uuid": "05df138b-24cf-5798-91e1-c31718c794f5", 00:23:46.158 "is_configured": true, 00:23:46.158 "data_offset": 2048, 00:23:46.158 "data_size": 63488 00:23:46.158 }, 00:23:46.158 { 00:23:46.158 "name": "BaseBdev4", 00:23:46.158 "uuid": "a8efae45-04fc-5cac-84eb-b46eaa858b8c", 00:23:46.158 "is_configured": true, 00:23:46.158 "data_offset": 2048, 00:23:46.158 "data_size": 63488 00:23:46.158 } 00:23:46.158 ] 00:23:46.158 }' 00:23:46.158 08:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:46.158 08:36:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:46.726 08:36:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:46.726 [2024-07-23 08:36:59.207236] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:46.726 [2024-07-23 08:36:59.225636] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000ca95b0 00:23:46.726 [2024-07-23 08:36:59.227345] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:46.726 08:36:59 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:23:48.102 08:37:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:48.102 08:37:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:48.102 08:37:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:48.102 08:37:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:48.102 08:37:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:48.102 08:37:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:48.102 08:37:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:48.102 08:37:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:48.102 "name": "raid_bdev1", 00:23:48.102 "uuid": "28145a08-8e51-4363-85c6-42db38ffa799", 00:23:48.102 "strip_size_kb": 0, 00:23:48.102 "state": "online", 00:23:48.102 "raid_level": "raid1", 00:23:48.102 "superblock": true, 00:23:48.102 "num_base_bdevs": 4, 00:23:48.102 "num_base_bdevs_discovered": 4, 00:23:48.102 "num_base_bdevs_operational": 4, 00:23:48.102 "process": { 00:23:48.102 "type": "rebuild", 00:23:48.102 "target": "spare", 00:23:48.102 "progress": { 00:23:48.102 "blocks": 22528, 00:23:48.102 "percent": 35 00:23:48.102 } 00:23:48.102 }, 00:23:48.102 "base_bdevs_list": [ 00:23:48.102 { 00:23:48.102 "name": "spare", 00:23:48.102 "uuid": "66c51065-3a13-5cf6-9ffa-6c99c5a6a5f0", 00:23:48.102 "is_configured": true, 00:23:48.102 "data_offset": 2048, 00:23:48.102 "data_size": 63488 00:23:48.102 }, 00:23:48.102 { 00:23:48.102 "name": "BaseBdev2", 00:23:48.102 "uuid": "900e56f0-891a-5341-a6e7-3ac3fb08c227", 00:23:48.102 
"is_configured": true, 00:23:48.102 "data_offset": 2048, 00:23:48.102 "data_size": 63488 00:23:48.102 }, 00:23:48.102 { 00:23:48.102 "name": "BaseBdev3", 00:23:48.102 "uuid": "05df138b-24cf-5798-91e1-c31718c794f5", 00:23:48.102 "is_configured": true, 00:23:48.102 "data_offset": 2048, 00:23:48.102 "data_size": 63488 00:23:48.102 }, 00:23:48.102 { 00:23:48.102 "name": "BaseBdev4", 00:23:48.102 "uuid": "a8efae45-04fc-5cac-84eb-b46eaa858b8c", 00:23:48.102 "is_configured": true, 00:23:48.102 "data_offset": 2048, 00:23:48.102 "data_size": 63488 00:23:48.102 } 00:23:48.102 ] 00:23:48.102 }' 00:23:48.102 08:37:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:48.102 08:37:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:48.102 08:37:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:48.102 08:37:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:48.102 08:37:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:48.361 [2024-07-23 08:37:00.649102] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:48.361 [2024-07-23 08:37:00.739348] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:48.361 [2024-07-23 08:37:00.739403] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:48.361 [2024-07-23 08:37:00.739437] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:48.361 [2024-07-23 08:37:00.739447] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:48.361 08:37:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 
online raid1 0 3 00:23:48.361 08:37:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:48.361 08:37:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:48.361 08:37:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:48.361 08:37:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:48.361 08:37:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:48.361 08:37:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:48.361 08:37:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:48.361 08:37:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:48.361 08:37:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:48.361 08:37:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:48.361 08:37:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:48.619 08:37:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:48.619 "name": "raid_bdev1", 00:23:48.619 "uuid": "28145a08-8e51-4363-85c6-42db38ffa799", 00:23:48.619 "strip_size_kb": 0, 00:23:48.619 "state": "online", 00:23:48.619 "raid_level": "raid1", 00:23:48.619 "superblock": true, 00:23:48.619 "num_base_bdevs": 4, 00:23:48.619 "num_base_bdevs_discovered": 3, 00:23:48.619 "num_base_bdevs_operational": 3, 00:23:48.619 "base_bdevs_list": [ 00:23:48.619 { 00:23:48.619 "name": null, 00:23:48.619 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:48.619 "is_configured": false, 00:23:48.619 "data_offset": 2048, 00:23:48.619 "data_size": 
63488 00:23:48.619 }, 00:23:48.619 { 00:23:48.619 "name": "BaseBdev2", 00:23:48.619 "uuid": "900e56f0-891a-5341-a6e7-3ac3fb08c227", 00:23:48.619 "is_configured": true, 00:23:48.619 "data_offset": 2048, 00:23:48.619 "data_size": 63488 00:23:48.619 }, 00:23:48.619 { 00:23:48.619 "name": "BaseBdev3", 00:23:48.619 "uuid": "05df138b-24cf-5798-91e1-c31718c794f5", 00:23:48.619 "is_configured": true, 00:23:48.619 "data_offset": 2048, 00:23:48.619 "data_size": 63488 00:23:48.619 }, 00:23:48.619 { 00:23:48.619 "name": "BaseBdev4", 00:23:48.619 "uuid": "a8efae45-04fc-5cac-84eb-b46eaa858b8c", 00:23:48.619 "is_configured": true, 00:23:48.619 "data_offset": 2048, 00:23:48.619 "data_size": 63488 00:23:48.619 } 00:23:48.619 ] 00:23:48.619 }' 00:23:48.619 08:37:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:48.619 08:37:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:49.187 08:37:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:49.187 08:37:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:49.187 08:37:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:49.187 08:37:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:49.187 08:37:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:49.187 08:37:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:49.187 08:37:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:49.187 08:37:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:49.187 "name": "raid_bdev1", 00:23:49.187 "uuid": 
"28145a08-8e51-4363-85c6-42db38ffa799", 00:23:49.187 "strip_size_kb": 0, 00:23:49.187 "state": "online", 00:23:49.187 "raid_level": "raid1", 00:23:49.187 "superblock": true, 00:23:49.187 "num_base_bdevs": 4, 00:23:49.187 "num_base_bdevs_discovered": 3, 00:23:49.187 "num_base_bdevs_operational": 3, 00:23:49.187 "base_bdevs_list": [ 00:23:49.187 { 00:23:49.187 "name": null, 00:23:49.187 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:49.187 "is_configured": false, 00:23:49.187 "data_offset": 2048, 00:23:49.187 "data_size": 63488 00:23:49.187 }, 00:23:49.187 { 00:23:49.187 "name": "BaseBdev2", 00:23:49.187 "uuid": "900e56f0-891a-5341-a6e7-3ac3fb08c227", 00:23:49.187 "is_configured": true, 00:23:49.187 "data_offset": 2048, 00:23:49.187 "data_size": 63488 00:23:49.187 }, 00:23:49.187 { 00:23:49.187 "name": "BaseBdev3", 00:23:49.187 "uuid": "05df138b-24cf-5798-91e1-c31718c794f5", 00:23:49.187 "is_configured": true, 00:23:49.187 "data_offset": 2048, 00:23:49.187 "data_size": 63488 00:23:49.187 }, 00:23:49.187 { 00:23:49.187 "name": "BaseBdev4", 00:23:49.187 "uuid": "a8efae45-04fc-5cac-84eb-b46eaa858b8c", 00:23:49.187 "is_configured": true, 00:23:49.187 "data_offset": 2048, 00:23:49.187 "data_size": 63488 00:23:49.187 } 00:23:49.187 ] 00:23:49.187 }' 00:23:49.187 08:37:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:49.187 08:37:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:49.187 08:37:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:49.187 08:37:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:49.187 08:37:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:49.446 [2024-07-23 08:37:01.857268] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:49.446 [2024-07-23 08:37:01.873896] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000ca9680 00:23:49.446 [2024-07-23 08:37:01.875567] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:49.446 08:37:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:23:50.379 08:37:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:50.379 08:37:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:50.379 08:37:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:50.379 08:37:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:50.379 08:37:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:50.379 08:37:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:50.379 08:37:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:50.638 08:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:50.638 "name": "raid_bdev1", 00:23:50.638 "uuid": "28145a08-8e51-4363-85c6-42db38ffa799", 00:23:50.638 "strip_size_kb": 0, 00:23:50.638 "state": "online", 00:23:50.638 "raid_level": "raid1", 00:23:50.638 "superblock": true, 00:23:50.638 "num_base_bdevs": 4, 00:23:50.638 "num_base_bdevs_discovered": 4, 00:23:50.638 "num_base_bdevs_operational": 4, 00:23:50.638 "process": { 00:23:50.638 "type": "rebuild", 00:23:50.638 "target": "spare", 00:23:50.638 "progress": { 00:23:50.638 "blocks": 22528, 00:23:50.638 "percent": 35 00:23:50.638 } 00:23:50.638 }, 00:23:50.638 
"base_bdevs_list": [ 00:23:50.638 { 00:23:50.638 "name": "spare", 00:23:50.638 "uuid": "66c51065-3a13-5cf6-9ffa-6c99c5a6a5f0", 00:23:50.638 "is_configured": true, 00:23:50.638 "data_offset": 2048, 00:23:50.638 "data_size": 63488 00:23:50.638 }, 00:23:50.638 { 00:23:50.638 "name": "BaseBdev2", 00:23:50.638 "uuid": "900e56f0-891a-5341-a6e7-3ac3fb08c227", 00:23:50.638 "is_configured": true, 00:23:50.638 "data_offset": 2048, 00:23:50.638 "data_size": 63488 00:23:50.638 }, 00:23:50.638 { 00:23:50.638 "name": "BaseBdev3", 00:23:50.638 "uuid": "05df138b-24cf-5798-91e1-c31718c794f5", 00:23:50.638 "is_configured": true, 00:23:50.638 "data_offset": 2048, 00:23:50.638 "data_size": 63488 00:23:50.638 }, 00:23:50.638 { 00:23:50.638 "name": "BaseBdev4", 00:23:50.638 "uuid": "a8efae45-04fc-5cac-84eb-b46eaa858b8c", 00:23:50.638 "is_configured": true, 00:23:50.638 "data_offset": 2048, 00:23:50.638 "data_size": 63488 00:23:50.638 } 00:23:50.638 ] 00:23:50.638 }' 00:23:50.638 08:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:50.638 08:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:50.638 08:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:50.638 08:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:50.638 08:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:23:50.638 08:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:23:50.638 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:23:50.638 08:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:23:50.638 08:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:23:50.638 08:37:03 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:23:50.638 08:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:23:50.896 [2024-07-23 08:37:03.309876] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:23:51.154 [2024-07-23 08:37:03.487859] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x60d000ca9680 00:23:51.154 08:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:23:51.154 08:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:23:51.154 08:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:51.154 08:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:51.154 08:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:51.154 08:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:51.154 08:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:51.154 08:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:51.154 08:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:51.413 08:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:51.413 "name": "raid_bdev1", 00:23:51.413 "uuid": "28145a08-8e51-4363-85c6-42db38ffa799", 00:23:51.413 "strip_size_kb": 0, 00:23:51.413 "state": "online", 00:23:51.413 "raid_level": "raid1", 00:23:51.413 "superblock": true, 00:23:51.413 "num_base_bdevs": 4, 
00:23:51.413 "num_base_bdevs_discovered": 3, 00:23:51.413 "num_base_bdevs_operational": 3, 00:23:51.413 "process": { 00:23:51.413 "type": "rebuild", 00:23:51.413 "target": "spare", 00:23:51.413 "progress": { 00:23:51.413 "blocks": 34816, 00:23:51.413 "percent": 54 00:23:51.413 } 00:23:51.413 }, 00:23:51.413 "base_bdevs_list": [ 00:23:51.413 { 00:23:51.413 "name": "spare", 00:23:51.413 "uuid": "66c51065-3a13-5cf6-9ffa-6c99c5a6a5f0", 00:23:51.413 "is_configured": true, 00:23:51.413 "data_offset": 2048, 00:23:51.413 "data_size": 63488 00:23:51.413 }, 00:23:51.413 { 00:23:51.413 "name": null, 00:23:51.413 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:51.413 "is_configured": false, 00:23:51.413 "data_offset": 2048, 00:23:51.413 "data_size": 63488 00:23:51.413 }, 00:23:51.413 { 00:23:51.413 "name": "BaseBdev3", 00:23:51.413 "uuid": "05df138b-24cf-5798-91e1-c31718c794f5", 00:23:51.413 "is_configured": true, 00:23:51.413 "data_offset": 2048, 00:23:51.413 "data_size": 63488 00:23:51.413 }, 00:23:51.413 { 00:23:51.413 "name": "BaseBdev4", 00:23:51.413 "uuid": "a8efae45-04fc-5cac-84eb-b46eaa858b8c", 00:23:51.413 "is_configured": true, 00:23:51.413 "data_offset": 2048, 00:23:51.413 "data_size": 63488 00:23:51.413 } 00:23:51.413 ] 00:23:51.413 }' 00:23:51.413 08:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:51.413 08:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:51.413 08:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:51.413 08:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:51.413 08:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=776 00:23:51.413 08:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:51.413 08:37:03 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:51.413 08:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:51.413 08:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:51.414 08:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:51.414 08:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:51.414 08:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:51.414 08:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:51.672 08:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:51.672 "name": "raid_bdev1", 00:23:51.672 "uuid": "28145a08-8e51-4363-85c6-42db38ffa799", 00:23:51.672 "strip_size_kb": 0, 00:23:51.672 "state": "online", 00:23:51.672 "raid_level": "raid1", 00:23:51.672 "superblock": true, 00:23:51.672 "num_base_bdevs": 4, 00:23:51.672 "num_base_bdevs_discovered": 3, 00:23:51.672 "num_base_bdevs_operational": 3, 00:23:51.672 "process": { 00:23:51.672 "type": "rebuild", 00:23:51.672 "target": "spare", 00:23:51.672 "progress": { 00:23:51.672 "blocks": 38912, 00:23:51.673 "percent": 61 00:23:51.673 } 00:23:51.673 }, 00:23:51.673 "base_bdevs_list": [ 00:23:51.673 { 00:23:51.673 "name": "spare", 00:23:51.673 "uuid": "66c51065-3a13-5cf6-9ffa-6c99c5a6a5f0", 00:23:51.673 "is_configured": true, 00:23:51.673 "data_offset": 2048, 00:23:51.673 "data_size": 63488 00:23:51.673 }, 00:23:51.673 { 00:23:51.673 "name": null, 00:23:51.673 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:51.673 "is_configured": false, 00:23:51.673 "data_offset": 2048, 00:23:51.673 "data_size": 63488 00:23:51.673 }, 00:23:51.673 { 
00:23:51.673 "name": "BaseBdev3", 00:23:51.673 "uuid": "05df138b-24cf-5798-91e1-c31718c794f5", 00:23:51.673 "is_configured": true, 00:23:51.673 "data_offset": 2048, 00:23:51.673 "data_size": 63488 00:23:51.673 }, 00:23:51.673 { 00:23:51.673 "name": "BaseBdev4", 00:23:51.673 "uuid": "a8efae45-04fc-5cac-84eb-b46eaa858b8c", 00:23:51.673 "is_configured": true, 00:23:51.673 "data_offset": 2048, 00:23:51.673 "data_size": 63488 00:23:51.673 } 00:23:51.673 ] 00:23:51.673 }' 00:23:51.673 08:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:51.673 08:37:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:51.673 08:37:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:51.673 08:37:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:51.673 08:37:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:52.608 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:52.608 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:52.608 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:52.608 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:52.608 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:52.608 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:52.608 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:52.608 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:23:52.608 [2024-07-23 08:37:05.100541] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:52.608 [2024-07-23 08:37:05.100615] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:52.608 [2024-07-23 08:37:05.100720] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:52.867 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:52.867 "name": "raid_bdev1", 00:23:52.867 "uuid": "28145a08-8e51-4363-85c6-42db38ffa799", 00:23:52.867 "strip_size_kb": 0, 00:23:52.867 "state": "online", 00:23:52.867 "raid_level": "raid1", 00:23:52.867 "superblock": true, 00:23:52.867 "num_base_bdevs": 4, 00:23:52.867 "num_base_bdevs_discovered": 3, 00:23:52.867 "num_base_bdevs_operational": 3, 00:23:52.867 "base_bdevs_list": [ 00:23:52.867 { 00:23:52.867 "name": "spare", 00:23:52.867 "uuid": "66c51065-3a13-5cf6-9ffa-6c99c5a6a5f0", 00:23:52.867 "is_configured": true, 00:23:52.867 "data_offset": 2048, 00:23:52.867 "data_size": 63488 00:23:52.867 }, 00:23:52.867 { 00:23:52.867 "name": null, 00:23:52.867 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:52.867 "is_configured": false, 00:23:52.867 "data_offset": 2048, 00:23:52.867 "data_size": 63488 00:23:52.867 }, 00:23:52.867 { 00:23:52.867 "name": "BaseBdev3", 00:23:52.867 "uuid": "05df138b-24cf-5798-91e1-c31718c794f5", 00:23:52.867 "is_configured": true, 00:23:52.867 "data_offset": 2048, 00:23:52.867 "data_size": 63488 00:23:52.867 }, 00:23:52.867 { 00:23:52.867 "name": "BaseBdev4", 00:23:52.867 "uuid": "a8efae45-04fc-5cac-84eb-b46eaa858b8c", 00:23:52.867 "is_configured": true, 00:23:52.867 "data_offset": 2048, 00:23:52.867 "data_size": 63488 00:23:52.867 } 00:23:52.867 ] 00:23:52.867 }' 00:23:52.867 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:52.867 08:37:05 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:52.867 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:52.867 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:23:52.867 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:23:52.867 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:52.867 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:52.867 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:52.867 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:52.867 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:52.867 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:52.867 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:53.126 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:53.126 "name": "raid_bdev1", 00:23:53.126 "uuid": "28145a08-8e51-4363-85c6-42db38ffa799", 00:23:53.126 "strip_size_kb": 0, 00:23:53.126 "state": "online", 00:23:53.126 "raid_level": "raid1", 00:23:53.126 "superblock": true, 00:23:53.126 "num_base_bdevs": 4, 00:23:53.126 "num_base_bdevs_discovered": 3, 00:23:53.126 "num_base_bdevs_operational": 3, 00:23:53.126 "base_bdevs_list": [ 00:23:53.126 { 00:23:53.126 "name": "spare", 00:23:53.126 "uuid": "66c51065-3a13-5cf6-9ffa-6c99c5a6a5f0", 00:23:53.126 "is_configured": true, 00:23:53.126 "data_offset": 2048, 00:23:53.126 "data_size": 63488 00:23:53.126 }, 00:23:53.126 { 00:23:53.126 "name": 
null, 00:23:53.126 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:53.126 "is_configured": false, 00:23:53.126 "data_offset": 2048, 00:23:53.126 "data_size": 63488 00:23:53.126 }, 00:23:53.126 { 00:23:53.126 "name": "BaseBdev3", 00:23:53.126 "uuid": "05df138b-24cf-5798-91e1-c31718c794f5", 00:23:53.126 "is_configured": true, 00:23:53.126 "data_offset": 2048, 00:23:53.126 "data_size": 63488 00:23:53.126 }, 00:23:53.126 { 00:23:53.126 "name": "BaseBdev4", 00:23:53.126 "uuid": "a8efae45-04fc-5cac-84eb-b46eaa858b8c", 00:23:53.126 "is_configured": true, 00:23:53.126 "data_offset": 2048, 00:23:53.126 "data_size": 63488 00:23:53.126 } 00:23:53.126 ] 00:23:53.126 }' 00:23:53.126 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:53.126 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:53.126 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:53.126 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:53.126 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:53.126 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:53.126 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:53.126 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:53.126 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:53.127 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:53.127 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:53.127 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:23:53.127 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:53.127 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:53.127 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:53.127 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:53.386 08:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:53.386 "name": "raid_bdev1", 00:23:53.386 "uuid": "28145a08-8e51-4363-85c6-42db38ffa799", 00:23:53.386 "strip_size_kb": 0, 00:23:53.386 "state": "online", 00:23:53.386 "raid_level": "raid1", 00:23:53.386 "superblock": true, 00:23:53.386 "num_base_bdevs": 4, 00:23:53.386 "num_base_bdevs_discovered": 3, 00:23:53.386 "num_base_bdevs_operational": 3, 00:23:53.386 "base_bdevs_list": [ 00:23:53.386 { 00:23:53.386 "name": "spare", 00:23:53.386 "uuid": "66c51065-3a13-5cf6-9ffa-6c99c5a6a5f0", 00:23:53.386 "is_configured": true, 00:23:53.386 "data_offset": 2048, 00:23:53.386 "data_size": 63488 00:23:53.386 }, 00:23:53.386 { 00:23:53.386 "name": null, 00:23:53.386 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:53.386 "is_configured": false, 00:23:53.386 "data_offset": 2048, 00:23:53.386 "data_size": 63488 00:23:53.386 }, 00:23:53.386 { 00:23:53.386 "name": "BaseBdev3", 00:23:53.386 "uuid": "05df138b-24cf-5798-91e1-c31718c794f5", 00:23:53.386 "is_configured": true, 00:23:53.386 "data_offset": 2048, 00:23:53.386 "data_size": 63488 00:23:53.386 }, 00:23:53.386 { 00:23:53.386 "name": "BaseBdev4", 00:23:53.386 "uuid": "a8efae45-04fc-5cac-84eb-b46eaa858b8c", 00:23:53.386 "is_configured": true, 00:23:53.386 "data_offset": 2048, 00:23:53.386 "data_size": 63488 00:23:53.386 } 00:23:53.386 ] 00:23:53.386 }' 00:23:53.386 08:37:05 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:53.386 08:37:05 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:53.953 08:37:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:53.953 [2024-07-23 08:37:06.327007] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:53.953 [2024-07-23 08:37:06.327049] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:53.953 [2024-07-23 08:37:06.327137] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:53.953 [2024-07-23 08:37:06.327226] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:53.953 [2024-07-23 08:37:06.327243] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000038480 name raid_bdev1, state offline 00:23:53.953 08:37:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:53.953 08:37:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:23:54.212 08:37:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:23:54.212 08:37:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:23:54.212 08:37:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:23:54.212 08:37:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:23:54.212 08:37:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:54.212 08:37:06 bdev_raid.raid_rebuild_test_sb -- 
bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:23:54.212 08:37:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:54.212 08:37:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:54.212 08:37:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:54.212 08:37:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:23:54.212 08:37:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:54.212 08:37:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:54.213 08:37:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:23:54.472 /dev/nbd0 00:23:54.472 08:37:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:54.472 08:37:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:54.472 08:37:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:23:54.472 08:37:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:23:54.472 08:37:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:54.472 08:37:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:54.472 08:37:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:54.472 08:37:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:23:54.472 08:37:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:54.472 08:37:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:54.472 08:37:06 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:54.472 1+0 records in 00:23:54.472 1+0 records out 00:23:54.472 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000158635 s, 25.8 MB/s 00:23:54.472 08:37:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:54.472 08:37:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:23:54.472 08:37:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:54.472 08:37:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:54.472 08:37:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:23:54.472 08:37:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:54.472 08:37:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:54.472 08:37:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:23:54.472 /dev/nbd1 00:23:54.472 08:37:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:54.472 08:37:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:54.472 08:37:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:23:54.472 08:37:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:23:54.472 08:37:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:54.472 08:37:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:54.472 08:37:06 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:23:54.472 08:37:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:23:54.472 08:37:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:54.472 08:37:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:54.472 08:37:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:54.472 1+0 records in 00:23:54.472 1+0 records out 00:23:54.472 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000253951 s, 16.1 MB/s 00:23:54.472 08:37:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:54.472 08:37:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:23:54.472 08:37:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:54.472 08:37:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:54.472 08:37:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:23:54.472 08:37:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:54.472 08:37:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:54.472 08:37:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:23:54.731 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:23:54.731 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:54.731 08:37:07 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:54.731 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:54.731 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:23:54.731 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:54.731 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:54.989 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:54.989 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:54.989 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:54.989 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:54.989 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:54.989 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:54.989 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:23:54.989 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:23:54.989 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:54.989 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:55.248 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:55.248 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:55.248 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local 
nbd_name=nbd1 00:23:55.248 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:55.248 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:55.248 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:55.248 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:23:55.248 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:23:55.248 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:23:55.248 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:55.248 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:55.506 [2024-07-23 08:37:07.858546] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:55.506 [2024-07-23 08:37:07.858606] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:55.506 [2024-07-23 08:37:07.858636] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000039c80 00:23:55.506 [2024-07-23 08:37:07.858662] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:55.506 [2024-07-23 08:37:07.860698] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:55.506 [2024-07-23 08:37:07.860727] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:55.506 [2024-07-23 08:37:07.860824] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:55.506 [2024-07-23 08:37:07.860870] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is 
claimed 00:23:55.506 [2024-07-23 08:37:07.861039] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:55.506 [2024-07-23 08:37:07.861126] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:55.506 spare 00:23:55.506 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:55.506 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:55.506 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:55.506 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:55.506 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:55.506 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:55.506 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:55.506 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:55.506 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:55.506 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:55.506 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:55.506 08:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:55.506 [2024-07-23 08:37:07.961455] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x61600003a280 00:23:55.506 [2024-07-23 08:37:07.961480] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:55.506 [2024-07-23 08:37:07.961733] 
bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000cc7d30 00:23:55.506 [2024-07-23 08:37:07.961932] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x61600003a280 00:23:55.506 [2024-07-23 08:37:07.961947] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x61600003a280 00:23:55.506 [2024-07-23 08:37:07.962093] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:55.763 08:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:55.763 "name": "raid_bdev1", 00:23:55.763 "uuid": "28145a08-8e51-4363-85c6-42db38ffa799", 00:23:55.763 "strip_size_kb": 0, 00:23:55.763 "state": "online", 00:23:55.763 "raid_level": "raid1", 00:23:55.763 "superblock": true, 00:23:55.763 "num_base_bdevs": 4, 00:23:55.763 "num_base_bdevs_discovered": 3, 00:23:55.763 "num_base_bdevs_operational": 3, 00:23:55.763 "base_bdevs_list": [ 00:23:55.763 { 00:23:55.763 "name": "spare", 00:23:55.763 "uuid": "66c51065-3a13-5cf6-9ffa-6c99c5a6a5f0", 00:23:55.763 "is_configured": true, 00:23:55.763 "data_offset": 2048, 00:23:55.763 "data_size": 63488 00:23:55.763 }, 00:23:55.763 { 00:23:55.763 "name": null, 00:23:55.763 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:55.763 "is_configured": false, 00:23:55.763 "data_offset": 2048, 00:23:55.763 "data_size": 63488 00:23:55.763 }, 00:23:55.763 { 00:23:55.763 "name": "BaseBdev3", 00:23:55.763 "uuid": "05df138b-24cf-5798-91e1-c31718c794f5", 00:23:55.763 "is_configured": true, 00:23:55.763 "data_offset": 2048, 00:23:55.763 "data_size": 63488 00:23:55.763 }, 00:23:55.763 { 00:23:55.763 "name": "BaseBdev4", 00:23:55.763 "uuid": "a8efae45-04fc-5cac-84eb-b46eaa858b8c", 00:23:55.763 "is_configured": true, 00:23:55.763 "data_offset": 2048, 00:23:55.763 "data_size": 63488 00:23:55.763 } 00:23:55.763 ] 00:23:55.763 }' 00:23:55.763 08:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:23:55.763 08:37:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:56.021 08:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:56.021 08:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:56.021 08:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:56.280 08:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:56.280 08:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:56.280 08:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:56.280 08:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:56.280 08:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:56.280 "name": "raid_bdev1", 00:23:56.280 "uuid": "28145a08-8e51-4363-85c6-42db38ffa799", 00:23:56.280 "strip_size_kb": 0, 00:23:56.280 "state": "online", 00:23:56.280 "raid_level": "raid1", 00:23:56.280 "superblock": true, 00:23:56.280 "num_base_bdevs": 4, 00:23:56.280 "num_base_bdevs_discovered": 3, 00:23:56.280 "num_base_bdevs_operational": 3, 00:23:56.280 "base_bdevs_list": [ 00:23:56.280 { 00:23:56.280 "name": "spare", 00:23:56.280 "uuid": "66c51065-3a13-5cf6-9ffa-6c99c5a6a5f0", 00:23:56.280 "is_configured": true, 00:23:56.280 "data_offset": 2048, 00:23:56.280 "data_size": 63488 00:23:56.280 }, 00:23:56.280 { 00:23:56.280 "name": null, 00:23:56.280 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:56.280 "is_configured": false, 00:23:56.280 "data_offset": 2048, 00:23:56.280 "data_size": 63488 00:23:56.280 }, 00:23:56.280 { 00:23:56.280 "name": "BaseBdev3", 00:23:56.280 "uuid": 
"05df138b-24cf-5798-91e1-c31718c794f5", 00:23:56.280 "is_configured": true, 00:23:56.280 "data_offset": 2048, 00:23:56.280 "data_size": 63488 00:23:56.280 }, 00:23:56.280 { 00:23:56.280 "name": "BaseBdev4", 00:23:56.280 "uuid": "a8efae45-04fc-5cac-84eb-b46eaa858b8c", 00:23:56.280 "is_configured": true, 00:23:56.280 "data_offset": 2048, 00:23:56.280 "data_size": 63488 00:23:56.280 } 00:23:56.280 ] 00:23:56.280 }' 00:23:56.280 08:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:56.280 08:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:56.280 08:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:56.280 08:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:56.280 08:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:56.280 08:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:23:56.539 08:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:23:56.540 08:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:56.799 [2024-07-23 08:37:09.106149] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:56.799 08:37:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:56.799 08:37:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:56.799 08:37:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:56.799 08:37:09 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:56.799 08:37:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:56.799 08:37:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:56.799 08:37:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:56.799 08:37:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:56.799 08:37:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:56.799 08:37:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:56.799 08:37:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:56.799 08:37:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:56.799 08:37:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:56.799 "name": "raid_bdev1", 00:23:56.799 "uuid": "28145a08-8e51-4363-85c6-42db38ffa799", 00:23:56.799 "strip_size_kb": 0, 00:23:56.799 "state": "online", 00:23:56.799 "raid_level": "raid1", 00:23:56.799 "superblock": true, 00:23:56.799 "num_base_bdevs": 4, 00:23:56.799 "num_base_bdevs_discovered": 2, 00:23:56.799 "num_base_bdevs_operational": 2, 00:23:56.799 "base_bdevs_list": [ 00:23:56.799 { 00:23:56.799 "name": null, 00:23:56.799 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:56.799 "is_configured": false, 00:23:56.799 "data_offset": 2048, 00:23:56.799 "data_size": 63488 00:23:56.799 }, 00:23:56.799 { 00:23:56.799 "name": null, 00:23:56.799 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:56.799 "is_configured": false, 00:23:56.799 "data_offset": 2048, 00:23:56.799 "data_size": 63488 00:23:56.799 }, 00:23:56.799 { 
00:23:56.799 "name": "BaseBdev3", 00:23:56.799 "uuid": "05df138b-24cf-5798-91e1-c31718c794f5", 00:23:56.799 "is_configured": true, 00:23:56.799 "data_offset": 2048, 00:23:56.799 "data_size": 63488 00:23:56.799 }, 00:23:56.799 { 00:23:56.799 "name": "BaseBdev4", 00:23:56.799 "uuid": "a8efae45-04fc-5cac-84eb-b46eaa858b8c", 00:23:56.799 "is_configured": true, 00:23:56.799 "data_offset": 2048, 00:23:56.799 "data_size": 63488 00:23:56.799 } 00:23:56.799 ] 00:23:56.799 }' 00:23:56.799 08:37:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:56.799 08:37:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:57.368 08:37:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:57.646 [2024-07-23 08:37:09.928308] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:57.646 [2024-07-23 08:37:09.928506] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:23:57.646 [2024-07-23 08:37:09.928522] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:23:57.646 [2024-07-23 08:37:09.928555] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:57.646 [2024-07-23 08:37:09.944027] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000cc7e00 00:23:57.646 [2024-07-23 08:37:09.945599] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:57.646 08:37:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:23:58.616 08:37:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:58.616 08:37:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:58.616 08:37:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:58.616 08:37:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:58.616 08:37:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:58.616 08:37:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:58.616 08:37:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:58.876 08:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:58.876 "name": "raid_bdev1", 00:23:58.876 "uuid": "28145a08-8e51-4363-85c6-42db38ffa799", 00:23:58.876 "strip_size_kb": 0, 00:23:58.876 "state": "online", 00:23:58.876 "raid_level": "raid1", 00:23:58.876 "superblock": true, 00:23:58.876 "num_base_bdevs": 4, 00:23:58.876 "num_base_bdevs_discovered": 3, 00:23:58.876 "num_base_bdevs_operational": 3, 00:23:58.876 "process": { 00:23:58.876 "type": "rebuild", 00:23:58.876 "target": "spare", 00:23:58.876 "progress": { 00:23:58.876 "blocks": 22528, 00:23:58.876 "percent": 35 
00:23:58.876 } 00:23:58.876 }, 00:23:58.876 "base_bdevs_list": [ 00:23:58.876 { 00:23:58.876 "name": "spare", 00:23:58.876 "uuid": "66c51065-3a13-5cf6-9ffa-6c99c5a6a5f0", 00:23:58.876 "is_configured": true, 00:23:58.876 "data_offset": 2048, 00:23:58.876 "data_size": 63488 00:23:58.876 }, 00:23:58.876 { 00:23:58.876 "name": null, 00:23:58.876 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:58.876 "is_configured": false, 00:23:58.876 "data_offset": 2048, 00:23:58.876 "data_size": 63488 00:23:58.876 }, 00:23:58.876 { 00:23:58.876 "name": "BaseBdev3", 00:23:58.876 "uuid": "05df138b-24cf-5798-91e1-c31718c794f5", 00:23:58.876 "is_configured": true, 00:23:58.876 "data_offset": 2048, 00:23:58.876 "data_size": 63488 00:23:58.876 }, 00:23:58.876 { 00:23:58.876 "name": "BaseBdev4", 00:23:58.876 "uuid": "a8efae45-04fc-5cac-84eb-b46eaa858b8c", 00:23:58.876 "is_configured": true, 00:23:58.876 "data_offset": 2048, 00:23:58.876 "data_size": 63488 00:23:58.876 } 00:23:58.876 ] 00:23:58.876 }' 00:23:58.876 08:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:58.876 08:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:58.876 08:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:58.876 08:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:58.876 08:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:58.876 [2024-07-23 08:37:11.363606] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:59.135 [2024-07-23 08:37:11.458098] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:59.135 [2024-07-23 08:37:11.458162] bdev_raid.c: 343:raid_bdev_destroy_cb: 
*DEBUG*: raid_bdev_destroy_cb 00:23:59.135 [2024-07-23 08:37:11.458182] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:59.135 [2024-07-23 08:37:11.458191] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:59.135 08:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:59.135 08:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:59.135 08:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:59.135 08:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:59.135 08:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:59.135 08:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:59.135 08:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:59.135 08:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:59.135 08:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:59.135 08:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:59.135 08:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:59.135 08:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:59.394 08:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:59.394 "name": "raid_bdev1", 00:23:59.394 "uuid": "28145a08-8e51-4363-85c6-42db38ffa799", 00:23:59.394 "strip_size_kb": 0, 00:23:59.394 "state": "online", 00:23:59.394 
"raid_level": "raid1", 00:23:59.394 "superblock": true, 00:23:59.394 "num_base_bdevs": 4, 00:23:59.394 "num_base_bdevs_discovered": 2, 00:23:59.394 "num_base_bdevs_operational": 2, 00:23:59.394 "base_bdevs_list": [ 00:23:59.394 { 00:23:59.394 "name": null, 00:23:59.394 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:59.394 "is_configured": false, 00:23:59.394 "data_offset": 2048, 00:23:59.394 "data_size": 63488 00:23:59.394 }, 00:23:59.394 { 00:23:59.394 "name": null, 00:23:59.394 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:59.394 "is_configured": false, 00:23:59.394 "data_offset": 2048, 00:23:59.394 "data_size": 63488 00:23:59.394 }, 00:23:59.394 { 00:23:59.394 "name": "BaseBdev3", 00:23:59.394 "uuid": "05df138b-24cf-5798-91e1-c31718c794f5", 00:23:59.394 "is_configured": true, 00:23:59.394 "data_offset": 2048, 00:23:59.394 "data_size": 63488 00:23:59.394 }, 00:23:59.394 { 00:23:59.394 "name": "BaseBdev4", 00:23:59.394 "uuid": "a8efae45-04fc-5cac-84eb-b46eaa858b8c", 00:23:59.394 "is_configured": true, 00:23:59.394 "data_offset": 2048, 00:23:59.394 "data_size": 63488 00:23:59.394 } 00:23:59.394 ] 00:23:59.394 }' 00:23:59.394 08:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:59.394 08:37:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:59.963 08:37:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:59.963 [2024-07-23 08:37:12.340175] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:59.963 [2024-07-23 08:37:12.340239] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:59.963 [2024-07-23 08:37:12.340261] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x61600003a880 00:23:59.963 [2024-07-23 08:37:12.340272] vbdev_passthru.c: 
696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:59.963 [2024-07-23 08:37:12.340844] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:59.963 [2024-07-23 08:37:12.340865] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:59.963 [2024-07-23 08:37:12.340961] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:59.963 [2024-07-23 08:37:12.340975] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:23:59.963 [2024-07-23 08:37:12.340988] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:23:59.963 [2024-07-23 08:37:12.341010] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:59.963 [2024-07-23 08:37:12.356336] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000cc7ed0 00:23:59.963 spare 00:23:59.963 [2024-07-23 08:37:12.357950] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:59.963 08:37:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:24:00.899 08:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:00.899 08:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:00.899 08:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:00.900 08:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:00.900 08:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:00.900 08:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:24:00.900 08:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:01.158 08:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:01.158 "name": "raid_bdev1", 00:24:01.158 "uuid": "28145a08-8e51-4363-85c6-42db38ffa799", 00:24:01.158 "strip_size_kb": 0, 00:24:01.158 "state": "online", 00:24:01.158 "raid_level": "raid1", 00:24:01.158 "superblock": true, 00:24:01.158 "num_base_bdevs": 4, 00:24:01.158 "num_base_bdevs_discovered": 3, 00:24:01.158 "num_base_bdevs_operational": 3, 00:24:01.158 "process": { 00:24:01.158 "type": "rebuild", 00:24:01.158 "target": "spare", 00:24:01.158 "progress": { 00:24:01.158 "blocks": 22528, 00:24:01.158 "percent": 35 00:24:01.158 } 00:24:01.158 }, 00:24:01.158 "base_bdevs_list": [ 00:24:01.158 { 00:24:01.158 "name": "spare", 00:24:01.158 "uuid": "66c51065-3a13-5cf6-9ffa-6c99c5a6a5f0", 00:24:01.158 "is_configured": true, 00:24:01.158 "data_offset": 2048, 00:24:01.158 "data_size": 63488 00:24:01.158 }, 00:24:01.158 { 00:24:01.158 "name": null, 00:24:01.158 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:01.158 "is_configured": false, 00:24:01.158 "data_offset": 2048, 00:24:01.158 "data_size": 63488 00:24:01.158 }, 00:24:01.158 { 00:24:01.158 "name": "BaseBdev3", 00:24:01.158 "uuid": "05df138b-24cf-5798-91e1-c31718c794f5", 00:24:01.158 "is_configured": true, 00:24:01.158 "data_offset": 2048, 00:24:01.158 "data_size": 63488 00:24:01.158 }, 00:24:01.158 { 00:24:01.158 "name": "BaseBdev4", 00:24:01.158 "uuid": "a8efae45-04fc-5cac-84eb-b46eaa858b8c", 00:24:01.158 "is_configured": true, 00:24:01.158 "data_offset": 2048, 00:24:01.158 "data_size": 63488 00:24:01.158 } 00:24:01.158 ] 00:24:01.158 }' 00:24:01.158 08:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:01.158 08:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:01.158 
08:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:01.158 08:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:01.158 08:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:01.416 [2024-07-23 08:37:13.812064] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:01.416 [2024-07-23 08:37:13.870072] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:01.416 [2024-07-23 08:37:13.870126] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:01.416 [2024-07-23 08:37:13.870143] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:01.416 [2024-07-23 08:37:13.870153] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:01.416 08:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:01.416 08:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:01.416 08:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:01.416 08:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:01.417 08:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:01.417 08:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:01.417 08:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:01.417 08:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:01.417 08:37:13 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:01.417 08:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:01.417 08:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:01.417 08:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:01.676 08:37:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:01.676 "name": "raid_bdev1", 00:24:01.676 "uuid": "28145a08-8e51-4363-85c6-42db38ffa799", 00:24:01.676 "strip_size_kb": 0, 00:24:01.676 "state": "online", 00:24:01.676 "raid_level": "raid1", 00:24:01.676 "superblock": true, 00:24:01.676 "num_base_bdevs": 4, 00:24:01.676 "num_base_bdevs_discovered": 2, 00:24:01.676 "num_base_bdevs_operational": 2, 00:24:01.676 "base_bdevs_list": [ 00:24:01.676 { 00:24:01.676 "name": null, 00:24:01.676 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:01.676 "is_configured": false, 00:24:01.676 "data_offset": 2048, 00:24:01.676 "data_size": 63488 00:24:01.676 }, 00:24:01.676 { 00:24:01.676 "name": null, 00:24:01.676 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:01.676 "is_configured": false, 00:24:01.676 "data_offset": 2048, 00:24:01.676 "data_size": 63488 00:24:01.676 }, 00:24:01.676 { 00:24:01.676 "name": "BaseBdev3", 00:24:01.676 "uuid": "05df138b-24cf-5798-91e1-c31718c794f5", 00:24:01.676 "is_configured": true, 00:24:01.676 "data_offset": 2048, 00:24:01.676 "data_size": 63488 00:24:01.676 }, 00:24:01.676 { 00:24:01.676 "name": "BaseBdev4", 00:24:01.676 "uuid": "a8efae45-04fc-5cac-84eb-b46eaa858b8c", 00:24:01.676 "is_configured": true, 00:24:01.676 "data_offset": 2048, 00:24:01.676 "data_size": 63488 00:24:01.676 } 00:24:01.676 ] 00:24:01.676 }' 00:24:01.676 08:37:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
00:24:01.676 08:37:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:02.245 08:37:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:02.245 08:37:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:02.245 08:37:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:02.245 08:37:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:02.245 08:37:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:02.245 08:37:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:02.245 08:37:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:02.245 08:37:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:02.245 "name": "raid_bdev1", 00:24:02.245 "uuid": "28145a08-8e51-4363-85c6-42db38ffa799", 00:24:02.245 "strip_size_kb": 0, 00:24:02.245 "state": "online", 00:24:02.245 "raid_level": "raid1", 00:24:02.245 "superblock": true, 00:24:02.245 "num_base_bdevs": 4, 00:24:02.245 "num_base_bdevs_discovered": 2, 00:24:02.245 "num_base_bdevs_operational": 2, 00:24:02.245 "base_bdevs_list": [ 00:24:02.245 { 00:24:02.245 "name": null, 00:24:02.245 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:02.245 "is_configured": false, 00:24:02.245 "data_offset": 2048, 00:24:02.245 "data_size": 63488 00:24:02.245 }, 00:24:02.245 { 00:24:02.245 "name": null, 00:24:02.245 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:02.245 "is_configured": false, 00:24:02.245 "data_offset": 2048, 00:24:02.245 "data_size": 63488 00:24:02.245 }, 00:24:02.245 { 00:24:02.245 "name": "BaseBdev3", 00:24:02.245 "uuid": 
"05df138b-24cf-5798-91e1-c31718c794f5", 00:24:02.245 "is_configured": true, 00:24:02.245 "data_offset": 2048, 00:24:02.245 "data_size": 63488 00:24:02.245 }, 00:24:02.245 { 00:24:02.245 "name": "BaseBdev4", 00:24:02.245 "uuid": "a8efae45-04fc-5cac-84eb-b46eaa858b8c", 00:24:02.245 "is_configured": true, 00:24:02.245 "data_offset": 2048, 00:24:02.245 "data_size": 63488 00:24:02.245 } 00:24:02.245 ] 00:24:02.245 }' 00:24:02.245 08:37:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:02.503 08:37:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:02.503 08:37:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:02.503 08:37:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:02.503 08:37:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:24:02.503 08:37:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:02.762 [2024-07-23 08:37:15.148500] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:02.762 [2024-07-23 08:37:15.148589] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:02.762 [2024-07-23 08:37:15.148625] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x61600003ae80 00:24:02.762 [2024-07-23 08:37:15.148638] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:02.762 [2024-07-23 08:37:15.149139] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:02.762 [2024-07-23 08:37:15.149161] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: 
created pt_bdev for: BaseBdev1 00:24:02.762 [2024-07-23 08:37:15.149252] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:24:02.762 [2024-07-23 08:37:15.149271] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:24:02.762 [2024-07-23 08:37:15.149279] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:02.762 BaseBdev1 00:24:02.762 08:37:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:24:03.699 08:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:03.699 08:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:03.699 08:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:03.699 08:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:03.699 08:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:03.699 08:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:03.699 08:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:03.699 08:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:03.699 08:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:03.699 08:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:03.699 08:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:03.699 08:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "raid_bdev1")' 00:24:03.958 08:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:03.958 "name": "raid_bdev1", 00:24:03.958 "uuid": "28145a08-8e51-4363-85c6-42db38ffa799", 00:24:03.958 "strip_size_kb": 0, 00:24:03.958 "state": "online", 00:24:03.958 "raid_level": "raid1", 00:24:03.958 "superblock": true, 00:24:03.958 "num_base_bdevs": 4, 00:24:03.958 "num_base_bdevs_discovered": 2, 00:24:03.958 "num_base_bdevs_operational": 2, 00:24:03.958 "base_bdevs_list": [ 00:24:03.958 { 00:24:03.958 "name": null, 00:24:03.958 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:03.958 "is_configured": false, 00:24:03.958 "data_offset": 2048, 00:24:03.958 "data_size": 63488 00:24:03.958 }, 00:24:03.958 { 00:24:03.958 "name": null, 00:24:03.958 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:03.958 "is_configured": false, 00:24:03.958 "data_offset": 2048, 00:24:03.958 "data_size": 63488 00:24:03.958 }, 00:24:03.958 { 00:24:03.958 "name": "BaseBdev3", 00:24:03.958 "uuid": "05df138b-24cf-5798-91e1-c31718c794f5", 00:24:03.958 "is_configured": true, 00:24:03.958 "data_offset": 2048, 00:24:03.958 "data_size": 63488 00:24:03.958 }, 00:24:03.958 { 00:24:03.958 "name": "BaseBdev4", 00:24:03.958 "uuid": "a8efae45-04fc-5cac-84eb-b46eaa858b8c", 00:24:03.958 "is_configured": true, 00:24:03.958 "data_offset": 2048, 00:24:03.958 "data_size": 63488 00:24:03.958 } 00:24:03.958 ] 00:24:03.958 }' 00:24:03.958 08:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:03.958 08:37:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:04.527 08:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:04.527 08:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:04.527 08:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 
00:24:04.527 08:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:04.527 08:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:04.527 08:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:04.527 08:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:04.527 08:37:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:04.527 "name": "raid_bdev1", 00:24:04.527 "uuid": "28145a08-8e51-4363-85c6-42db38ffa799", 00:24:04.527 "strip_size_kb": 0, 00:24:04.527 "state": "online", 00:24:04.527 "raid_level": "raid1", 00:24:04.527 "superblock": true, 00:24:04.527 "num_base_bdevs": 4, 00:24:04.527 "num_base_bdevs_discovered": 2, 00:24:04.527 "num_base_bdevs_operational": 2, 00:24:04.527 "base_bdevs_list": [ 00:24:04.527 { 00:24:04.527 "name": null, 00:24:04.527 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:04.527 "is_configured": false, 00:24:04.527 "data_offset": 2048, 00:24:04.527 "data_size": 63488 00:24:04.527 }, 00:24:04.527 { 00:24:04.527 "name": null, 00:24:04.527 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:04.527 "is_configured": false, 00:24:04.527 "data_offset": 2048, 00:24:04.527 "data_size": 63488 00:24:04.527 }, 00:24:04.527 { 00:24:04.527 "name": "BaseBdev3", 00:24:04.527 "uuid": "05df138b-24cf-5798-91e1-c31718c794f5", 00:24:04.527 "is_configured": true, 00:24:04.527 "data_offset": 2048, 00:24:04.527 "data_size": 63488 00:24:04.527 }, 00:24:04.527 { 00:24:04.527 "name": "BaseBdev4", 00:24:04.527 "uuid": "a8efae45-04fc-5cac-84eb-b46eaa858b8c", 00:24:04.527 "is_configured": true, 00:24:04.527 "data_offset": 2048, 00:24:04.527 "data_size": 63488 00:24:04.527 } 00:24:04.527 ] 00:24:04.527 }' 00:24:04.527 08:37:17 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:04.786 08:37:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:04.786 08:37:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:04.786 08:37:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:04.786 08:37:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:04.786 08:37:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:24:04.786 08:37:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:04.786 08:37:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:04.786 08:37:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:04.786 08:37:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:04.786 08:37:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:04.786 08:37:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:04.786 08:37:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:04.786 08:37:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
00:24:04.786 08:37:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:24:04.786 08:37:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:04.786 [2024-07-23 08:37:17.254079] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:04.786 [2024-07-23 08:37:17.254254] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:24:04.786 [2024-07-23 08:37:17.254274] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:04.786 request: 00:24:04.786 { 00:24:04.786 "base_bdev": "BaseBdev1", 00:24:04.786 "raid_bdev": "raid_bdev1", 00:24:04.786 "method": "bdev_raid_add_base_bdev", 00:24:04.786 "req_id": 1 00:24:04.786 } 00:24:04.786 Got JSON-RPC error response 00:24:04.786 response: 00:24:04.786 { 00:24:04.786 "code": -22, 00:24:04.786 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:24:04.786 } 00:24:04.786 08:37:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:24:04.786 08:37:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:24:04.786 08:37:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:24:04.786 08:37:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:24:04.786 08:37:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:24:06.163 08:37:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:06.163 08:37:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 
00:24:06.163 08:37:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:06.163 08:37:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:06.163 08:37:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:06.163 08:37:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:06.163 08:37:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:06.163 08:37:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:06.163 08:37:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:06.163 08:37:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:06.163 08:37:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:06.163 08:37:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:06.163 08:37:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:06.163 "name": "raid_bdev1", 00:24:06.163 "uuid": "28145a08-8e51-4363-85c6-42db38ffa799", 00:24:06.163 "strip_size_kb": 0, 00:24:06.163 "state": "online", 00:24:06.163 "raid_level": "raid1", 00:24:06.163 "superblock": true, 00:24:06.163 "num_base_bdevs": 4, 00:24:06.163 "num_base_bdevs_discovered": 2, 00:24:06.163 "num_base_bdevs_operational": 2, 00:24:06.163 "base_bdevs_list": [ 00:24:06.163 { 00:24:06.163 "name": null, 00:24:06.163 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:06.163 "is_configured": false, 00:24:06.163 "data_offset": 2048, 00:24:06.163 "data_size": 63488 00:24:06.163 }, 00:24:06.163 { 00:24:06.163 "name": null, 00:24:06.163 "uuid": "00000000-0000-0000-0000-000000000000", 
00:24:06.163 "is_configured": false, 00:24:06.163 "data_offset": 2048, 00:24:06.163 "data_size": 63488 00:24:06.163 }, 00:24:06.163 { 00:24:06.163 "name": "BaseBdev3", 00:24:06.163 "uuid": "05df138b-24cf-5798-91e1-c31718c794f5", 00:24:06.163 "is_configured": true, 00:24:06.163 "data_offset": 2048, 00:24:06.163 "data_size": 63488 00:24:06.163 }, 00:24:06.163 { 00:24:06.163 "name": "BaseBdev4", 00:24:06.163 "uuid": "a8efae45-04fc-5cac-84eb-b46eaa858b8c", 00:24:06.163 "is_configured": true, 00:24:06.163 "data_offset": 2048, 00:24:06.163 "data_size": 63488 00:24:06.163 } 00:24:06.163 ] 00:24:06.163 }' 00:24:06.163 08:37:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:06.163 08:37:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:06.422 08:37:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:06.422 08:37:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:06.422 08:37:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:06.422 08:37:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:06.422 08:37:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:06.681 08:37:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:06.681 08:37:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:06.681 08:37:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:06.681 "name": "raid_bdev1", 00:24:06.681 "uuid": "28145a08-8e51-4363-85c6-42db38ffa799", 00:24:06.681 "strip_size_kb": 0, 00:24:06.681 "state": "online", 00:24:06.681 "raid_level": "raid1", 00:24:06.681 
"superblock": true, 00:24:06.681 "num_base_bdevs": 4, 00:24:06.681 "num_base_bdevs_discovered": 2, 00:24:06.681 "num_base_bdevs_operational": 2, 00:24:06.681 "base_bdevs_list": [ 00:24:06.681 { 00:24:06.681 "name": null, 00:24:06.681 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:06.681 "is_configured": false, 00:24:06.681 "data_offset": 2048, 00:24:06.681 "data_size": 63488 00:24:06.681 }, 00:24:06.681 { 00:24:06.681 "name": null, 00:24:06.681 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:06.681 "is_configured": false, 00:24:06.681 "data_offset": 2048, 00:24:06.681 "data_size": 63488 00:24:06.681 }, 00:24:06.681 { 00:24:06.681 "name": "BaseBdev3", 00:24:06.681 "uuid": "05df138b-24cf-5798-91e1-c31718c794f5", 00:24:06.681 "is_configured": true, 00:24:06.681 "data_offset": 2048, 00:24:06.681 "data_size": 63488 00:24:06.681 }, 00:24:06.681 { 00:24:06.681 "name": "BaseBdev4", 00:24:06.681 "uuid": "a8efae45-04fc-5cac-84eb-b46eaa858b8c", 00:24:06.681 "is_configured": true, 00:24:06.681 "data_offset": 2048, 00:24:06.681 "data_size": 63488 00:24:06.681 } 00:24:06.681 ] 00:24:06.681 }' 00:24:06.681 08:37:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:06.681 08:37:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:06.681 08:37:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:06.681 08:37:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:06.681 08:37:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 1545644 00:24:06.681 08:37:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1545644 ']' 00:24:06.681 08:37:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 1545644 00:24:06.681 08:37:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:24:06.681 08:37:19 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:06.681 08:37:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1545644 00:24:06.939 08:37:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:06.939 08:37:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:06.939 08:37:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1545644' 00:24:06.939 killing process with pid 1545644 00:24:06.939 08:37:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 1545644 00:24:06.939 Received shutdown signal, test time was about 60.000000 seconds 00:24:06.939 00:24:06.939 Latency(us) 00:24:06.939 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:06.939 =================================================================================================================== 00:24:06.939 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:24:06.939 08:37:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 1545644 00:24:06.939 [2024-07-23 08:37:19.231116] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:06.939 [2024-07-23 08:37:19.231257] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:06.939 [2024-07-23 08:37:19.231323] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:06.939 [2024-07-23 08:37:19.231337] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x61600003a280 name raid_bdev1, state offline 00:24:07.198 [2024-07-23 08:37:19.647183] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:08.575 08:37:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:24:08.575 00:24:08.575 real 0m32.377s 
00:24:08.575 user 0m46.036s 00:24:08.575 sys 0m4.347s 00:24:08.575 08:37:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:08.575 08:37:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:08.575 ************************************ 00:24:08.575 END TEST raid_rebuild_test_sb 00:24:08.575 ************************************ 00:24:08.575 08:37:20 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:08.575 08:37:20 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 4 false true true 00:24:08.575 08:37:20 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:24:08.575 08:37:20 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:08.575 08:37:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:08.575 ************************************ 00:24:08.575 START TEST raid_rebuild_test_io 00:24:08.575 ************************************ 00:24:08.575 08:37:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false true true 00:24:08.575 08:37:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:24:08.575 08:37:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:24:08.575 08:37:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:24:08.575 08:37:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:24:08.575 08:37:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:24:08.575 08:37:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:24:08.575 08:37:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:08.575 08:37:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:24:08.575 08:37:21 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:08.575 08:37:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:08.575 08:37:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:24:08.575 08:37:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:08.575 08:37:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:08.575 08:37:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:24:08.575 08:37:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:08.575 08:37:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:08.575 08:37:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:24:08.575 08:37:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:08.575 08:37:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:08.575 08:37:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:24:08.575 08:37:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:24:08.575 08:37:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:24:08.575 08:37:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:24:08.575 08:37:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:24:08.575 08:37:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:24:08.575 08:37:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:24:08.575 08:37:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:24:08.575 08:37:21 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:24:08.575 08:37:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:24:08.575 08:37:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=1551959 00:24:08.575 08:37:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 1551959 /var/tmp/spdk-raid.sock 00:24:08.575 08:37:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 1551959 ']' 00:24:08.575 08:37:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:08.575 08:37:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:08.575 08:37:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:08.575 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:08.575 08:37:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:08.575 08:37:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:08.575 08:37:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:08.833 [2024-07-23 08:37:21.111429] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:24:08.833 [2024-07-23 08:37:21.111539] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1551959 ] 00:24:08.833 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:08.833 Zero copy mechanism will not be used. 00:24:08.833 [2024-07-23 08:37:21.234992] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:09.092 [2024-07-23 08:37:21.459102] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:09.351 [2024-07-23 08:37:21.725575] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:09.351 [2024-07-23 08:37:21.725604] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:09.609 08:37:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:09.609 08:37:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:24:09.609 08:37:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:09.609 08:37:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:09.609 BaseBdev1_malloc 00:24:09.609 08:37:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:09.867 [2024-07-23 08:37:22.236518] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:09.867 [2024-07-23 08:37:22.236577] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:09.868 [2024-07-23 08:37:22.236600] vbdev_passthru.c: 681:vbdev_passthru_register: 
*NOTICE*: io_device created at: 0x0x616000034880 00:24:09.868 [2024-07-23 08:37:22.236623] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:09.868 [2024-07-23 08:37:22.238647] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:09.868 [2024-07-23 08:37:22.238677] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:09.868 BaseBdev1 00:24:09.868 08:37:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:09.868 08:37:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:10.126 BaseBdev2_malloc 00:24:10.126 08:37:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:10.126 [2024-07-23 08:37:22.604360] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:10.126 [2024-07-23 08:37:22.604411] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:10.126 [2024-07-23 08:37:22.604446] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035480 00:24:10.126 [2024-07-23 08:37:22.604459] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:10.126 [2024-07-23 08:37:22.606393] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:10.126 [2024-07-23 08:37:22.606421] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:10.126 BaseBdev2 00:24:10.126 08:37:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:10.126 08:37:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:24:10.384 BaseBdev3_malloc 00:24:10.384 08:37:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:24:10.642 [2024-07-23 08:37:22.991984] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:24:10.642 [2024-07-23 08:37:22.992046] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:10.642 [2024-07-23 08:37:22.992069] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036080 00:24:10.642 [2024-07-23 08:37:22.992080] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:10.642 [2024-07-23 08:37:22.994045] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:10.642 [2024-07-23 08:37:22.994074] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:24:10.642 BaseBdev3 00:24:10.642 08:37:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:10.642 08:37:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:24:10.901 BaseBdev4_malloc 00:24:10.901 08:37:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:24:10.901 [2024-07-23 08:37:23.356188] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:24:10.901 [2024-07-23 08:37:23.356243] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:10.901 
[2024-07-23 08:37:23.356262] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036c80 00:24:10.901 [2024-07-23 08:37:23.356272] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:10.901 [2024-07-23 08:37:23.358231] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:10.901 [2024-07-23 08:37:23.358259] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:24:10.901 BaseBdev4 00:24:10.901 08:37:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:24:11.159 spare_malloc 00:24:11.159 08:37:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:11.418 spare_delay 00:24:11.418 08:37:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:11.418 [2024-07-23 08:37:23.891666] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:11.418 [2024-07-23 08:37:23.891727] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:11.418 [2024-07-23 08:37:23.891747] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000037e80 00:24:11.418 [2024-07-23 08:37:23.891758] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:11.418 [2024-07-23 08:37:23.893760] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:11.418 [2024-07-23 08:37:23.893789] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:11.418 spare 
00:24:11.418 08:37:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:24:11.686 [2024-07-23 08:37:24.048107] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:11.686 [2024-07-23 08:37:24.049777] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:11.686 [2024-07-23 08:37:24.049836] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:11.686 [2024-07-23 08:37:24.049884] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:24:11.686 [2024-07-23 08:37:24.049977] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000038480 00:24:11.686 [2024-07-23 08:37:24.049990] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:24:11.686 [2024-07-23 08:37:24.050275] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c060 00:24:11.686 [2024-07-23 08:37:24.050482] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000038480 00:24:11.686 [2024-07-23 08:37:24.050493] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000038480 00:24:11.686 [2024-07-23 08:37:24.050683] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:11.686 08:37:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:24:11.686 08:37:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:11.686 08:37:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:11.686 08:37:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
00:24:11.686 08:37:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:11.686 08:37:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:11.686 08:37:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:11.686 08:37:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:11.686 08:37:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:11.686 08:37:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:11.686 08:37:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:11.686 08:37:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:11.997 08:37:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:11.997 "name": "raid_bdev1", 00:24:11.997 "uuid": "5b435716-48b3-4f72-b07e-2e956cca835c", 00:24:11.997 "strip_size_kb": 0, 00:24:11.997 "state": "online", 00:24:11.997 "raid_level": "raid1", 00:24:11.997 "superblock": false, 00:24:11.997 "num_base_bdevs": 4, 00:24:11.997 "num_base_bdevs_discovered": 4, 00:24:11.997 "num_base_bdevs_operational": 4, 00:24:11.997 "base_bdevs_list": [ 00:24:11.997 { 00:24:11.997 "name": "BaseBdev1", 00:24:11.997 "uuid": "ab80c718-2610-588f-9acd-929ff11a9d7f", 00:24:11.997 "is_configured": true, 00:24:11.997 "data_offset": 0, 00:24:11.997 "data_size": 65536 00:24:11.997 }, 00:24:11.997 { 00:24:11.997 "name": "BaseBdev2", 00:24:11.997 "uuid": "3831af0e-9862-5877-bb4a-9e7dc768ca10", 00:24:11.997 "is_configured": true, 00:24:11.997 "data_offset": 0, 00:24:11.997 "data_size": 65536 00:24:11.997 }, 00:24:11.997 { 00:24:11.997 "name": "BaseBdev3", 00:24:11.997 "uuid": 
"ad5a6322-0ab4-5581-b970-51f797c6bced", 00:24:11.997 "is_configured": true, 00:24:11.997 "data_offset": 0, 00:24:11.997 "data_size": 65536 00:24:11.997 }, 00:24:11.997 { 00:24:11.997 "name": "BaseBdev4", 00:24:11.997 "uuid": "45e9de6c-8e6c-5ac7-98a4-5ac4b4ddac43", 00:24:11.998 "is_configured": true, 00:24:11.998 "data_offset": 0, 00:24:11.998 "data_size": 65536 00:24:11.998 } 00:24:11.998 ] 00:24:11.998 }' 00:24:11.998 08:37:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:11.998 08:37:24 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:12.257 08:37:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:12.257 08:37:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:24:12.516 [2024-07-23 08:37:24.854516] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:12.516 08:37:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:24:12.516 08:37:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:12.516 08:37:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:12.775 08:37:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:24:12.775 08:37:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:24:12.775 08:37:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:24:12.776 08:37:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:12.776 [2024-07-23 08:37:25.120209] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c2d0 00:24:12.776 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:12.776 Zero copy mechanism will not be used. 00:24:12.776 Running I/O for 60 seconds... 00:24:12.776 [2024-07-23 08:37:25.200938] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:12.776 [2024-07-23 08:37:25.211784] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x60d00000c2d0 00:24:12.776 08:37:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:12.776 08:37:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:12.776 08:37:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:12.776 08:37:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:12.776 08:37:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:12.776 08:37:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:12.776 08:37:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:12.776 08:37:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:12.776 08:37:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:12.776 08:37:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:12.776 08:37:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:12.776 08:37:25 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:13.035 08:37:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:13.035 "name": "raid_bdev1", 00:24:13.035 "uuid": "5b435716-48b3-4f72-b07e-2e956cca835c", 00:24:13.035 "strip_size_kb": 0, 00:24:13.035 "state": "online", 00:24:13.035 "raid_level": "raid1", 00:24:13.035 "superblock": false, 00:24:13.035 "num_base_bdevs": 4, 00:24:13.035 "num_base_bdevs_discovered": 3, 00:24:13.035 "num_base_bdevs_operational": 3, 00:24:13.035 "base_bdevs_list": [ 00:24:13.035 { 00:24:13.035 "name": null, 00:24:13.035 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:13.035 "is_configured": false, 00:24:13.035 "data_offset": 0, 00:24:13.035 "data_size": 65536 00:24:13.035 }, 00:24:13.035 { 00:24:13.035 "name": "BaseBdev2", 00:24:13.035 "uuid": "3831af0e-9862-5877-bb4a-9e7dc768ca10", 00:24:13.035 "is_configured": true, 00:24:13.035 "data_offset": 0, 00:24:13.035 "data_size": 65536 00:24:13.035 }, 00:24:13.035 { 00:24:13.035 "name": "BaseBdev3", 00:24:13.035 "uuid": "ad5a6322-0ab4-5581-b970-51f797c6bced", 00:24:13.035 "is_configured": true, 00:24:13.035 "data_offset": 0, 00:24:13.035 "data_size": 65536 00:24:13.035 }, 00:24:13.035 { 00:24:13.035 "name": "BaseBdev4", 00:24:13.035 "uuid": "45e9de6c-8e6c-5ac7-98a4-5ac4b4ddac43", 00:24:13.035 "is_configured": true, 00:24:13.035 "data_offset": 0, 00:24:13.035 "data_size": 65536 00:24:13.035 } 00:24:13.035 ] 00:24:13.035 }' 00:24:13.035 08:37:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:13.035 08:37:25 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:13.607 08:37:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:13.607 [2024-07-23 08:37:26.109459] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:13.872 08:37:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:24:13.872 [2024-07-23 08:37:26.181044] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c3a0 00:24:13.872 [2024-07-23 08:37:26.182797] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:13.872 [2024-07-23 08:37:26.306432] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:13.872 [2024-07-23 08:37:26.306754] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:14.131 [2024-07-23 08:37:26.434595] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:14.131 [2024-07-23 08:37:26.435255] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:14.389 [2024-07-23 08:37:26.758202] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:14.648 [2024-07-23 08:37:26.961416] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:14.648 [2024-07-23 08:37:26.961703] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:14.907 08:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:14.907 08:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:14.907 08:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:14.907 08:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # 
local target=spare 00:24:14.907 08:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:14.907 08:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:14.907 08:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:14.907 [2024-07-23 08:37:27.209254] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:14.907 08:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:14.907 "name": "raid_bdev1", 00:24:14.907 "uuid": "5b435716-48b3-4f72-b07e-2e956cca835c", 00:24:14.907 "strip_size_kb": 0, 00:24:14.907 "state": "online", 00:24:14.907 "raid_level": "raid1", 00:24:14.907 "superblock": false, 00:24:14.907 "num_base_bdevs": 4, 00:24:14.907 "num_base_bdevs_discovered": 4, 00:24:14.907 "num_base_bdevs_operational": 4, 00:24:14.907 "process": { 00:24:14.907 "type": "rebuild", 00:24:14.907 "target": "spare", 00:24:14.907 "progress": { 00:24:14.907 "blocks": 14336, 00:24:14.907 "percent": 21 00:24:14.907 } 00:24:14.907 }, 00:24:14.907 "base_bdevs_list": [ 00:24:14.907 { 00:24:14.907 "name": "spare", 00:24:14.907 "uuid": "4d647fa2-2f47-5f9f-a15f-1b78cdc1a5d0", 00:24:14.907 "is_configured": true, 00:24:14.907 "data_offset": 0, 00:24:14.907 "data_size": 65536 00:24:14.907 }, 00:24:14.907 { 00:24:14.907 "name": "BaseBdev2", 00:24:14.907 "uuid": "3831af0e-9862-5877-bb4a-9e7dc768ca10", 00:24:14.907 "is_configured": true, 00:24:14.907 "data_offset": 0, 00:24:14.907 "data_size": 65536 00:24:14.907 }, 00:24:14.907 { 00:24:14.907 "name": "BaseBdev3", 00:24:14.907 "uuid": "ad5a6322-0ab4-5581-b970-51f797c6bced", 00:24:14.907 "is_configured": true, 00:24:14.907 "data_offset": 0, 00:24:14.907 "data_size": 65536 00:24:14.907 }, 00:24:14.907 { 
00:24:14.907 "name": "BaseBdev4", 00:24:14.907 "uuid": "45e9de6c-8e6c-5ac7-98a4-5ac4b4ddac43", 00:24:14.907 "is_configured": true, 00:24:14.907 "data_offset": 0, 00:24:14.907 "data_size": 65536 00:24:14.907 } 00:24:14.907 ] 00:24:14.907 }' 00:24:14.907 08:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:14.907 08:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:14.907 08:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:15.166 08:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:15.166 08:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:15.166 [2024-07-23 08:37:27.433998] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:15.166 [2024-07-23 08:37:27.583060] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:15.425 [2024-07-23 08:37:27.770225] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:15.425 [2024-07-23 08:37:27.778764] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:15.425 [2024-07-23 08:37:27.778796] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:15.425 [2024-07-23 08:37:27.778809] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:15.425 [2024-07-23 08:37:27.800247] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x60d00000c2d0 00:24:15.425 08:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:15.425 08:37:27 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:15.425 08:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:15.425 08:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:15.425 08:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:15.425 08:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:15.425 08:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:15.425 08:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:15.425 08:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:15.425 08:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:15.425 08:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:15.425 08:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:15.684 08:37:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:15.684 "name": "raid_bdev1", 00:24:15.684 "uuid": "5b435716-48b3-4f72-b07e-2e956cca835c", 00:24:15.684 "strip_size_kb": 0, 00:24:15.684 "state": "online", 00:24:15.684 "raid_level": "raid1", 00:24:15.684 "superblock": false, 00:24:15.684 "num_base_bdevs": 4, 00:24:15.684 "num_base_bdevs_discovered": 3, 00:24:15.684 "num_base_bdevs_operational": 3, 00:24:15.684 "base_bdevs_list": [ 00:24:15.684 { 00:24:15.684 "name": null, 00:24:15.684 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:15.684 "is_configured": false, 00:24:15.684 "data_offset": 0, 00:24:15.684 "data_size": 65536 00:24:15.684 }, 00:24:15.684 { 
00:24:15.684 "name": "BaseBdev2", 00:24:15.684 "uuid": "3831af0e-9862-5877-bb4a-9e7dc768ca10", 00:24:15.684 "is_configured": true, 00:24:15.684 "data_offset": 0, 00:24:15.684 "data_size": 65536 00:24:15.684 }, 00:24:15.684 { 00:24:15.684 "name": "BaseBdev3", 00:24:15.684 "uuid": "ad5a6322-0ab4-5581-b970-51f797c6bced", 00:24:15.684 "is_configured": true, 00:24:15.684 "data_offset": 0, 00:24:15.684 "data_size": 65536 00:24:15.684 }, 00:24:15.684 { 00:24:15.684 "name": "BaseBdev4", 00:24:15.684 "uuid": "45e9de6c-8e6c-5ac7-98a4-5ac4b4ddac43", 00:24:15.684 "is_configured": true, 00:24:15.684 "data_offset": 0, 00:24:15.684 "data_size": 65536 00:24:15.684 } 00:24:15.684 ] 00:24:15.684 }' 00:24:15.684 08:37:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:15.684 08:37:28 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:16.251 08:37:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:16.251 08:37:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:16.251 08:37:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:16.251 08:37:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:16.251 08:37:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:16.251 08:37:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:16.251 08:37:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:16.251 08:37:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:16.251 "name": "raid_bdev1", 00:24:16.251 "uuid": "5b435716-48b3-4f72-b07e-2e956cca835c", 00:24:16.251 "strip_size_kb": 0, 
00:24:16.251 "state": "online", 00:24:16.251 "raid_level": "raid1", 00:24:16.251 "superblock": false, 00:24:16.251 "num_base_bdevs": 4, 00:24:16.251 "num_base_bdevs_discovered": 3, 00:24:16.251 "num_base_bdevs_operational": 3, 00:24:16.251 "base_bdevs_list": [ 00:24:16.251 { 00:24:16.251 "name": null, 00:24:16.251 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:16.251 "is_configured": false, 00:24:16.251 "data_offset": 0, 00:24:16.251 "data_size": 65536 00:24:16.251 }, 00:24:16.251 { 00:24:16.251 "name": "BaseBdev2", 00:24:16.251 "uuid": "3831af0e-9862-5877-bb4a-9e7dc768ca10", 00:24:16.251 "is_configured": true, 00:24:16.251 "data_offset": 0, 00:24:16.251 "data_size": 65536 00:24:16.251 }, 00:24:16.252 { 00:24:16.252 "name": "BaseBdev3", 00:24:16.252 "uuid": "ad5a6322-0ab4-5581-b970-51f797c6bced", 00:24:16.252 "is_configured": true, 00:24:16.252 "data_offset": 0, 00:24:16.252 "data_size": 65536 00:24:16.252 }, 00:24:16.252 { 00:24:16.252 "name": "BaseBdev4", 00:24:16.252 "uuid": "45e9de6c-8e6c-5ac7-98a4-5ac4b4ddac43", 00:24:16.252 "is_configured": true, 00:24:16.252 "data_offset": 0, 00:24:16.252 "data_size": 65536 00:24:16.252 } 00:24:16.252 ] 00:24:16.252 }' 00:24:16.252 08:37:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:16.510 08:37:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:16.510 08:37:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:16.510 08:37:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:16.510 08:37:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:16.510 [2024-07-23 08:37:28.973298] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:16.510 08:37:29 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:16.510 [2024-07-23 08:37:29.025798] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c470 00:24:16.510 [2024-07-23 08:37:29.027496] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:16.768 [2024-07-23 08:37:29.136976] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:16.768 [2024-07-23 08:37:29.137404] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:17.027 [2024-07-23 08:37:29.361626] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:17.027 [2024-07-23 08:37:29.362257] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:17.286 [2024-07-23 08:37:29.703480] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:17.286 [2024-07-23 08:37:29.703819] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:17.545 [2024-07-23 08:37:29.922061] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:17.545 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:17.545 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:17.545 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:17.545 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:17.545 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # 
local raid_bdev_info 00:24:17.545 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:17.545 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:17.803 [2024-07-23 08:37:30.151774] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:17.803 [2024-07-23 08:37:30.153013] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:17.803 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:17.803 "name": "raid_bdev1", 00:24:17.803 "uuid": "5b435716-48b3-4f72-b07e-2e956cca835c", 00:24:17.803 "strip_size_kb": 0, 00:24:17.803 "state": "online", 00:24:17.803 "raid_level": "raid1", 00:24:17.803 "superblock": false, 00:24:17.803 "num_base_bdevs": 4, 00:24:17.803 "num_base_bdevs_discovered": 4, 00:24:17.803 "num_base_bdevs_operational": 4, 00:24:17.803 "process": { 00:24:17.803 "type": "rebuild", 00:24:17.804 "target": "spare", 00:24:17.804 "progress": { 00:24:17.804 "blocks": 14336, 00:24:17.804 "percent": 21 00:24:17.804 } 00:24:17.804 }, 00:24:17.804 "base_bdevs_list": [ 00:24:17.804 { 00:24:17.804 "name": "spare", 00:24:17.804 "uuid": "4d647fa2-2f47-5f9f-a15f-1b78cdc1a5d0", 00:24:17.804 "is_configured": true, 00:24:17.804 "data_offset": 0, 00:24:17.804 "data_size": 65536 00:24:17.804 }, 00:24:17.804 { 00:24:17.804 "name": "BaseBdev2", 00:24:17.804 "uuid": "3831af0e-9862-5877-bb4a-9e7dc768ca10", 00:24:17.804 "is_configured": true, 00:24:17.804 "data_offset": 0, 00:24:17.804 "data_size": 65536 00:24:17.804 }, 00:24:17.804 { 00:24:17.804 "name": "BaseBdev3", 00:24:17.804 "uuid": "ad5a6322-0ab4-5581-b970-51f797c6bced", 00:24:17.804 "is_configured": true, 00:24:17.804 "data_offset": 0, 
00:24:17.804 "data_size": 65536 00:24:17.804 }, 00:24:17.804 { 00:24:17.804 "name": "BaseBdev4", 00:24:17.804 "uuid": "45e9de6c-8e6c-5ac7-98a4-5ac4b4ddac43", 00:24:17.804 "is_configured": true, 00:24:17.804 "data_offset": 0, 00:24:17.804 "data_size": 65536 00:24:17.804 } 00:24:17.804 ] 00:24:17.804 }' 00:24:17.804 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:17.804 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:17.804 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:17.804 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:17.804 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:24:17.804 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:24:17.804 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:24:17.804 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:24:17.804 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:24:18.061 [2024-07-23 08:37:30.436289] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:24:18.320 [2024-07-23 08:37:30.607513] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x60d00000c2d0 00:24:18.320 [2024-07-23 08:37:30.607554] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x60d00000c470 00:24:18.320 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:24:18.320 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 
00:24:18.320 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:18.320 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:18.320 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:18.320 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:18.320 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:18.320 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:18.320 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:18.320 [2024-07-23 08:37:30.725376] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:24:18.320 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:18.320 "name": "raid_bdev1", 00:24:18.320 "uuid": "5b435716-48b3-4f72-b07e-2e956cca835c", 00:24:18.320 "strip_size_kb": 0, 00:24:18.320 "state": "online", 00:24:18.320 "raid_level": "raid1", 00:24:18.320 "superblock": false, 00:24:18.320 "num_base_bdevs": 4, 00:24:18.320 "num_base_bdevs_discovered": 3, 00:24:18.320 "num_base_bdevs_operational": 3, 00:24:18.320 "process": { 00:24:18.320 "type": "rebuild", 00:24:18.320 "target": "spare", 00:24:18.320 "progress": { 00:24:18.320 "blocks": 20480, 00:24:18.320 "percent": 31 00:24:18.320 } 00:24:18.320 }, 00:24:18.320 "base_bdevs_list": [ 00:24:18.320 { 00:24:18.320 "name": "spare", 00:24:18.320 "uuid": "4d647fa2-2f47-5f9f-a15f-1b78cdc1a5d0", 00:24:18.320 "is_configured": true, 00:24:18.320 "data_offset": 0, 00:24:18.320 "data_size": 65536 00:24:18.320 }, 00:24:18.320 { 00:24:18.320 
"name": null, 00:24:18.320 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:18.320 "is_configured": false, 00:24:18.320 "data_offset": 0, 00:24:18.320 "data_size": 65536 00:24:18.320 }, 00:24:18.320 { 00:24:18.320 "name": "BaseBdev3", 00:24:18.320 "uuid": "ad5a6322-0ab4-5581-b970-51f797c6bced", 00:24:18.320 "is_configured": true, 00:24:18.320 "data_offset": 0, 00:24:18.320 "data_size": 65536 00:24:18.320 }, 00:24:18.320 { 00:24:18.320 "name": "BaseBdev4", 00:24:18.320 "uuid": "45e9de6c-8e6c-5ac7-98a4-5ac4b4ddac43", 00:24:18.320 "is_configured": true, 00:24:18.320 "data_offset": 0, 00:24:18.320 "data_size": 65536 00:24:18.320 } 00:24:18.320 ] 00:24:18.320 }' 00:24:18.320 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:18.578 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:18.578 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:18.578 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:18.578 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=803 00:24:18.578 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:18.578 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:18.578 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:18.578 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:18.578 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:18.578 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:18.578 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | 
select(.name == "raid_bdev1")' 00:24:18.578 08:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:18.578 [2024-07-23 08:37:30.948518] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:24:18.578 08:37:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:18.578 "name": "raid_bdev1", 00:24:18.578 "uuid": "5b435716-48b3-4f72-b07e-2e956cca835c", 00:24:18.578 "strip_size_kb": 0, 00:24:18.578 "state": "online", 00:24:18.578 "raid_level": "raid1", 00:24:18.578 "superblock": false, 00:24:18.578 "num_base_bdevs": 4, 00:24:18.578 "num_base_bdevs_discovered": 3, 00:24:18.578 "num_base_bdevs_operational": 3, 00:24:18.578 "process": { 00:24:18.578 "type": "rebuild", 00:24:18.578 "target": "spare", 00:24:18.578 "progress": { 00:24:18.578 "blocks": 22528, 00:24:18.578 "percent": 34 00:24:18.578 } 00:24:18.578 }, 00:24:18.578 "base_bdevs_list": [ 00:24:18.578 { 00:24:18.578 "name": "spare", 00:24:18.578 "uuid": "4d647fa2-2f47-5f9f-a15f-1b78cdc1a5d0", 00:24:18.578 "is_configured": true, 00:24:18.578 "data_offset": 0, 00:24:18.578 "data_size": 65536 00:24:18.578 }, 00:24:18.578 { 00:24:18.578 "name": null, 00:24:18.578 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:18.578 "is_configured": false, 00:24:18.578 "data_offset": 0, 00:24:18.578 "data_size": 65536 00:24:18.578 }, 00:24:18.578 { 00:24:18.578 "name": "BaseBdev3", 00:24:18.578 "uuid": "ad5a6322-0ab4-5581-b970-51f797c6bced", 00:24:18.578 "is_configured": true, 00:24:18.578 "data_offset": 0, 00:24:18.578 "data_size": 65536 00:24:18.578 }, 00:24:18.578 { 00:24:18.578 "name": "BaseBdev4", 00:24:18.578 "uuid": "45e9de6c-8e6c-5ac7-98a4-5ac4b4ddac43", 00:24:18.578 "is_configured": true, 00:24:18.578 "data_offset": 0, 00:24:18.578 "data_size": 65536 00:24:18.578 } 00:24:18.578 ] 
00:24:18.578 }' 00:24:18.578 08:37:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:18.836 08:37:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:18.836 08:37:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:18.836 08:37:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:18.836 08:37:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:18.837 [2024-07-23 08:37:31.166430] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:24:18.837 [2024-07-23 08:37:31.268013] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:24:18.837 [2024-07-23 08:37:31.268229] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:24:19.403 [2024-07-23 08:37:31.621660] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:24:19.662 08:37:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:19.662 08:37:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:19.662 08:37:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:19.662 08:37:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:19.662 08:37:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:19.662 08:37:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:19.662 08:37:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:19.662 08:37:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:19.921 08:37:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:19.921 "name": "raid_bdev1", 00:24:19.921 "uuid": "5b435716-48b3-4f72-b07e-2e956cca835c", 00:24:19.921 "strip_size_kb": 0, 00:24:19.921 "state": "online", 00:24:19.921 "raid_level": "raid1", 00:24:19.921 "superblock": false, 00:24:19.921 "num_base_bdevs": 4, 00:24:19.921 "num_base_bdevs_discovered": 3, 00:24:19.921 "num_base_bdevs_operational": 3, 00:24:19.921 "process": { 00:24:19.921 "type": "rebuild", 00:24:19.921 "target": "spare", 00:24:19.921 "progress": { 00:24:19.921 "blocks": 47104, 00:24:19.921 "percent": 71 00:24:19.921 } 00:24:19.921 }, 00:24:19.921 "base_bdevs_list": [ 00:24:19.921 { 00:24:19.921 "name": "spare", 00:24:19.921 "uuid": "4d647fa2-2f47-5f9f-a15f-1b78cdc1a5d0", 00:24:19.921 "is_configured": true, 00:24:19.921 "data_offset": 0, 00:24:19.921 "data_size": 65536 00:24:19.921 }, 00:24:19.921 { 00:24:19.921 "name": null, 00:24:19.921 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:19.921 "is_configured": false, 00:24:19.921 "data_offset": 0, 00:24:19.921 "data_size": 65536 00:24:19.921 }, 00:24:19.921 { 00:24:19.921 "name": "BaseBdev3", 00:24:19.921 "uuid": "ad5a6322-0ab4-5581-b970-51f797c6bced", 00:24:19.921 "is_configured": true, 00:24:19.921 "data_offset": 0, 00:24:19.921 "data_size": 65536 00:24:19.921 }, 00:24:19.921 { 00:24:19.921 "name": "BaseBdev4", 00:24:19.921 "uuid": "45e9de6c-8e6c-5ac7-98a4-5ac4b4ddac43", 00:24:19.921 "is_configured": true, 00:24:19.921 "data_offset": 0, 00:24:19.921 "data_size": 65536 00:24:19.921 } 00:24:19.921 ] 00:24:19.921 }' 00:24:19.921 08:37:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:19.921 08:37:32 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:19.921 08:37:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:19.921 08:37:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:19.921 08:37:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:20.180 [2024-07-23 08:37:32.521750] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:24:20.439 [2024-07-23 08:37:32.950704] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:24:20.697 [2024-07-23 08:37:33.165552] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:24:20.964 08:37:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:20.964 08:37:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:20.964 08:37:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:20.964 08:37:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:20.964 08:37:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:20.964 08:37:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:20.964 08:37:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:20.964 08:37:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:21.224 [2024-07-23 08:37:33.495220] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: 
process completed on raid_bdev1 00:24:21.224 [2024-07-23 08:37:33.595458] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:21.224 [2024-07-23 08:37:33.603351] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:21.224 08:37:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:21.224 "name": "raid_bdev1", 00:24:21.224 "uuid": "5b435716-48b3-4f72-b07e-2e956cca835c", 00:24:21.224 "strip_size_kb": 0, 00:24:21.224 "state": "online", 00:24:21.224 "raid_level": "raid1", 00:24:21.224 "superblock": false, 00:24:21.224 "num_base_bdevs": 4, 00:24:21.224 "num_base_bdevs_discovered": 3, 00:24:21.224 "num_base_bdevs_operational": 3, 00:24:21.224 "process": { 00:24:21.224 "type": "rebuild", 00:24:21.224 "target": "spare", 00:24:21.224 "progress": { 00:24:21.225 "blocks": 65536, 00:24:21.225 "percent": 100 00:24:21.225 } 00:24:21.225 }, 00:24:21.225 "base_bdevs_list": [ 00:24:21.225 { 00:24:21.225 "name": "spare", 00:24:21.225 "uuid": "4d647fa2-2f47-5f9f-a15f-1b78cdc1a5d0", 00:24:21.225 "is_configured": true, 00:24:21.225 "data_offset": 0, 00:24:21.225 "data_size": 65536 00:24:21.225 }, 00:24:21.225 { 00:24:21.225 "name": null, 00:24:21.225 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:21.225 "is_configured": false, 00:24:21.225 "data_offset": 0, 00:24:21.225 "data_size": 65536 00:24:21.225 }, 00:24:21.225 { 00:24:21.225 "name": "BaseBdev3", 00:24:21.225 "uuid": "ad5a6322-0ab4-5581-b970-51f797c6bced", 00:24:21.225 "is_configured": true, 00:24:21.225 "data_offset": 0, 00:24:21.225 "data_size": 65536 00:24:21.225 }, 00:24:21.225 { 00:24:21.225 "name": "BaseBdev4", 00:24:21.225 "uuid": "45e9de6c-8e6c-5ac7-98a4-5ac4b4ddac43", 00:24:21.225 "is_configured": true, 00:24:21.225 "data_offset": 0, 00:24:21.225 "data_size": 65536 00:24:21.225 } 00:24:21.225 ] 00:24:21.225 }' 00:24:21.225 08:37:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:24:21.225 08:37:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:21.225 08:37:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:21.225 08:37:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:21.225 08:37:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:22.599 08:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:22.599 08:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:22.599 08:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:22.599 08:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:22.599 08:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:22.599 08:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:22.599 08:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:22.599 08:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:22.599 08:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:22.600 "name": "raid_bdev1", 00:24:22.600 "uuid": "5b435716-48b3-4f72-b07e-2e956cca835c", 00:24:22.600 "strip_size_kb": 0, 00:24:22.600 "state": "online", 00:24:22.600 "raid_level": "raid1", 00:24:22.600 "superblock": false, 00:24:22.600 "num_base_bdevs": 4, 00:24:22.600 "num_base_bdevs_discovered": 3, 00:24:22.600 "num_base_bdevs_operational": 3, 00:24:22.600 "base_bdevs_list": [ 00:24:22.600 { 00:24:22.600 "name": "spare", 
00:24:22.600 "uuid": "4d647fa2-2f47-5f9f-a15f-1b78cdc1a5d0", 00:24:22.600 "is_configured": true, 00:24:22.600 "data_offset": 0, 00:24:22.600 "data_size": 65536 00:24:22.600 }, 00:24:22.600 { 00:24:22.600 "name": null, 00:24:22.600 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:22.600 "is_configured": false, 00:24:22.600 "data_offset": 0, 00:24:22.600 "data_size": 65536 00:24:22.600 }, 00:24:22.600 { 00:24:22.600 "name": "BaseBdev3", 00:24:22.600 "uuid": "ad5a6322-0ab4-5581-b970-51f797c6bced", 00:24:22.600 "is_configured": true, 00:24:22.600 "data_offset": 0, 00:24:22.600 "data_size": 65536 00:24:22.600 }, 00:24:22.600 { 00:24:22.600 "name": "BaseBdev4", 00:24:22.600 "uuid": "45e9de6c-8e6c-5ac7-98a4-5ac4b4ddac43", 00:24:22.600 "is_configured": true, 00:24:22.600 "data_offset": 0, 00:24:22.600 "data_size": 65536 00:24:22.600 } 00:24:22.600 ] 00:24:22.600 }' 00:24:22.600 08:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:22.600 08:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:22.600 08:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:22.600 08:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:22.600 08:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:24:22.600 08:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:22.600 08:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:22.600 08:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:22.600 08:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:22.600 08:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:22.600 08:37:34 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:22.600 08:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:22.858 08:37:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:22.858 "name": "raid_bdev1", 00:24:22.858 "uuid": "5b435716-48b3-4f72-b07e-2e956cca835c", 00:24:22.858 "strip_size_kb": 0, 00:24:22.858 "state": "online", 00:24:22.858 "raid_level": "raid1", 00:24:22.858 "superblock": false, 00:24:22.858 "num_base_bdevs": 4, 00:24:22.858 "num_base_bdevs_discovered": 3, 00:24:22.858 "num_base_bdevs_operational": 3, 00:24:22.858 "base_bdevs_list": [ 00:24:22.858 { 00:24:22.858 "name": "spare", 00:24:22.858 "uuid": "4d647fa2-2f47-5f9f-a15f-1b78cdc1a5d0", 00:24:22.858 "is_configured": true, 00:24:22.858 "data_offset": 0, 00:24:22.858 "data_size": 65536 00:24:22.858 }, 00:24:22.858 { 00:24:22.858 "name": null, 00:24:22.858 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:22.858 "is_configured": false, 00:24:22.858 "data_offset": 0, 00:24:22.858 "data_size": 65536 00:24:22.858 }, 00:24:22.858 { 00:24:22.858 "name": "BaseBdev3", 00:24:22.858 "uuid": "ad5a6322-0ab4-5581-b970-51f797c6bced", 00:24:22.858 "is_configured": true, 00:24:22.858 "data_offset": 0, 00:24:22.858 "data_size": 65536 00:24:22.858 }, 00:24:22.858 { 00:24:22.858 "name": "BaseBdev4", 00:24:22.858 "uuid": "45e9de6c-8e6c-5ac7-98a4-5ac4b4ddac43", 00:24:22.858 "is_configured": true, 00:24:22.858 "data_offset": 0, 00:24:22.858 "data_size": 65536 00:24:22.858 } 00:24:22.858 ] 00:24:22.858 }' 00:24:22.858 08:37:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:22.858 08:37:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:22.858 08:37:35 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:22.858 08:37:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:22.858 08:37:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:22.858 08:37:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:22.858 08:37:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:22.858 08:37:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:22.858 08:37:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:22.858 08:37:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:22.858 08:37:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:22.858 08:37:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:22.858 08:37:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:22.858 08:37:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:22.858 08:37:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:22.858 08:37:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:23.117 08:37:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:23.117 "name": "raid_bdev1", 00:24:23.117 "uuid": "5b435716-48b3-4f72-b07e-2e956cca835c", 00:24:23.117 "strip_size_kb": 0, 00:24:23.117 "state": "online", 00:24:23.117 "raid_level": "raid1", 00:24:23.117 "superblock": false, 00:24:23.117 "num_base_bdevs": 4, 00:24:23.117 
"num_base_bdevs_discovered": 3, 00:24:23.117 "num_base_bdevs_operational": 3, 00:24:23.117 "base_bdevs_list": [ 00:24:23.117 { 00:24:23.117 "name": "spare", 00:24:23.117 "uuid": "4d647fa2-2f47-5f9f-a15f-1b78cdc1a5d0", 00:24:23.117 "is_configured": true, 00:24:23.117 "data_offset": 0, 00:24:23.117 "data_size": 65536 00:24:23.117 }, 00:24:23.117 { 00:24:23.117 "name": null, 00:24:23.117 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:23.117 "is_configured": false, 00:24:23.117 "data_offset": 0, 00:24:23.117 "data_size": 65536 00:24:23.117 }, 00:24:23.117 { 00:24:23.117 "name": "BaseBdev3", 00:24:23.117 "uuid": "ad5a6322-0ab4-5581-b970-51f797c6bced", 00:24:23.117 "is_configured": true, 00:24:23.117 "data_offset": 0, 00:24:23.117 "data_size": 65536 00:24:23.117 }, 00:24:23.117 { 00:24:23.117 "name": "BaseBdev4", 00:24:23.117 "uuid": "45e9de6c-8e6c-5ac7-98a4-5ac4b4ddac43", 00:24:23.117 "is_configured": true, 00:24:23.117 "data_offset": 0, 00:24:23.117 "data_size": 65536 00:24:23.117 } 00:24:23.117 ] 00:24:23.117 }' 00:24:23.117 08:37:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:23.117 08:37:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:23.684 08:37:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:23.684 [2024-07-23 08:37:36.071275] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:23.684 [2024-07-23 08:37:36.071308] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:23.684 00:24:23.684 Latency(us) 00:24:23.684 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:23.684 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:24:23.684 raid_bdev1 : 10.96 90.80 272.39 0.00 0.00 15842.43 296.47 110350.14 
00:24:23.684 =================================================================================================================== 00:24:23.684 Total : 90.80 272.39 0.00 0.00 15842.43 296.47 110350.14 00:24:23.684 [2024-07-23 08:37:36.126879] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:23.684 [2024-07-23 08:37:36.126916] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:23.684 [2024-07-23 08:37:36.127003] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:23.684 [2024-07-23 08:37:36.127016] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000038480 name raid_bdev1, state offline 00:24:23.684 0 00:24:23.684 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:23.684 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:24:23.943 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:24:23.943 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:24:23.943 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:24:23.943 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:24:23.943 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:23.943 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:24:23.943 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:23.943 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:24:23.943 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local 
nbd_list 00:24:23.943 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:24:23.943 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:23.943 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:23.943 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:24:23.943 /dev/nbd0 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:24.202 1+0 records in 00:24:24.202 1+0 records out 00:24:24.202 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000223319 s, 18.3 MB/s 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat 
-c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@727 -- # continue 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@12 -- # local i 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:24:24.202 /dev/nbd1 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:24.202 08:37:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:24:24.461 08:37:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:24:24.461 08:37:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:24.461 08:37:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:24.461 08:37:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:24.461 1+0 records in 00:24:24.461 1+0 records out 00:24:24.461 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000147741 s, 27.7 MB/s 00:24:24.461 08:37:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:24.461 08:37:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:24:24.461 08:37:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:24.461 08:37:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:24.461 08:37:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:24:24.461 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:24.461 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:24.461 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:24:24.461 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:24:24.461 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:24.461 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:24:24.461 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:24.461 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:24:24.461 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:24.461 08:37:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:24.720 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:24.720 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:24.720 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local 
nbd_name=nbd1 00:24:24.721 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:24.721 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:24.721 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:24.721 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:24:24.721 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:24.721 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:24:24.721 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:24:24.721 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:24:24.721 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:24.721 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:24:24.721 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:24.721 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:24:24.721 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:24.721 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:24:24.721 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:24.721 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:24.721 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:24:25.021 /dev/nbd1 00:24:25.021 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # 
basename /dev/nbd1 00:24:25.021 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:25.021 08:37:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:24:25.021 08:37:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:24:25.021 08:37:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:25.022 08:37:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:25.022 08:37:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:24:25.022 08:37:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:24:25.022 08:37:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:25.022 08:37:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:25.022 08:37:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:25.022 1+0 records in 00:24:25.022 1+0 records out 00:24:25.022 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000158245 s, 25.9 MB/s 00:24:25.022 08:37:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:25.022 08:37:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:24:25.022 08:37:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:25.022 08:37:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:25.022 08:37:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:24:25.022 08:37:37 bdev_raid.raid_rebuild_test_io 
-- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:25.022 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:25.022 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:24:25.022 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:24:25.022 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:25.022 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:24:25.022 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:25.022 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:24:25.022 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:25.022 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:25.280 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:25.280 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:25.281 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:25.281 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:25.281 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:25.281 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:25.281 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:24:25.281 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:25.281 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks 
/var/tmp/spdk-raid.sock /dev/nbd0 00:24:25.281 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:25.281 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:25.281 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:25.281 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:24:25.281 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:25.281 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:25.281 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:25.281 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:25.281 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:25.281 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:25.281 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:25.281 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:25.281 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:24:25.281 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:25.281 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:24:25.281 08:37:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 1551959 00:24:25.281 08:37:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 1551959 ']' 00:24:25.281 08:37:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 1551959 00:24:25.281 
08:37:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # uname 00:24:25.281 08:37:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:25.281 08:37:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1551959 00:24:25.540 08:37:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:25.541 08:37:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:25.541 08:37:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1551959' 00:24:25.541 killing process with pid 1551959 00:24:25.541 08:37:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 1551959 00:24:25.541 Received shutdown signal, test time was about 12.652012 seconds 00:24:25.541 00:24:25.541 Latency(us) 00:24:25.541 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:25.541 =================================================================================================================== 00:24:25.541 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:24:25.541 [2024-07-23 08:37:37.804863] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:25.541 08:37:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 1551959 00:24:25.800 [2024-07-23 08:37:38.166316] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:27.176 08:37:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:24:27.176 00:24:27.176 real 0m18.460s 00:24:27.176 user 0m26.695s 00:24:27.176 sys 0m2.265s 00:24:27.176 08:37:39 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:27.176 08:37:39 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:27.176 ************************************ 00:24:27.176 END TEST 
raid_rebuild_test_io 00:24:27.176 ************************************ 00:24:27.176 08:37:39 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:27.176 08:37:39 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 4 true true true 00:24:27.176 08:37:39 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:24:27.176 08:37:39 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:27.177 08:37:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:27.177 ************************************ 00:24:27.177 START TEST raid_rebuild_test_sb_io 00:24:27.177 ************************************ 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true true true 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:24:27.177 
08:37:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # 
create_arg+=' -s' 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=1555715 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 1555715 /var/tmp/spdk-raid.sock 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 1555715 ']' 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:27.177 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:27.177 08:37:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:27.177 [2024-07-23 08:37:39.631654] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:24:27.177 [2024-07-23 08:37:39.631764] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1555715 ] 00:24:27.177 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:27.177 Zero copy mechanism will not be used. 
00:24:27.435 [2024-07-23 08:37:39.757223] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:27.694 [2024-07-23 08:37:39.973072] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:27.953 [2024-07-23 08:37:40.221819] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:27.953 [2024-07-23 08:37:40.221853] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:27.953 08:37:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:27.953 08:37:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:24:27.953 08:37:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:27.953 08:37:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:28.211 BaseBdev1_malloc 00:24:28.211 08:37:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:28.470 [2024-07-23 08:37:40.770177] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:28.470 [2024-07-23 08:37:40.770234] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:28.470 [2024-07-23 08:37:40.770259] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034880 00:24:28.470 [2024-07-23 08:37:40.770272] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:28.470 [2024-07-23 08:37:40.772243] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:28.470 [2024-07-23 08:37:40.772273] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 
00:24:28.470 BaseBdev1 00:24:28.470 08:37:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:28.470 08:37:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:28.729 BaseBdev2_malloc 00:24:28.729 08:37:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:28.729 [2024-07-23 08:37:41.155144] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:28.729 [2024-07-23 08:37:41.155194] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:28.729 [2024-07-23 08:37:41.155214] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035480 00:24:28.729 [2024-07-23 08:37:41.155227] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:28.729 [2024-07-23 08:37:41.157133] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:28.729 [2024-07-23 08:37:41.157160] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:28.729 BaseBdev2 00:24:28.729 08:37:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:28.729 08:37:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:24:28.987 BaseBdev3_malloc 00:24:28.987 08:37:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:24:29.245 
[2024-07-23 08:37:41.513496] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:24:29.246 [2024-07-23 08:37:41.513548] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:29.246 [2024-07-23 08:37:41.513570] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036080 00:24:29.246 [2024-07-23 08:37:41.513581] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:29.246 [2024-07-23 08:37:41.515537] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:29.246 [2024-07-23 08:37:41.515565] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:24:29.246 BaseBdev3 00:24:29.246 08:37:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:29.246 08:37:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:24:29.246 BaseBdev4_malloc 00:24:29.246 08:37:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:24:29.503 [2024-07-23 08:37:41.898013] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:24:29.503 [2024-07-23 08:37:41.898069] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:29.503 [2024-07-23 08:37:41.898107] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036c80 00:24:29.503 [2024-07-23 08:37:41.898118] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:29.503 [2024-07-23 08:37:41.900068] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:29.503 [2024-07-23 08:37:41.900096] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:24:29.503 BaseBdev4 00:24:29.503 08:37:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:24:29.762 spare_malloc 00:24:29.762 08:37:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:30.020 spare_delay 00:24:30.020 08:37:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:30.020 [2024-07-23 08:37:42.446036] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:30.020 [2024-07-23 08:37:42.446093] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:30.020 [2024-07-23 08:37:42.446115] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000037e80 00:24:30.020 [2024-07-23 08:37:42.446126] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:30.020 [2024-07-23 08:37:42.448059] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:30.020 [2024-07-23 08:37:42.448086] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:30.020 spare 00:24:30.020 08:37:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:24:30.279 [2024-07-23 08:37:42.614525] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 
00:24:30.279 [2024-07-23 08:37:42.616170] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:30.279 [2024-07-23 08:37:42.616228] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:30.279 [2024-07-23 08:37:42.616276] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:24:30.279 [2024-07-23 08:37:42.616481] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000038480 00:24:30.279 [2024-07-23 08:37:42.616494] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:30.279 [2024-07-23 08:37:42.616763] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c060 00:24:30.279 [2024-07-23 08:37:42.616979] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000038480 00:24:30.279 [2024-07-23 08:37:42.616990] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000038480 00:24:30.279 [2024-07-23 08:37:42.617157] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:30.279 08:37:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:24:30.279 08:37:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:30.279 08:37:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:30.279 08:37:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:30.279 08:37:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:30.279 08:37:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:30.279 08:37:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:30.280 08:37:42 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:30.280 08:37:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:30.280 08:37:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:30.280 08:37:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:30.280 08:37:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:30.539 08:37:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:30.539 "name": "raid_bdev1", 00:24:30.539 "uuid": "25f2d806-dd28-4464-9985-cf8bee12bb47", 00:24:30.539 "strip_size_kb": 0, 00:24:30.539 "state": "online", 00:24:30.539 "raid_level": "raid1", 00:24:30.539 "superblock": true, 00:24:30.539 "num_base_bdevs": 4, 00:24:30.539 "num_base_bdevs_discovered": 4, 00:24:30.539 "num_base_bdevs_operational": 4, 00:24:30.539 "base_bdevs_list": [ 00:24:30.539 { 00:24:30.539 "name": "BaseBdev1", 00:24:30.539 "uuid": "ed2817f0-ce4a-5b70-bfbd-9158c658a6a0", 00:24:30.539 "is_configured": true, 00:24:30.539 "data_offset": 2048, 00:24:30.539 "data_size": 63488 00:24:30.539 }, 00:24:30.539 { 00:24:30.539 "name": "BaseBdev2", 00:24:30.539 "uuid": "40233bed-86d4-57c7-bc1e-ba9544413197", 00:24:30.539 "is_configured": true, 00:24:30.539 "data_offset": 2048, 00:24:30.539 "data_size": 63488 00:24:30.539 }, 00:24:30.539 { 00:24:30.539 "name": "BaseBdev3", 00:24:30.539 "uuid": "5fc361a9-f1af-5bb9-b524-9dbc49c652f0", 00:24:30.539 "is_configured": true, 00:24:30.539 "data_offset": 2048, 00:24:30.539 "data_size": 63488 00:24:30.539 }, 00:24:30.539 { 00:24:30.539 "name": "BaseBdev4", 00:24:30.539 "uuid": "b617bec3-2df8-579a-8a6c-a584f4b59c9b", 00:24:30.539 "is_configured": true, 00:24:30.539 "data_offset": 2048, 00:24:30.539 
"data_size": 63488 00:24:30.539 } 00:24:30.539 ] 00:24:30.539 }' 00:24:30.539 08:37:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:30.539 08:37:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:30.798 08:37:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:30.798 08:37:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:24:31.056 [2024-07-23 08:37:43.436916] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:31.056 08:37:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:24:31.056 08:37:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:31.056 08:37:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:31.315 08:37:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:24:31.315 08:37:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:24:31.315 08:37:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:31.315 08:37:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:24:31.315 [2024-07-23 08:37:43.714245] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c2d0 00:24:31.315 I/O size of 3145728 is greater than zero copy threshold (65536). 
00:24:31.315 Zero copy mechanism will not be used. 00:24:31.315 Running I/O for 60 seconds... 00:24:31.315 [2024-07-23 08:37:43.781263] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:31.315 [2024-07-23 08:37:43.786684] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x60d00000c2d0 00:24:31.315 08:37:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:31.315 08:37:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:31.315 08:37:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:31.315 08:37:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:31.315 08:37:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:31.315 08:37:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:31.315 08:37:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:31.315 08:37:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:31.315 08:37:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:31.315 08:37:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:31.315 08:37:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:31.315 08:37:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:31.575 08:37:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:31.575 "name": "raid_bdev1", 00:24:31.575 "uuid": "25f2d806-dd28-4464-9985-cf8bee12bb47", 
00:24:31.575 "strip_size_kb": 0, 00:24:31.575 "state": "online", 00:24:31.575 "raid_level": "raid1", 00:24:31.575 "superblock": true, 00:24:31.575 "num_base_bdevs": 4, 00:24:31.575 "num_base_bdevs_discovered": 3, 00:24:31.575 "num_base_bdevs_operational": 3, 00:24:31.575 "base_bdevs_list": [ 00:24:31.575 { 00:24:31.575 "name": null, 00:24:31.575 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:31.575 "is_configured": false, 00:24:31.575 "data_offset": 2048, 00:24:31.575 "data_size": 63488 00:24:31.575 }, 00:24:31.575 { 00:24:31.575 "name": "BaseBdev2", 00:24:31.575 "uuid": "40233bed-86d4-57c7-bc1e-ba9544413197", 00:24:31.575 "is_configured": true, 00:24:31.575 "data_offset": 2048, 00:24:31.575 "data_size": 63488 00:24:31.575 }, 00:24:31.575 { 00:24:31.575 "name": "BaseBdev3", 00:24:31.575 "uuid": "5fc361a9-f1af-5bb9-b524-9dbc49c652f0", 00:24:31.575 "is_configured": true, 00:24:31.575 "data_offset": 2048, 00:24:31.575 "data_size": 63488 00:24:31.575 }, 00:24:31.575 { 00:24:31.575 "name": "BaseBdev4", 00:24:31.575 "uuid": "b617bec3-2df8-579a-8a6c-a584f4b59c9b", 00:24:31.575 "is_configured": true, 00:24:31.575 "data_offset": 2048, 00:24:31.575 "data_size": 63488 00:24:31.575 } 00:24:31.575 ] 00:24:31.575 }' 00:24:31.575 08:37:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:31.575 08:37:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:32.143 08:37:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:32.143 [2024-07-23 08:37:44.659311] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:32.401 08:37:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:24:32.402 [2024-07-23 08:37:44.719417] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c3a0 00:24:32.402 
[2024-07-23 08:37:44.721153] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:32.402 [2024-07-23 08:37:44.828104] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:32.402 [2024-07-23 08:37:44.829408] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:32.660 [2024-07-23 08:37:45.031787] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:32.660 [2024-07-23 08:37:45.031976] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:32.919 [2024-07-23 08:37:45.306229] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:33.177 [2024-07-23 08:37:45.523449] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:33.177 [2024-07-23 08:37:45.523676] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:33.436 08:37:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:33.437 08:37:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:33.437 08:37:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:33.437 08:37:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:33.437 08:37:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:33.437 08:37:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:33.437 08:37:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:33.437 [2024-07-23 08:37:45.894901] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:33.437 08:37:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:33.437 "name": "raid_bdev1", 00:24:33.437 "uuid": "25f2d806-dd28-4464-9985-cf8bee12bb47", 00:24:33.437 "strip_size_kb": 0, 00:24:33.437 "state": "online", 00:24:33.437 "raid_level": "raid1", 00:24:33.437 "superblock": true, 00:24:33.437 "num_base_bdevs": 4, 00:24:33.437 "num_base_bdevs_discovered": 4, 00:24:33.437 "num_base_bdevs_operational": 4, 00:24:33.437 "process": { 00:24:33.437 "type": "rebuild", 00:24:33.437 "target": "spare", 00:24:33.437 "progress": { 00:24:33.437 "blocks": 14336, 00:24:33.437 "percent": 22 00:24:33.437 } 00:24:33.437 }, 00:24:33.437 "base_bdevs_list": [ 00:24:33.437 { 00:24:33.437 "name": "spare", 00:24:33.437 "uuid": "54b6a74d-f9e0-5441-9914-2ba65af33c52", 00:24:33.437 "is_configured": true, 00:24:33.437 "data_offset": 2048, 00:24:33.437 "data_size": 63488 00:24:33.437 }, 00:24:33.437 { 00:24:33.437 "name": "BaseBdev2", 00:24:33.437 "uuid": "40233bed-86d4-57c7-bc1e-ba9544413197", 00:24:33.437 "is_configured": true, 00:24:33.437 "data_offset": 2048, 00:24:33.437 "data_size": 63488 00:24:33.437 }, 00:24:33.437 { 00:24:33.437 "name": "BaseBdev3", 00:24:33.437 "uuid": "5fc361a9-f1af-5bb9-b524-9dbc49c652f0", 00:24:33.437 "is_configured": true, 00:24:33.437 "data_offset": 2048, 00:24:33.437 "data_size": 63488 00:24:33.437 }, 00:24:33.437 { 00:24:33.437 "name": "BaseBdev4", 00:24:33.437 "uuid": "b617bec3-2df8-579a-8a6c-a584f4b59c9b", 00:24:33.437 "is_configured": true, 00:24:33.437 "data_offset": 2048, 00:24:33.437 "data_size": 63488 00:24:33.437 } 00:24:33.437 ] 00:24:33.437 }' 00:24:33.437 08:37:45 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:33.437 08:37:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:33.437 08:37:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:33.695 08:37:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:33.695 08:37:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:33.695 [2024-07-23 08:37:46.121625] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:33.954 [2024-07-23 08:37:46.233180] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:24:33.954 [2024-07-23 08:37:46.335625] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:33.954 [2024-07-23 08:37:46.346618] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:33.954 [2024-07-23 08:37:46.346656] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:33.954 [2024-07-23 08:37:46.346674] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:33.954 [2024-07-23 08:37:46.389977] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x60d00000c2d0 00:24:33.954 08:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:33.954 08:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:33.954 08:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:33.954 08:37:46 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:33.954 08:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:33.954 08:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:33.954 08:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:33.954 08:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:33.954 08:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:33.954 08:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:33.955 08:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:33.955 08:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:34.214 08:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:34.214 "name": "raid_bdev1", 00:24:34.214 "uuid": "25f2d806-dd28-4464-9985-cf8bee12bb47", 00:24:34.214 "strip_size_kb": 0, 00:24:34.214 "state": "online", 00:24:34.214 "raid_level": "raid1", 00:24:34.214 "superblock": true, 00:24:34.214 "num_base_bdevs": 4, 00:24:34.214 "num_base_bdevs_discovered": 3, 00:24:34.214 "num_base_bdevs_operational": 3, 00:24:34.214 "base_bdevs_list": [ 00:24:34.214 { 00:24:34.214 "name": null, 00:24:34.214 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:34.214 "is_configured": false, 00:24:34.214 "data_offset": 2048, 00:24:34.214 "data_size": 63488 00:24:34.214 }, 00:24:34.214 { 00:24:34.214 "name": "BaseBdev2", 00:24:34.214 "uuid": "40233bed-86d4-57c7-bc1e-ba9544413197", 00:24:34.214 "is_configured": true, 00:24:34.214 "data_offset": 2048, 00:24:34.214 "data_size": 63488 
00:24:34.214 }, 00:24:34.214 { 00:24:34.214 "name": "BaseBdev3", 00:24:34.214 "uuid": "5fc361a9-f1af-5bb9-b524-9dbc49c652f0", 00:24:34.214 "is_configured": true, 00:24:34.214 "data_offset": 2048, 00:24:34.214 "data_size": 63488 00:24:34.214 }, 00:24:34.214 { 00:24:34.214 "name": "BaseBdev4", 00:24:34.214 "uuid": "b617bec3-2df8-579a-8a6c-a584f4b59c9b", 00:24:34.214 "is_configured": true, 00:24:34.214 "data_offset": 2048, 00:24:34.214 "data_size": 63488 00:24:34.214 } 00:24:34.214 ] 00:24:34.214 }' 00:24:34.214 08:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:34.214 08:37:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:34.782 08:37:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:34.782 08:37:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:34.782 08:37:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:34.782 08:37:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:34.782 08:37:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:34.782 08:37:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:34.782 08:37:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:35.117 08:37:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:35.117 "name": "raid_bdev1", 00:24:35.117 "uuid": "25f2d806-dd28-4464-9985-cf8bee12bb47", 00:24:35.117 "strip_size_kb": 0, 00:24:35.117 "state": "online", 00:24:35.117 "raid_level": "raid1", 00:24:35.117 "superblock": true, 00:24:35.117 "num_base_bdevs": 4, 00:24:35.117 
"num_base_bdevs_discovered": 3, 00:24:35.117 "num_base_bdevs_operational": 3, 00:24:35.117 "base_bdevs_list": [ 00:24:35.117 { 00:24:35.117 "name": null, 00:24:35.117 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:35.117 "is_configured": false, 00:24:35.117 "data_offset": 2048, 00:24:35.117 "data_size": 63488 00:24:35.117 }, 00:24:35.117 { 00:24:35.117 "name": "BaseBdev2", 00:24:35.117 "uuid": "40233bed-86d4-57c7-bc1e-ba9544413197", 00:24:35.117 "is_configured": true, 00:24:35.117 "data_offset": 2048, 00:24:35.117 "data_size": 63488 00:24:35.117 }, 00:24:35.117 { 00:24:35.117 "name": "BaseBdev3", 00:24:35.117 "uuid": "5fc361a9-f1af-5bb9-b524-9dbc49c652f0", 00:24:35.117 "is_configured": true, 00:24:35.117 "data_offset": 2048, 00:24:35.117 "data_size": 63488 00:24:35.117 }, 00:24:35.117 { 00:24:35.117 "name": "BaseBdev4", 00:24:35.117 "uuid": "b617bec3-2df8-579a-8a6c-a584f4b59c9b", 00:24:35.117 "is_configured": true, 00:24:35.117 "data_offset": 2048, 00:24:35.117 "data_size": 63488 00:24:35.117 } 00:24:35.117 ] 00:24:35.117 }' 00:24:35.117 08:37:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:35.117 08:37:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:35.117 08:37:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:35.117 08:37:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:35.117 08:37:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:35.117 [2024-07-23 08:37:47.552285] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:35.117 08:37:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:35.384 [2024-07-23 08:37:47.634460] bdev_raid.c: 
263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c470 00:24:35.384 [2024-07-23 08:37:47.636250] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:35.384 [2024-07-23 08:37:47.790368] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:35.384 [2024-07-23 08:37:47.791778] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:35.644 [2024-07-23 08:37:48.031021] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:35.644 [2024-07-23 08:37:48.031219] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:35.903 [2024-07-23 08:37:48.367717] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:36.162 [2024-07-23 08:37:48.592023] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:36.162 [2024-07-23 08:37:48.592228] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:36.162 08:37:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:36.162 08:37:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:36.162 08:37:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:36.162 08:37:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:36.162 08:37:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:36.162 08:37:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:36.162 08:37:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:36.420 08:37:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:36.420 "name": "raid_bdev1", 00:24:36.420 "uuid": "25f2d806-dd28-4464-9985-cf8bee12bb47", 00:24:36.420 "strip_size_kb": 0, 00:24:36.420 "state": "online", 00:24:36.420 "raid_level": "raid1", 00:24:36.420 "superblock": true, 00:24:36.420 "num_base_bdevs": 4, 00:24:36.421 "num_base_bdevs_discovered": 4, 00:24:36.421 "num_base_bdevs_operational": 4, 00:24:36.421 "process": { 00:24:36.421 "type": "rebuild", 00:24:36.421 "target": "spare", 00:24:36.421 "progress": { 00:24:36.421 "blocks": 10240, 00:24:36.421 "percent": 16 00:24:36.421 } 00:24:36.421 }, 00:24:36.421 "base_bdevs_list": [ 00:24:36.421 { 00:24:36.421 "name": "spare", 00:24:36.421 "uuid": "54b6a74d-f9e0-5441-9914-2ba65af33c52", 00:24:36.421 "is_configured": true, 00:24:36.421 "data_offset": 2048, 00:24:36.421 "data_size": 63488 00:24:36.421 }, 00:24:36.421 { 00:24:36.421 "name": "BaseBdev2", 00:24:36.421 "uuid": "40233bed-86d4-57c7-bc1e-ba9544413197", 00:24:36.421 "is_configured": true, 00:24:36.421 "data_offset": 2048, 00:24:36.421 "data_size": 63488 00:24:36.421 }, 00:24:36.421 { 00:24:36.421 "name": "BaseBdev3", 00:24:36.421 "uuid": "5fc361a9-f1af-5bb9-b524-9dbc49c652f0", 00:24:36.421 "is_configured": true, 00:24:36.421 "data_offset": 2048, 00:24:36.421 "data_size": 63488 00:24:36.421 }, 00:24:36.421 { 00:24:36.421 "name": "BaseBdev4", 00:24:36.421 "uuid": "b617bec3-2df8-579a-8a6c-a584f4b59c9b", 00:24:36.421 "is_configured": true, 00:24:36.421 "data_offset": 2048, 00:24:36.421 "data_size": 63488 00:24:36.421 } 00:24:36.421 ] 00:24:36.421 }' 00:24:36.421 08:37:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 
00:24:36.421 08:37:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:36.421 08:37:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:36.421 08:37:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:36.421 08:37:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:24:36.421 08:37:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:24:36.421 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:24:36.421 08:37:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:24:36.421 08:37:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:24:36.421 08:37:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:24:36.421 08:37:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:24:36.421 [2024-07-23 08:37:48.924872] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:36.421 [2024-07-23 08:37:48.926323] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:36.679 [2024-07-23 08:37:49.036233] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:24:36.679 [2024-07-23 08:37:49.145316] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:36.938 [2024-07-23 08:37:49.348836] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x60d00000c2d0 00:24:36.938 [2024-07-23 
08:37:49.348867] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x60d00000c470 00:24:36.938 [2024-07-23 08:37:49.369771] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:36.938 08:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:24:36.938 08:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:24:36.938 08:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:36.938 08:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:36.938 08:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:36.938 08:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:36.938 08:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:36.938 08:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:36.938 08:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:37.207 08:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:37.207 "name": "raid_bdev1", 00:24:37.207 "uuid": "25f2d806-dd28-4464-9985-cf8bee12bb47", 00:24:37.207 "strip_size_kb": 0, 00:24:37.207 "state": "online", 00:24:37.207 "raid_level": "raid1", 00:24:37.207 "superblock": true, 00:24:37.207 "num_base_bdevs": 4, 00:24:37.207 "num_base_bdevs_discovered": 3, 00:24:37.207 "num_base_bdevs_operational": 3, 00:24:37.207 "process": { 00:24:37.207 "type": "rebuild", 00:24:37.207 "target": "spare", 00:24:37.207 "progress": { 00:24:37.207 
"blocks": 16384, 00:24:37.207 "percent": 25 00:24:37.207 } 00:24:37.207 }, 00:24:37.207 "base_bdevs_list": [ 00:24:37.207 { 00:24:37.207 "name": "spare", 00:24:37.207 "uuid": "54b6a74d-f9e0-5441-9914-2ba65af33c52", 00:24:37.207 "is_configured": true, 00:24:37.207 "data_offset": 2048, 00:24:37.207 "data_size": 63488 00:24:37.207 }, 00:24:37.207 { 00:24:37.207 "name": null, 00:24:37.207 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:37.207 "is_configured": false, 00:24:37.207 "data_offset": 2048, 00:24:37.207 "data_size": 63488 00:24:37.207 }, 00:24:37.207 { 00:24:37.207 "name": "BaseBdev3", 00:24:37.207 "uuid": "5fc361a9-f1af-5bb9-b524-9dbc49c652f0", 00:24:37.207 "is_configured": true, 00:24:37.207 "data_offset": 2048, 00:24:37.207 "data_size": 63488 00:24:37.207 }, 00:24:37.207 { 00:24:37.207 "name": "BaseBdev4", 00:24:37.207 "uuid": "b617bec3-2df8-579a-8a6c-a584f4b59c9b", 00:24:37.207 "is_configured": true, 00:24:37.207 "data_offset": 2048, 00:24:37.207 "data_size": 63488 00:24:37.207 } 00:24:37.207 ] 00:24:37.207 }' 00:24:37.207 08:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:37.207 08:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:37.207 08:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:37.207 08:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:37.207 08:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=822 00:24:37.207 08:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:37.207 08:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:37.207 08:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:37.207 08:37:49 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:37.207 08:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:37.207 08:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:37.207 08:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:37.207 08:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:37.207 [2024-07-23 08:37:49.700933] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:24:37.207 [2024-07-23 08:37:49.701833] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:24:37.470 08:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:37.470 "name": "raid_bdev1", 00:24:37.470 "uuid": "25f2d806-dd28-4464-9985-cf8bee12bb47", 00:24:37.470 "strip_size_kb": 0, 00:24:37.470 "state": "online", 00:24:37.470 "raid_level": "raid1", 00:24:37.470 "superblock": true, 00:24:37.470 "num_base_bdevs": 4, 00:24:37.470 "num_base_bdevs_discovered": 3, 00:24:37.470 "num_base_bdevs_operational": 3, 00:24:37.470 "process": { 00:24:37.470 "type": "rebuild", 00:24:37.470 "target": "spare", 00:24:37.470 "progress": { 00:24:37.470 "blocks": 20480, 00:24:37.470 "percent": 32 00:24:37.470 } 00:24:37.470 }, 00:24:37.470 "base_bdevs_list": [ 00:24:37.470 { 00:24:37.470 "name": "spare", 00:24:37.470 "uuid": "54b6a74d-f9e0-5441-9914-2ba65af33c52", 00:24:37.470 "is_configured": true, 00:24:37.470 "data_offset": 2048, 00:24:37.470 "data_size": 63488 00:24:37.470 }, 00:24:37.470 { 00:24:37.470 "name": null, 00:24:37.470 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:37.470 
"is_configured": false, 00:24:37.470 "data_offset": 2048, 00:24:37.470 "data_size": 63488 00:24:37.470 }, 00:24:37.470 { 00:24:37.470 "name": "BaseBdev3", 00:24:37.470 "uuid": "5fc361a9-f1af-5bb9-b524-9dbc49c652f0", 00:24:37.470 "is_configured": true, 00:24:37.470 "data_offset": 2048, 00:24:37.470 "data_size": 63488 00:24:37.470 }, 00:24:37.470 { 00:24:37.470 "name": "BaseBdev4", 00:24:37.470 "uuid": "b617bec3-2df8-579a-8a6c-a584f4b59c9b", 00:24:37.470 "is_configured": true, 00:24:37.470 "data_offset": 2048, 00:24:37.470 "data_size": 63488 00:24:37.470 } 00:24:37.470 ] 00:24:37.470 }' 00:24:37.470 08:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:37.470 08:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:37.470 08:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:37.470 08:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:37.470 08:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:37.470 [2024-07-23 08:37:49.943703] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:24:37.470 [2024-07-23 08:37:49.944119] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:24:38.407 [2024-07-23 08:37:50.603167] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:24:38.407 [2024-07-23 08:37:50.710575] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:24:38.407 08:37:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:38.407 08:37:50 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:38.407 08:37:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:38.407 08:37:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:38.407 08:37:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:38.407 08:37:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:38.407 08:37:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:38.407 08:37:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:38.667 [2024-07-23 08:37:51.029692] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:24:38.667 08:37:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:38.667 "name": "raid_bdev1", 00:24:38.667 "uuid": "25f2d806-dd28-4464-9985-cf8bee12bb47", 00:24:38.667 "strip_size_kb": 0, 00:24:38.667 "state": "online", 00:24:38.667 "raid_level": "raid1", 00:24:38.667 "superblock": true, 00:24:38.667 "num_base_bdevs": 4, 00:24:38.667 "num_base_bdevs_discovered": 3, 00:24:38.667 "num_base_bdevs_operational": 3, 00:24:38.667 "process": { 00:24:38.667 "type": "rebuild", 00:24:38.667 "target": "spare", 00:24:38.667 "progress": { 00:24:38.667 "blocks": 38912, 00:24:38.667 "percent": 61 00:24:38.667 } 00:24:38.667 }, 00:24:38.667 "base_bdevs_list": [ 00:24:38.667 { 00:24:38.667 "name": "spare", 00:24:38.667 "uuid": "54b6a74d-f9e0-5441-9914-2ba65af33c52", 00:24:38.667 "is_configured": true, 00:24:38.667 "data_offset": 2048, 00:24:38.667 "data_size": 63488 00:24:38.667 }, 00:24:38.667 { 00:24:38.667 "name": null, 00:24:38.667 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:24:38.667 "is_configured": false, 00:24:38.667 "data_offset": 2048, 00:24:38.667 "data_size": 63488 00:24:38.667 }, 00:24:38.667 { 00:24:38.667 "name": "BaseBdev3", 00:24:38.667 "uuid": "5fc361a9-f1af-5bb9-b524-9dbc49c652f0", 00:24:38.667 "is_configured": true, 00:24:38.667 "data_offset": 2048, 00:24:38.667 "data_size": 63488 00:24:38.667 }, 00:24:38.667 { 00:24:38.667 "name": "BaseBdev4", 00:24:38.667 "uuid": "b617bec3-2df8-579a-8a6c-a584f4b59c9b", 00:24:38.667 "is_configured": true, 00:24:38.667 "data_offset": 2048, 00:24:38.667 "data_size": 63488 00:24:38.667 } 00:24:38.667 ] 00:24:38.667 }' 00:24:38.667 08:37:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:38.667 08:37:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:38.667 08:37:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:38.667 [2024-07-23 08:37:51.145329] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:24:38.667 08:37:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:38.667 08:37:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:38.926 [2024-07-23 08:37:51.364319] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:24:39.864 [2024-07-23 08:37:52.140011] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:24:39.864 08:37:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:39.864 08:37:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:39.864 08:37:52 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:39.864 08:37:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:39.864 08:37:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:39.864 08:37:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:39.864 08:37:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:39.864 08:37:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:39.864 08:37:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:39.864 "name": "raid_bdev1", 00:24:39.864 "uuid": "25f2d806-dd28-4464-9985-cf8bee12bb47", 00:24:39.864 "strip_size_kb": 0, 00:24:39.864 "state": "online", 00:24:39.864 "raid_level": "raid1", 00:24:39.864 "superblock": true, 00:24:39.864 "num_base_bdevs": 4, 00:24:39.864 "num_base_bdevs_discovered": 3, 00:24:39.864 "num_base_bdevs_operational": 3, 00:24:39.864 "process": { 00:24:39.864 "type": "rebuild", 00:24:39.864 "target": "spare", 00:24:39.864 "progress": { 00:24:39.864 "blocks": 59392, 00:24:39.864 "percent": 93 00:24:39.864 } 00:24:39.864 }, 00:24:39.864 "base_bdevs_list": [ 00:24:39.864 { 00:24:39.864 "name": "spare", 00:24:39.864 "uuid": "54b6a74d-f9e0-5441-9914-2ba65af33c52", 00:24:39.864 "is_configured": true, 00:24:39.864 "data_offset": 2048, 00:24:39.864 "data_size": 63488 00:24:39.864 }, 00:24:39.864 { 00:24:39.864 "name": null, 00:24:39.864 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:39.864 "is_configured": false, 00:24:39.864 "data_offset": 2048, 00:24:39.864 "data_size": 63488 00:24:39.864 }, 00:24:39.864 { 00:24:39.864 "name": "BaseBdev3", 00:24:39.864 "uuid": 
"5fc361a9-f1af-5bb9-b524-9dbc49c652f0", 00:24:39.864 "is_configured": true, 00:24:39.864 "data_offset": 2048, 00:24:39.864 "data_size": 63488 00:24:39.864 }, 00:24:39.864 { 00:24:39.864 "name": "BaseBdev4", 00:24:39.864 "uuid": "b617bec3-2df8-579a-8a6c-a584f4b59c9b", 00:24:39.864 "is_configured": true, 00:24:39.864 "data_offset": 2048, 00:24:39.864 "data_size": 63488 00:24:39.864 } 00:24:39.864 ] 00:24:39.864 }' 00:24:39.864 08:37:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:39.864 08:37:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:39.864 08:37:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:40.123 08:37:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:40.123 08:37:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:40.123 [2024-07-23 08:37:52.464574] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:40.123 [2024-07-23 08:37:52.563454] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:40.123 [2024-07-23 08:37:52.565096] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:41.059 08:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:41.059 08:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:41.059 08:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:41.059 08:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:41.059 08:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:41.059 08:37:53 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:41.059 08:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:41.059 08:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:41.317 08:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:41.317 "name": "raid_bdev1", 00:24:41.317 "uuid": "25f2d806-dd28-4464-9985-cf8bee12bb47", 00:24:41.317 "strip_size_kb": 0, 00:24:41.317 "state": "online", 00:24:41.317 "raid_level": "raid1", 00:24:41.317 "superblock": true, 00:24:41.317 "num_base_bdevs": 4, 00:24:41.317 "num_base_bdevs_discovered": 3, 00:24:41.317 "num_base_bdevs_operational": 3, 00:24:41.317 "base_bdevs_list": [ 00:24:41.317 { 00:24:41.317 "name": "spare", 00:24:41.317 "uuid": "54b6a74d-f9e0-5441-9914-2ba65af33c52", 00:24:41.317 "is_configured": true, 00:24:41.317 "data_offset": 2048, 00:24:41.317 "data_size": 63488 00:24:41.317 }, 00:24:41.317 { 00:24:41.317 "name": null, 00:24:41.317 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:41.317 "is_configured": false, 00:24:41.317 "data_offset": 2048, 00:24:41.317 "data_size": 63488 00:24:41.317 }, 00:24:41.317 { 00:24:41.317 "name": "BaseBdev3", 00:24:41.317 "uuid": "5fc361a9-f1af-5bb9-b524-9dbc49c652f0", 00:24:41.317 "is_configured": true, 00:24:41.317 "data_offset": 2048, 00:24:41.317 "data_size": 63488 00:24:41.317 }, 00:24:41.317 { 00:24:41.317 "name": "BaseBdev4", 00:24:41.317 "uuid": "b617bec3-2df8-579a-8a6c-a584f4b59c9b", 00:24:41.317 "is_configured": true, 00:24:41.317 "data_offset": 2048, 00:24:41.317 "data_size": 63488 00:24:41.317 } 00:24:41.317 ] 00:24:41.317 }' 00:24:41.317 08:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:41.317 08:37:53 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:41.317 08:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:41.317 08:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:41.317 08:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:24:41.317 08:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:41.317 08:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:41.317 08:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:41.317 08:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:41.317 08:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:41.317 08:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:41.317 08:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:41.576 08:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:41.576 "name": "raid_bdev1", 00:24:41.576 "uuid": "25f2d806-dd28-4464-9985-cf8bee12bb47", 00:24:41.576 "strip_size_kb": 0, 00:24:41.576 "state": "online", 00:24:41.576 "raid_level": "raid1", 00:24:41.576 "superblock": true, 00:24:41.576 "num_base_bdevs": 4, 00:24:41.576 "num_base_bdevs_discovered": 3, 00:24:41.576 "num_base_bdevs_operational": 3, 00:24:41.576 "base_bdevs_list": [ 00:24:41.576 { 00:24:41.576 "name": "spare", 00:24:41.576 "uuid": "54b6a74d-f9e0-5441-9914-2ba65af33c52", 00:24:41.576 "is_configured": true, 00:24:41.576 "data_offset": 2048, 00:24:41.576 "data_size": 63488 00:24:41.576 }, 
00:24:41.576 { 00:24:41.576 "name": null, 00:24:41.576 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:41.576 "is_configured": false, 00:24:41.576 "data_offset": 2048, 00:24:41.576 "data_size": 63488 00:24:41.576 }, 00:24:41.576 { 00:24:41.576 "name": "BaseBdev3", 00:24:41.576 "uuid": "5fc361a9-f1af-5bb9-b524-9dbc49c652f0", 00:24:41.576 "is_configured": true, 00:24:41.576 "data_offset": 2048, 00:24:41.576 "data_size": 63488 00:24:41.576 }, 00:24:41.576 { 00:24:41.576 "name": "BaseBdev4", 00:24:41.576 "uuid": "b617bec3-2df8-579a-8a6c-a584f4b59c9b", 00:24:41.576 "is_configured": true, 00:24:41.576 "data_offset": 2048, 00:24:41.576 "data_size": 63488 00:24:41.576 } 00:24:41.576 ] 00:24:41.576 }' 00:24:41.576 08:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:41.576 08:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:41.576 08:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:41.576 08:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:41.576 08:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:41.576 08:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:41.576 08:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:41.576 08:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:41.576 08:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:41.576 08:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:41.576 08:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:41.576 08:37:53 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:41.576 08:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:41.576 08:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:41.576 08:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:41.576 08:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:41.835 08:37:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:41.835 "name": "raid_bdev1", 00:24:41.835 "uuid": "25f2d806-dd28-4464-9985-cf8bee12bb47", 00:24:41.835 "strip_size_kb": 0, 00:24:41.835 "state": "online", 00:24:41.835 "raid_level": "raid1", 00:24:41.835 "superblock": true, 00:24:41.835 "num_base_bdevs": 4, 00:24:41.835 "num_base_bdevs_discovered": 3, 00:24:41.835 "num_base_bdevs_operational": 3, 00:24:41.835 "base_bdevs_list": [ 00:24:41.835 { 00:24:41.835 "name": "spare", 00:24:41.835 "uuid": "54b6a74d-f9e0-5441-9914-2ba65af33c52", 00:24:41.835 "is_configured": true, 00:24:41.835 "data_offset": 2048, 00:24:41.835 "data_size": 63488 00:24:41.835 }, 00:24:41.835 { 00:24:41.835 "name": null, 00:24:41.835 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:41.835 "is_configured": false, 00:24:41.835 "data_offset": 2048, 00:24:41.835 "data_size": 63488 00:24:41.835 }, 00:24:41.835 { 00:24:41.835 "name": "BaseBdev3", 00:24:41.835 "uuid": "5fc361a9-f1af-5bb9-b524-9dbc49c652f0", 00:24:41.835 "is_configured": true, 00:24:41.835 "data_offset": 2048, 00:24:41.836 "data_size": 63488 00:24:41.836 }, 00:24:41.836 { 00:24:41.836 "name": "BaseBdev4", 00:24:41.836 "uuid": "b617bec3-2df8-579a-8a6c-a584f4b59c9b", 00:24:41.836 "is_configured": true, 00:24:41.836 "data_offset": 2048, 00:24:41.836 
"data_size": 63488 00:24:41.836 } 00:24:41.836 ] 00:24:41.836 }' 00:24:41.836 08:37:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:41.836 08:37:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:42.094 08:37:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:42.352 [2024-07-23 08:37:54.755782] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:42.352 [2024-07-23 08:37:54.755815] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:42.352 00:24:42.352 Latency(us) 00:24:42.352 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:42.352 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:24:42.352 raid_bdev1 : 11.11 96.41 289.23 0.00 0.00 14322.85 296.47 116841.33 00:24:42.352 =================================================================================================================== 00:24:42.352 Total : 96.41 289.23 0.00 0.00 14322.85 296.47 116841.33 00:24:42.610 [2024-07-23 08:37:54.873014] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:42.610 [2024-07-23 08:37:54.873049] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:42.610 [2024-07-23 08:37:54.873139] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:42.610 [2024-07-23 08:37:54.873150] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000038480 name raid_bdev1, state offline 00:24:42.610 0 00:24:42.610 08:37:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:42.610 
08:37:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:24:42.610 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:24:42.611 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:24:42.611 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:24:42.611 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:24:42.611 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:42.611 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:24:42.611 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:42.611 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:24:42.611 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:42.611 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:24:42.611 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:42.611 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:42.611 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:24:42.870 /dev/nbd0 00:24:42.870 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:42.870 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:42.870 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:42.870 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@867 -- # local i 00:24:42.870 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:42.870 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:42.870 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:42.870 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:24:42.870 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:42.870 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:42.870 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:42.870 1+0 records in 00:24:42.870 1+0 records out 00:24:42.870 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000225434 s, 18.2 MB/s 00:24:42.870 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:42.870 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:24:42.870 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:42.870 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:42.870 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:24:42.870 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:42.870 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:42.870 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in 
"${base_bdevs[@]:1}" 00:24:42.870 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:24:42.870 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@727 -- # continue 00:24:42.870 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:24:42.870 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:24:42.870 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:24:42.870 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:42.870 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:24:42.870 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:42.870 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:24:42.870 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:42.870 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:24:42.870 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:42.870 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:42.870 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:24:43.129 /dev/nbd1 00:24:43.129 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:43.129 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:43.129 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:24:43.129 
08:37:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:24:43.129 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:43.129 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:43.129 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:24:43.129 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:24:43.129 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:43.129 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:43.129 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:43.129 1+0 records in 00:24:43.129 1+0 records out 00:24:43.129 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000226997 s, 18.0 MB/s 00:24:43.129 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:43.129 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:24:43.129 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:43.129 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:43.129 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:24:43.129 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:43.129 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:43.129 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:24:43.388 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:24:43.388 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:43.388 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:24:43.388 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:43.388 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:24:43.388 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:43.388 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:43.388 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:43.388 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:43.388 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:43.388 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:43.388 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:43.388 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:43.388 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:24:43.388 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:43.388 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:24:43.388 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 
00:24:43.388 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:24:43.388 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:43.388 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:24:43.388 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:43.388 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:24:43.388 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:43.388 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:24:43.388 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:43.388 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:43.388 08:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:24:43.646 /dev/nbd1 00:24:43.646 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:43.646 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:43.646 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:24:43.646 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:24:43.646 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:43.646 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:43.646 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:24:43.646 
08:37:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:24:43.646 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:43.647 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:43.647 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:43.647 1+0 records in 00:24:43.647 1+0 records out 00:24:43.647 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000251243 s, 16.3 MB/s 00:24:43.647 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:43.647 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:24:43.647 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:43.647 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:43.647 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:24:43.647 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:43.647 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:43.647 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:24:43.904 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:24:43.904 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:43.904 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:24:43.904 
08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:43.904 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:24:43.904 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:43.904 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:43.905 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:43.905 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:43.905 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:43.905 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:43.905 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:43.905 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:43.905 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:24:43.905 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:43.905 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:43.905 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:43.905 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:43.905 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:43.905 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:24:43.905 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 
00:24:43.905 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:44.163 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:44.163 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:44.163 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:44.163 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:44.163 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:44.163 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:44.163 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:24:44.163 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:44.163 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:24:44.163 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:44.422 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:44.422 [2024-07-23 08:37:56.923407] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:44.422 [2024-07-23 08:37:56.923458] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:44.422 [2024-07-23 08:37:56.923495] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000039f80 00:24:44.422 [2024-07-23 08:37:56.923504] vbdev_passthru.c: 
696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:44.422 [2024-07-23 08:37:56.925468] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:44.422 [2024-07-23 08:37:56.925493] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:44.422 [2024-07-23 08:37:56.925577] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:44.422 [2024-07-23 08:37:56.925630] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:44.422 [2024-07-23 08:37:56.925802] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:44.422 [2024-07-23 08:37:56.925882] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:24:44.422 spare 00:24:44.682 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:44.682 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:44.682 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:44.682 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:44.682 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:44.682 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:44.682 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:44.682 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:44.682 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:44.682 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:44.682 08:37:56 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:44.682 08:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:44.682 [2024-07-23 08:37:57.026203] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x61600003a580 00:24:44.682 [2024-07-23 08:37:57.026225] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:44.682 [2024-07-23 08:37:57.026468] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000040200 00:24:44.682 [2024-07-23 08:37:57.026660] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x61600003a580 00:24:44.682 [2024-07-23 08:37:57.026672] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x61600003a580 00:24:44.682 [2024-07-23 08:37:57.026816] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:44.682 08:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:44.682 "name": "raid_bdev1", 00:24:44.682 "uuid": "25f2d806-dd28-4464-9985-cf8bee12bb47", 00:24:44.682 "strip_size_kb": 0, 00:24:44.682 "state": "online", 00:24:44.682 "raid_level": "raid1", 00:24:44.682 "superblock": true, 00:24:44.682 "num_base_bdevs": 4, 00:24:44.682 "num_base_bdevs_discovered": 3, 00:24:44.682 "num_base_bdevs_operational": 3, 00:24:44.682 "base_bdevs_list": [ 00:24:44.682 { 00:24:44.682 "name": "spare", 00:24:44.682 "uuid": "54b6a74d-f9e0-5441-9914-2ba65af33c52", 00:24:44.682 "is_configured": true, 00:24:44.682 "data_offset": 2048, 00:24:44.682 "data_size": 63488 00:24:44.682 }, 00:24:44.682 { 00:24:44.682 "name": null, 00:24:44.682 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:44.682 "is_configured": false, 00:24:44.682 "data_offset": 2048, 00:24:44.682 
"data_size": 63488 00:24:44.682 }, 00:24:44.682 { 00:24:44.682 "name": "BaseBdev3", 00:24:44.682 "uuid": "5fc361a9-f1af-5bb9-b524-9dbc49c652f0", 00:24:44.682 "is_configured": true, 00:24:44.682 "data_offset": 2048, 00:24:44.682 "data_size": 63488 00:24:44.682 }, 00:24:44.682 { 00:24:44.682 "name": "BaseBdev4", 00:24:44.682 "uuid": "b617bec3-2df8-579a-8a6c-a584f4b59c9b", 00:24:44.682 "is_configured": true, 00:24:44.682 "data_offset": 2048, 00:24:44.682 "data_size": 63488 00:24:44.682 } 00:24:44.682 ] 00:24:44.682 }' 00:24:44.682 08:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:44.682 08:37:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:45.251 08:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:45.251 08:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:45.251 08:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:45.251 08:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:45.251 08:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:45.251 08:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:45.251 08:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:45.251 08:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:45.251 "name": "raid_bdev1", 00:24:45.251 "uuid": "25f2d806-dd28-4464-9985-cf8bee12bb47", 00:24:45.251 "strip_size_kb": 0, 00:24:45.251 "state": "online", 00:24:45.251 "raid_level": "raid1", 00:24:45.251 "superblock": true, 00:24:45.251 "num_base_bdevs": 4, 
00:24:45.251 "num_base_bdevs_discovered": 3, 00:24:45.251 "num_base_bdevs_operational": 3, 00:24:45.251 "base_bdevs_list": [ 00:24:45.251 { 00:24:45.251 "name": "spare", 00:24:45.251 "uuid": "54b6a74d-f9e0-5441-9914-2ba65af33c52", 00:24:45.251 "is_configured": true, 00:24:45.251 "data_offset": 2048, 00:24:45.251 "data_size": 63488 00:24:45.251 }, 00:24:45.251 { 00:24:45.251 "name": null, 00:24:45.251 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:45.251 "is_configured": false, 00:24:45.251 "data_offset": 2048, 00:24:45.251 "data_size": 63488 00:24:45.251 }, 00:24:45.251 { 00:24:45.251 "name": "BaseBdev3", 00:24:45.251 "uuid": "5fc361a9-f1af-5bb9-b524-9dbc49c652f0", 00:24:45.251 "is_configured": true, 00:24:45.251 "data_offset": 2048, 00:24:45.251 "data_size": 63488 00:24:45.251 }, 00:24:45.251 { 00:24:45.251 "name": "BaseBdev4", 00:24:45.251 "uuid": "b617bec3-2df8-579a-8a6c-a584f4b59c9b", 00:24:45.251 "is_configured": true, 00:24:45.251 "data_offset": 2048, 00:24:45.251 "data_size": 63488 00:24:45.251 } 00:24:45.251 ] 00:24:45.251 }' 00:24:45.251 08:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:45.510 08:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:45.510 08:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:45.510 08:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:45.510 08:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:24:45.510 08:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:45.510 08:37:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:24:45.510 08:37:58 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:45.769 [2024-07-23 08:37:58.183103] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:45.769 08:37:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:45.769 08:37:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:45.769 08:37:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:45.769 08:37:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:45.769 08:37:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:45.769 08:37:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:45.769 08:37:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:45.769 08:37:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:45.769 08:37:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:45.769 08:37:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:45.769 08:37:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:45.769 08:37:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:46.027 08:37:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:46.027 "name": "raid_bdev1", 00:24:46.027 "uuid": "25f2d806-dd28-4464-9985-cf8bee12bb47", 00:24:46.027 "strip_size_kb": 0, 00:24:46.027 
"state": "online", 00:24:46.027 "raid_level": "raid1", 00:24:46.027 "superblock": true, 00:24:46.027 "num_base_bdevs": 4, 00:24:46.027 "num_base_bdevs_discovered": 2, 00:24:46.027 "num_base_bdevs_operational": 2, 00:24:46.027 "base_bdevs_list": [ 00:24:46.027 { 00:24:46.027 "name": null, 00:24:46.027 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:46.027 "is_configured": false, 00:24:46.027 "data_offset": 2048, 00:24:46.027 "data_size": 63488 00:24:46.027 }, 00:24:46.027 { 00:24:46.027 "name": null, 00:24:46.027 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:46.027 "is_configured": false, 00:24:46.027 "data_offset": 2048, 00:24:46.027 "data_size": 63488 00:24:46.027 }, 00:24:46.027 { 00:24:46.027 "name": "BaseBdev3", 00:24:46.027 "uuid": "5fc361a9-f1af-5bb9-b524-9dbc49c652f0", 00:24:46.027 "is_configured": true, 00:24:46.027 "data_offset": 2048, 00:24:46.027 "data_size": 63488 00:24:46.027 }, 00:24:46.027 { 00:24:46.027 "name": "BaseBdev4", 00:24:46.027 "uuid": "b617bec3-2df8-579a-8a6c-a584f4b59c9b", 00:24:46.027 "is_configured": true, 00:24:46.027 "data_offset": 2048, 00:24:46.027 "data_size": 63488 00:24:46.027 } 00:24:46.027 ] 00:24:46.027 }' 00:24:46.027 08:37:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:46.027 08:37:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:46.594 08:37:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:46.594 [2024-07-23 08:37:59.025413] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:46.594 [2024-07-23 08:37:59.025620] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:24:46.594 [2024-07-23 08:37:59.025638] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding 
bdev spare to raid bdev raid_bdev1. 00:24:46.594 [2024-07-23 08:37:59.025668] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:46.595 [2024-07-23 08:37:59.042311] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000402d0 00:24:46.595 [2024-07-23 08:37:59.043972] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:46.595 08:37:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:24:47.972 08:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:47.972 08:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:47.972 08:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:47.972 08:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:47.972 08:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:47.972 08:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:47.972 08:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:47.972 08:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:47.972 "name": "raid_bdev1", 00:24:47.972 "uuid": "25f2d806-dd28-4464-9985-cf8bee12bb47", 00:24:47.972 "strip_size_kb": 0, 00:24:47.972 "state": "online", 00:24:47.972 "raid_level": "raid1", 00:24:47.972 "superblock": true, 00:24:47.972 "num_base_bdevs": 4, 00:24:47.972 "num_base_bdevs_discovered": 3, 00:24:47.972 "num_base_bdevs_operational": 3, 00:24:47.972 "process": { 00:24:47.972 "type": "rebuild", 00:24:47.972 "target": "spare", 00:24:47.972 "progress": 
{ 00:24:47.972 "blocks": 22528, 00:24:47.972 "percent": 35 00:24:47.972 } 00:24:47.972 }, 00:24:47.972 "base_bdevs_list": [ 00:24:47.972 { 00:24:47.972 "name": "spare", 00:24:47.972 "uuid": "54b6a74d-f9e0-5441-9914-2ba65af33c52", 00:24:47.972 "is_configured": true, 00:24:47.972 "data_offset": 2048, 00:24:47.972 "data_size": 63488 00:24:47.972 }, 00:24:47.972 { 00:24:47.972 "name": null, 00:24:47.972 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:47.972 "is_configured": false, 00:24:47.972 "data_offset": 2048, 00:24:47.972 "data_size": 63488 00:24:47.972 }, 00:24:47.972 { 00:24:47.972 "name": "BaseBdev3", 00:24:47.972 "uuid": "5fc361a9-f1af-5bb9-b524-9dbc49c652f0", 00:24:47.972 "is_configured": true, 00:24:47.972 "data_offset": 2048, 00:24:47.973 "data_size": 63488 00:24:47.973 }, 00:24:47.973 { 00:24:47.973 "name": "BaseBdev4", 00:24:47.973 "uuid": "b617bec3-2df8-579a-8a6c-a584f4b59c9b", 00:24:47.973 "is_configured": true, 00:24:47.973 "data_offset": 2048, 00:24:47.973 "data_size": 63488 00:24:47.973 } 00:24:47.973 ] 00:24:47.973 }' 00:24:47.973 08:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:47.973 08:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:47.973 08:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:47.973 08:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:47.973 08:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:47.973 [2024-07-23 08:38:00.473934] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:48.232 [2024-07-23 08:38:00.556155] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 
00:24:48.232 [2024-07-23 08:38:00.556205] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:48.232 [2024-07-23 08:38:00.556223] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:48.232 [2024-07-23 08:38:00.556231] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:48.232 08:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:48.232 08:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:48.232 08:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:48.232 08:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:48.232 08:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:48.232 08:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:48.232 08:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:48.232 08:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:48.232 08:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:48.232 08:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:48.232 08:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:48.232 08:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:48.491 08:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:48.491 "name": "raid_bdev1", 00:24:48.491 "uuid": 
"25f2d806-dd28-4464-9985-cf8bee12bb47", 00:24:48.491 "strip_size_kb": 0, 00:24:48.491 "state": "online", 00:24:48.491 "raid_level": "raid1", 00:24:48.491 "superblock": true, 00:24:48.491 "num_base_bdevs": 4, 00:24:48.491 "num_base_bdevs_discovered": 2, 00:24:48.491 "num_base_bdevs_operational": 2, 00:24:48.491 "base_bdevs_list": [ 00:24:48.491 { 00:24:48.491 "name": null, 00:24:48.491 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:48.491 "is_configured": false, 00:24:48.491 "data_offset": 2048, 00:24:48.491 "data_size": 63488 00:24:48.491 }, 00:24:48.491 { 00:24:48.491 "name": null, 00:24:48.491 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:48.491 "is_configured": false, 00:24:48.491 "data_offset": 2048, 00:24:48.491 "data_size": 63488 00:24:48.491 }, 00:24:48.491 { 00:24:48.491 "name": "BaseBdev3", 00:24:48.491 "uuid": "5fc361a9-f1af-5bb9-b524-9dbc49c652f0", 00:24:48.491 "is_configured": true, 00:24:48.491 "data_offset": 2048, 00:24:48.491 "data_size": 63488 00:24:48.491 }, 00:24:48.491 { 00:24:48.491 "name": "BaseBdev4", 00:24:48.491 "uuid": "b617bec3-2df8-579a-8a6c-a584f4b59c9b", 00:24:48.491 "is_configured": true, 00:24:48.491 "data_offset": 2048, 00:24:48.491 "data_size": 63488 00:24:48.491 } 00:24:48.491 ] 00:24:48.491 }' 00:24:48.491 08:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:48.491 08:38:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:48.750 08:38:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:49.008 [2024-07-23 08:38:01.411968] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:49.008 [2024-07-23 08:38:01.412028] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:49.008 [2024-07-23 08:38:01.412051] vbdev_passthru.c: 
681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x61600003ab80 00:24:49.008 [2024-07-23 08:38:01.412059] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:49.008 [2024-07-23 08:38:01.412544] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:49.008 [2024-07-23 08:38:01.412564] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:49.008 [2024-07-23 08:38:01.412665] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:49.008 [2024-07-23 08:38:01.412679] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:24:49.008 [2024-07-23 08:38:01.412694] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:24:49.008 [2024-07-23 08:38:01.412712] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:49.008 [2024-07-23 08:38:01.428561] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000403a0 00:24:49.008 spare 00:24:49.008 [2024-07-23 08:38:01.430195] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:49.008 08:38:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:24:49.945 08:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:49.945 08:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:49.945 08:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:49.945 08:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:49.945 08:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:49.945 08:38:02 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:49.945 08:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:50.204 08:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:50.204 "name": "raid_bdev1", 00:24:50.204 "uuid": "25f2d806-dd28-4464-9985-cf8bee12bb47", 00:24:50.204 "strip_size_kb": 0, 00:24:50.204 "state": "online", 00:24:50.204 "raid_level": "raid1", 00:24:50.204 "superblock": true, 00:24:50.204 "num_base_bdevs": 4, 00:24:50.204 "num_base_bdevs_discovered": 3, 00:24:50.204 "num_base_bdevs_operational": 3, 00:24:50.204 "process": { 00:24:50.204 "type": "rebuild", 00:24:50.204 "target": "spare", 00:24:50.204 "progress": { 00:24:50.204 "blocks": 22528, 00:24:50.204 "percent": 35 00:24:50.204 } 00:24:50.204 }, 00:24:50.204 "base_bdevs_list": [ 00:24:50.204 { 00:24:50.204 "name": "spare", 00:24:50.204 "uuid": "54b6a74d-f9e0-5441-9914-2ba65af33c52", 00:24:50.204 "is_configured": true, 00:24:50.204 "data_offset": 2048, 00:24:50.204 "data_size": 63488 00:24:50.204 }, 00:24:50.204 { 00:24:50.204 "name": null, 00:24:50.204 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:50.204 "is_configured": false, 00:24:50.204 "data_offset": 2048, 00:24:50.204 "data_size": 63488 00:24:50.204 }, 00:24:50.204 { 00:24:50.204 "name": "BaseBdev3", 00:24:50.204 "uuid": "5fc361a9-f1af-5bb9-b524-9dbc49c652f0", 00:24:50.204 "is_configured": true, 00:24:50.204 "data_offset": 2048, 00:24:50.204 "data_size": 63488 00:24:50.204 }, 00:24:50.204 { 00:24:50.204 "name": "BaseBdev4", 00:24:50.204 "uuid": "b617bec3-2df8-579a-8a6c-a584f4b59c9b", 00:24:50.204 "is_configured": true, 00:24:50.204 "data_offset": 2048, 00:24:50.204 "data_size": 63488 00:24:50.204 } 00:24:50.204 ] 00:24:50.204 }' 00:24:50.204 08:38:02 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:50.204 08:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:50.204 08:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:50.204 08:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:50.204 08:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:50.464 [2024-07-23 08:38:02.839784] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:50.464 [2024-07-23 08:38:02.841771] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:50.464 [2024-07-23 08:38:02.841834] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:50.464 [2024-07-23 08:38:02.841866] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:50.464 [2024-07-23 08:38:02.841876] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:50.464 08:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:50.464 08:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:50.464 08:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:50.464 08:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:50.464 08:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:50.464 08:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:50.464 08:38:02 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:50.464 08:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:50.464 08:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:50.464 08:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:50.464 08:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:50.464 08:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:50.723 08:38:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:50.723 "name": "raid_bdev1", 00:24:50.723 "uuid": "25f2d806-dd28-4464-9985-cf8bee12bb47", 00:24:50.723 "strip_size_kb": 0, 00:24:50.723 "state": "online", 00:24:50.723 "raid_level": "raid1", 00:24:50.723 "superblock": true, 00:24:50.723 "num_base_bdevs": 4, 00:24:50.723 "num_base_bdevs_discovered": 2, 00:24:50.723 "num_base_bdevs_operational": 2, 00:24:50.723 "base_bdevs_list": [ 00:24:50.723 { 00:24:50.723 "name": null, 00:24:50.723 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:50.723 "is_configured": false, 00:24:50.723 "data_offset": 2048, 00:24:50.723 "data_size": 63488 00:24:50.723 }, 00:24:50.723 { 00:24:50.723 "name": null, 00:24:50.723 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:50.723 "is_configured": false, 00:24:50.723 "data_offset": 2048, 00:24:50.723 "data_size": 63488 00:24:50.723 }, 00:24:50.723 { 00:24:50.723 "name": "BaseBdev3", 00:24:50.723 "uuid": "5fc361a9-f1af-5bb9-b524-9dbc49c652f0", 00:24:50.723 "is_configured": true, 00:24:50.723 "data_offset": 2048, 00:24:50.723 "data_size": 63488 00:24:50.723 }, 00:24:50.723 { 00:24:50.723 "name": "BaseBdev4", 00:24:50.723 "uuid": 
"b617bec3-2df8-579a-8a6c-a584f4b59c9b", 00:24:50.723 "is_configured": true, 00:24:50.723 "data_offset": 2048, 00:24:50.723 "data_size": 63488 00:24:50.723 } 00:24:50.723 ] 00:24:50.723 }' 00:24:50.723 08:38:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:50.723 08:38:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:51.290 08:38:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:51.290 08:38:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:51.290 08:38:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:51.290 08:38:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:51.290 08:38:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:51.290 08:38:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:51.290 08:38:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:51.290 08:38:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:51.290 "name": "raid_bdev1", 00:24:51.290 "uuid": "25f2d806-dd28-4464-9985-cf8bee12bb47", 00:24:51.290 "strip_size_kb": 0, 00:24:51.290 "state": "online", 00:24:51.290 "raid_level": "raid1", 00:24:51.290 "superblock": true, 00:24:51.290 "num_base_bdevs": 4, 00:24:51.290 "num_base_bdevs_discovered": 2, 00:24:51.290 "num_base_bdevs_operational": 2, 00:24:51.290 "base_bdevs_list": [ 00:24:51.290 { 00:24:51.290 "name": null, 00:24:51.290 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:51.290 "is_configured": false, 00:24:51.290 "data_offset": 2048, 00:24:51.290 "data_size": 63488 
00:24:51.290 }, 00:24:51.290 { 00:24:51.290 "name": null, 00:24:51.290 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:51.290 "is_configured": false, 00:24:51.290 "data_offset": 2048, 00:24:51.290 "data_size": 63488 00:24:51.290 }, 00:24:51.290 { 00:24:51.290 "name": "BaseBdev3", 00:24:51.290 "uuid": "5fc361a9-f1af-5bb9-b524-9dbc49c652f0", 00:24:51.290 "is_configured": true, 00:24:51.290 "data_offset": 2048, 00:24:51.290 "data_size": 63488 00:24:51.290 }, 00:24:51.290 { 00:24:51.290 "name": "BaseBdev4", 00:24:51.290 "uuid": "b617bec3-2df8-579a-8a6c-a584f4b59c9b", 00:24:51.290 "is_configured": true, 00:24:51.290 "data_offset": 2048, 00:24:51.290 "data_size": 63488 00:24:51.290 } 00:24:51.290 ] 00:24:51.290 }' 00:24:51.290 08:38:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:51.290 08:38:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:51.290 08:38:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:51.549 08:38:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:51.549 08:38:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:24:51.549 08:38:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:51.808 [2024-07-23 08:38:04.149123] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:51.808 [2024-07-23 08:38:04.149184] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:51.808 [2024-07-23 08:38:04.149203] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x61600003b180 
00:24:51.808 [2024-07-23 08:38:04.149215] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:51.808 [2024-07-23 08:38:04.149694] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:51.808 [2024-07-23 08:38:04.149714] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:51.808 [2024-07-23 08:38:04.149800] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:24:51.808 [2024-07-23 08:38:04.149817] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:24:51.808 [2024-07-23 08:38:04.149825] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:51.808 BaseBdev1 00:24:51.808 08:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:24:52.745 08:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:52.745 08:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:52.745 08:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:52.745 08:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:52.745 08:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:52.745 08:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:52.745 08:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:52.745 08:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:52.745 08:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:52.745 08:38:05 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:52.745 08:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:52.745 08:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:53.004 08:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:53.004 "name": "raid_bdev1", 00:24:53.004 "uuid": "25f2d806-dd28-4464-9985-cf8bee12bb47", 00:24:53.004 "strip_size_kb": 0, 00:24:53.004 "state": "online", 00:24:53.004 "raid_level": "raid1", 00:24:53.004 "superblock": true, 00:24:53.004 "num_base_bdevs": 4, 00:24:53.004 "num_base_bdevs_discovered": 2, 00:24:53.004 "num_base_bdevs_operational": 2, 00:24:53.004 "base_bdevs_list": [ 00:24:53.004 { 00:24:53.004 "name": null, 00:24:53.004 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:53.004 "is_configured": false, 00:24:53.004 "data_offset": 2048, 00:24:53.004 "data_size": 63488 00:24:53.004 }, 00:24:53.004 { 00:24:53.004 "name": null, 00:24:53.004 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:53.004 "is_configured": false, 00:24:53.004 "data_offset": 2048, 00:24:53.004 "data_size": 63488 00:24:53.004 }, 00:24:53.004 { 00:24:53.004 "name": "BaseBdev3", 00:24:53.004 "uuid": "5fc361a9-f1af-5bb9-b524-9dbc49c652f0", 00:24:53.004 "is_configured": true, 00:24:53.004 "data_offset": 2048, 00:24:53.004 "data_size": 63488 00:24:53.004 }, 00:24:53.004 { 00:24:53.004 "name": "BaseBdev4", 00:24:53.004 "uuid": "b617bec3-2df8-579a-8a6c-a584f4b59c9b", 00:24:53.004 "is_configured": true, 00:24:53.004 "data_offset": 2048, 00:24:53.004 "data_size": 63488 00:24:53.004 } 00:24:53.004 ] 00:24:53.004 }' 00:24:53.004 08:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:53.004 08:38:05 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@10 -- # set +x 00:24:53.611 08:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:53.611 08:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:53.611 08:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:53.611 08:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:53.611 08:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:53.611 08:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:53.611 08:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:53.611 08:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:53.611 "name": "raid_bdev1", 00:24:53.611 "uuid": "25f2d806-dd28-4464-9985-cf8bee12bb47", 00:24:53.611 "strip_size_kb": 0, 00:24:53.611 "state": "online", 00:24:53.611 "raid_level": "raid1", 00:24:53.611 "superblock": true, 00:24:53.611 "num_base_bdevs": 4, 00:24:53.611 "num_base_bdevs_discovered": 2, 00:24:53.611 "num_base_bdevs_operational": 2, 00:24:53.611 "base_bdevs_list": [ 00:24:53.611 { 00:24:53.611 "name": null, 00:24:53.611 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:53.611 "is_configured": false, 00:24:53.611 "data_offset": 2048, 00:24:53.611 "data_size": 63488 00:24:53.611 }, 00:24:53.611 { 00:24:53.611 "name": null, 00:24:53.611 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:53.611 "is_configured": false, 00:24:53.611 "data_offset": 2048, 00:24:53.611 "data_size": 63488 00:24:53.611 }, 00:24:53.611 { 00:24:53.611 "name": "BaseBdev3", 00:24:53.611 "uuid": "5fc361a9-f1af-5bb9-b524-9dbc49c652f0", 
00:24:53.611 "is_configured": true, 00:24:53.611 "data_offset": 2048, 00:24:53.611 "data_size": 63488 00:24:53.611 }, 00:24:53.611 { 00:24:53.611 "name": "BaseBdev4", 00:24:53.611 "uuid": "b617bec3-2df8-579a-8a6c-a584f4b59c9b", 00:24:53.611 "is_configured": true, 00:24:53.611 "data_offset": 2048, 00:24:53.611 "data_size": 63488 00:24:53.611 } 00:24:53.611 ] 00:24:53.611 }' 00:24:53.611 08:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:53.611 08:38:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:53.611 08:38:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:53.611 08:38:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:53.611 08:38:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:53.611 08:38:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local es=0 00:24:53.611 08:38:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:53.611 08:38:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:53.611 08:38:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:53.611 08:38:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:53.611 08:38:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:53.611 
08:38:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:53.611 08:38:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:53.611 08:38:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:53.611 08:38:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:24:53.611 08:38:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:53.900 [2024-07-23 08:38:06.230831] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:53.900 [2024-07-23 08:38:06.230973] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:24:53.900 [2024-07-23 08:38:06.230988] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:53.900 request: 00:24:53.900 { 00:24:53.900 "base_bdev": "BaseBdev1", 00:24:53.900 "raid_bdev": "raid_bdev1", 00:24:53.900 "method": "bdev_raid_add_base_bdev", 00:24:53.900 "req_id": 1 00:24:53.900 } 00:24:53.900 Got JSON-RPC error response 00:24:53.900 response: 00:24:53.900 { 00:24:53.900 "code": -22, 00:24:53.900 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:24:53.900 } 00:24:53.900 08:38:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:24:53.900 08:38:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:24:53.900 08:38:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' 
]] 00:24:53.900 08:38:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:24:53.900 08:38:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:24:54.833 08:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:54.833 08:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:54.833 08:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:54.833 08:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:54.833 08:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:54.833 08:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:54.833 08:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:54.833 08:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:54.833 08:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:54.833 08:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:54.833 08:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:54.833 08:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:55.091 08:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:55.091 "name": "raid_bdev1", 00:24:55.091 "uuid": "25f2d806-dd28-4464-9985-cf8bee12bb47", 00:24:55.091 "strip_size_kb": 0, 00:24:55.091 "state": "online", 00:24:55.091 "raid_level": "raid1", 00:24:55.091 "superblock": 
true, 00:24:55.091 "num_base_bdevs": 4, 00:24:55.091 "num_base_bdevs_discovered": 2, 00:24:55.091 "num_base_bdevs_operational": 2, 00:24:55.091 "base_bdevs_list": [ 00:24:55.091 { 00:24:55.091 "name": null, 00:24:55.091 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:55.091 "is_configured": false, 00:24:55.091 "data_offset": 2048, 00:24:55.091 "data_size": 63488 00:24:55.091 }, 00:24:55.091 { 00:24:55.091 "name": null, 00:24:55.091 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:55.091 "is_configured": false, 00:24:55.091 "data_offset": 2048, 00:24:55.091 "data_size": 63488 00:24:55.091 }, 00:24:55.091 { 00:24:55.091 "name": "BaseBdev3", 00:24:55.091 "uuid": "5fc361a9-f1af-5bb9-b524-9dbc49c652f0", 00:24:55.091 "is_configured": true, 00:24:55.091 "data_offset": 2048, 00:24:55.091 "data_size": 63488 00:24:55.091 }, 00:24:55.091 { 00:24:55.091 "name": "BaseBdev4", 00:24:55.091 "uuid": "b617bec3-2df8-579a-8a6c-a584f4b59c9b", 00:24:55.091 "is_configured": true, 00:24:55.091 "data_offset": 2048, 00:24:55.091 "data_size": 63488 00:24:55.091 } 00:24:55.091 ] 00:24:55.091 }' 00:24:55.091 08:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:55.091 08:38:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:55.657 08:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:55.657 08:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:55.657 08:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:55.657 08:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:55.657 08:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:55.657 08:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:55.657 08:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:55.657 08:38:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:55.657 "name": "raid_bdev1", 00:24:55.657 "uuid": "25f2d806-dd28-4464-9985-cf8bee12bb47", 00:24:55.657 "strip_size_kb": 0, 00:24:55.657 "state": "online", 00:24:55.657 "raid_level": "raid1", 00:24:55.657 "superblock": true, 00:24:55.657 "num_base_bdevs": 4, 00:24:55.657 "num_base_bdevs_discovered": 2, 00:24:55.657 "num_base_bdevs_operational": 2, 00:24:55.657 "base_bdevs_list": [ 00:24:55.657 { 00:24:55.657 "name": null, 00:24:55.657 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:55.657 "is_configured": false, 00:24:55.657 "data_offset": 2048, 00:24:55.657 "data_size": 63488 00:24:55.657 }, 00:24:55.657 { 00:24:55.657 "name": null, 00:24:55.657 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:55.657 "is_configured": false, 00:24:55.657 "data_offset": 2048, 00:24:55.657 "data_size": 63488 00:24:55.657 }, 00:24:55.657 { 00:24:55.657 "name": "BaseBdev3", 00:24:55.657 "uuid": "5fc361a9-f1af-5bb9-b524-9dbc49c652f0", 00:24:55.657 "is_configured": true, 00:24:55.657 "data_offset": 2048, 00:24:55.657 "data_size": 63488 00:24:55.657 }, 00:24:55.657 { 00:24:55.657 "name": "BaseBdev4", 00:24:55.657 "uuid": "b617bec3-2df8-579a-8a6c-a584f4b59c9b", 00:24:55.657 "is_configured": true, 00:24:55.657 "data_offset": 2048, 00:24:55.657 "data_size": 63488 00:24:55.657 } 00:24:55.657 ] 00:24:55.657 }' 00:24:55.657 08:38:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:55.657 08:38:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:55.657 08:38:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r 
'.process.target // "none"' 00:24:55.915 08:38:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:55.915 08:38:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 1555715 00:24:55.915 08:38:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 1555715 ']' 00:24:55.915 08:38:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 1555715 00:24:55.915 08:38:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:24:55.915 08:38:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:55.915 08:38:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1555715 00:24:55.915 08:38:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:55.915 08:38:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:55.915 08:38:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1555715' 00:24:55.915 killing process with pid 1555715 00:24:55.915 08:38:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 1555715 00:24:55.915 Received shutdown signal, test time was about 24.446709 seconds 00:24:55.915 00:24:55.915 Latency(us) 00:24:55.915 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:55.915 =================================================================================================================== 00:24:55.915 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:24:55.915 [2024-07-23 08:38:08.222320] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:55.915 [2024-07-23 08:38:08.222444] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:55.915 08:38:08 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@972 -- # wait 1555715 00:24:55.915 [2024-07-23 08:38:08.222509] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:55.915 [2024-07-23 08:38:08.222526] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x61600003a580 name raid_bdev1, state offline 00:24:56.173 [2024-07-23 08:38:08.617277] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:57.548 08:38:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:24:57.548 00:24:57.548 real 0m30.425s 00:24:57.548 user 0m45.394s 00:24:57.548 sys 0m3.519s 00:24:57.548 08:38:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:57.548 08:38:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:57.548 ************************************ 00:24:57.548 END TEST raid_rebuild_test_sb_io 00:24:57.548 ************************************ 00:24:57.548 08:38:10 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:57.548 08:38:10 bdev_raid -- bdev/bdev_raid.sh@884 -- # '[' n == y ']' 00:24:57.548 08:38:10 bdev_raid -- bdev/bdev_raid.sh@896 -- # base_blocklen=4096 00:24:57.548 08:38:10 bdev_raid -- bdev/bdev_raid.sh@898 -- # run_test raid_state_function_test_sb_4k raid_state_function_test raid1 2 true 00:24:57.548 08:38:10 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:24:57.548 08:38:10 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:57.548 08:38:10 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:57.548 ************************************ 00:24:57.548 START TEST raid_state_function_test_sb_4k 00:24:57.548 ************************************ 00:24:57.548 08:38:10 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:24:57.548 08:38:10 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:24:57.548 08:38:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:24:57.548 08:38:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:24:57.548 08:38:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:24:57.548 08:38:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:24:57.548 08:38:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:24:57.548 08:38:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:24:57.548 08:38:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:24:57.548 08:38:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:24:57.548 08:38:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:24:57.548 08:38:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:24:57.548 08:38:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:24:57.548 08:38:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:24:57.548 08:38:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:24:57.548 08:38:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:24:57.548 08:38:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # local strip_size 00:24:57.548 08:38:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:24:57.548 08:38:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 
00:24:57.548 08:38:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:24:57.548 08:38:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:24:57.548 08:38:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:24:57.548 08:38:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:24:57.548 08:38:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@244 -- # raid_pid=1561733 00:24:57.548 08:38:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1561733' 00:24:57.548 Process raid pid: 1561733 00:24:57.548 08:38:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:24:57.548 08:38:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@246 -- # waitforlisten 1561733 /var/tmp/spdk-raid.sock 00:24:57.548 08:38:10 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 1561733 ']' 00:24:57.548 08:38:10 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:57.548 08:38:10 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:57.548 08:38:10 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:57.548 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:24:57.548 08:38:10 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:57.548 08:38:10 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:57.806 [2024-07-23 08:38:10.133230] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:24:57.806 [2024-07-23 08:38:10.133311] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:24:57.806 [2024-07-23 08:38:10.258959] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:58.065 [2024-07-23 08:38:10.469300] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:58.323 [2024-07-23 08:38:10.724585] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:58.323 [2024-07-23 08:38:10.724622] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:58.581 08:38:10 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:58.581 08:38:10 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:24:58.582 08:38:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:24:58.582 [2024-07-23 08:38:11.051962] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:24:58.582 [2024-07-23 08:38:11.052008] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:24:58.582 [2024-07-23 08:38:11.052018] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:24:58.582 [2024-07-23 08:38:11.052028] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:24:58.582 08:38:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:24:58.582 08:38:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:58.582 08:38:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:58.582 08:38:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:58.582 08:38:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:58.582 08:38:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:58.582 08:38:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:58.582 08:38:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:58.582 08:38:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:58.582 08:38:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:58.582 08:38:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:58.582 08:38:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:58.841 08:38:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:58.841 "name": "Existed_Raid", 00:24:58.841 "uuid": "b39f2a36-9640-43d7-939f-34d24e74568e", 00:24:58.841 "strip_size_kb": 0, 00:24:58.841 "state": "configuring", 00:24:58.841 "raid_level": "raid1", 00:24:58.841 "superblock": true, 00:24:58.841 
"num_base_bdevs": 2, 00:24:58.841 "num_base_bdevs_discovered": 0, 00:24:58.841 "num_base_bdevs_operational": 2, 00:24:58.841 "base_bdevs_list": [ 00:24:58.841 { 00:24:58.841 "name": "BaseBdev1", 00:24:58.841 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:58.841 "is_configured": false, 00:24:58.841 "data_offset": 0, 00:24:58.841 "data_size": 0 00:24:58.841 }, 00:24:58.841 { 00:24:58.841 "name": "BaseBdev2", 00:24:58.841 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:58.841 "is_configured": false, 00:24:58.841 "data_offset": 0, 00:24:58.841 "data_size": 0 00:24:58.841 } 00:24:58.841 ] 00:24:58.841 }' 00:24:58.841 08:38:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:58.841 08:38:11 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:59.409 08:38:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:24:59.409 [2024-07-23 08:38:11.854004] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:24:59.409 [2024-07-23 08:38:11.854037] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034580 name Existed_Raid, state configuring 00:24:59.409 08:38:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:24:59.668 [2024-07-23 08:38:12.022451] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:24:59.668 [2024-07-23 08:38:12.022486] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:24:59.668 [2024-07-23 08:38:12.022495] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:24:59.668 [2024-07-23 
08:38:12.022504] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:24:59.668 08:38:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1 00:24:59.927 [2024-07-23 08:38:12.219901] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:59.927 BaseBdev1 00:24:59.927 08:38:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:24:59.927 08:38:12 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:24:59.927 08:38:12 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:59.927 08:38:12 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:24:59.927 08:38:12 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:59.927 08:38:12 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:59.927 08:38:12 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:59.927 08:38:12 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:25:00.185 [ 00:25:00.185 { 00:25:00.185 "name": "BaseBdev1", 00:25:00.185 "aliases": [ 00:25:00.185 "bab85152-db61-440a-88c0-5ac4b529dde8" 00:25:00.185 ], 00:25:00.185 "product_name": "Malloc disk", 00:25:00.185 "block_size": 4096, 00:25:00.185 "num_blocks": 8192, 00:25:00.185 "uuid": "bab85152-db61-440a-88c0-5ac4b529dde8", 00:25:00.185 "assigned_rate_limits": { 
00:25:00.186 "rw_ios_per_sec": 0, 00:25:00.186 "rw_mbytes_per_sec": 0, 00:25:00.186 "r_mbytes_per_sec": 0, 00:25:00.186 "w_mbytes_per_sec": 0 00:25:00.186 }, 00:25:00.186 "claimed": true, 00:25:00.186 "claim_type": "exclusive_write", 00:25:00.186 "zoned": false, 00:25:00.186 "supported_io_types": { 00:25:00.186 "read": true, 00:25:00.186 "write": true, 00:25:00.186 "unmap": true, 00:25:00.186 "flush": true, 00:25:00.186 "reset": true, 00:25:00.186 "nvme_admin": false, 00:25:00.186 "nvme_io": false, 00:25:00.186 "nvme_io_md": false, 00:25:00.186 "write_zeroes": true, 00:25:00.186 "zcopy": true, 00:25:00.186 "get_zone_info": false, 00:25:00.186 "zone_management": false, 00:25:00.186 "zone_append": false, 00:25:00.186 "compare": false, 00:25:00.186 "compare_and_write": false, 00:25:00.186 "abort": true, 00:25:00.186 "seek_hole": false, 00:25:00.186 "seek_data": false, 00:25:00.186 "copy": true, 00:25:00.186 "nvme_iov_md": false 00:25:00.186 }, 00:25:00.186 "memory_domains": [ 00:25:00.186 { 00:25:00.186 "dma_device_id": "system", 00:25:00.186 "dma_device_type": 1 00:25:00.186 }, 00:25:00.186 { 00:25:00.186 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:00.186 "dma_device_type": 2 00:25:00.186 } 00:25:00.186 ], 00:25:00.186 "driver_specific": {} 00:25:00.186 } 00:25:00.186 ] 00:25:00.186 08:38:12 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # return 0 00:25:00.186 08:38:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:25:00.186 08:38:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:00.186 08:38:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:00.186 08:38:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:00.186 08:38:12 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:00.186 08:38:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:00.186 08:38:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:00.186 08:38:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:00.186 08:38:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:00.186 08:38:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:00.186 08:38:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:00.186 08:38:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:00.444 08:38:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:00.444 "name": "Existed_Raid", 00:25:00.444 "uuid": "53a9c9b6-19b9-4bf4-8071-75cdf05d110d", 00:25:00.444 "strip_size_kb": 0, 00:25:00.444 "state": "configuring", 00:25:00.444 "raid_level": "raid1", 00:25:00.444 "superblock": true, 00:25:00.444 "num_base_bdevs": 2, 00:25:00.444 "num_base_bdevs_discovered": 1, 00:25:00.444 "num_base_bdevs_operational": 2, 00:25:00.444 "base_bdevs_list": [ 00:25:00.444 { 00:25:00.444 "name": "BaseBdev1", 00:25:00.444 "uuid": "bab85152-db61-440a-88c0-5ac4b529dde8", 00:25:00.444 "is_configured": true, 00:25:00.444 "data_offset": 256, 00:25:00.444 "data_size": 7936 00:25:00.444 }, 00:25:00.444 { 00:25:00.444 "name": "BaseBdev2", 00:25:00.444 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:00.444 "is_configured": false, 00:25:00.444 "data_offset": 0, 00:25:00.444 "data_size": 0 00:25:00.444 } 00:25:00.444 ] 00:25:00.444 }' 00:25:00.444 08:38:12 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:00.444 08:38:12 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:01.011 08:38:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:25:01.011 [2024-07-23 08:38:13.399068] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:25:01.011 [2024-07-23 08:38:13.399114] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034880 name Existed_Raid, state configuring 00:25:01.011 08:38:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:25:01.269 [2024-07-23 08:38:13.571554] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:01.269 [2024-07-23 08:38:13.573119] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:25:01.270 [2024-07-23 08:38:13.573152] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:25:01.270 08:38:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:25:01.270 08:38:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:25:01.270 08:38:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:25:01.270 08:38:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:01.270 08:38:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:01.270 08:38:13 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:01.270 08:38:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:01.270 08:38:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:01.270 08:38:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:01.270 08:38:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:01.270 08:38:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:01.270 08:38:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:01.270 08:38:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:01.270 08:38:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:01.270 08:38:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:01.270 "name": "Existed_Raid", 00:25:01.270 "uuid": "43d3c39b-9873-4d58-b277-bb711bb73afb", 00:25:01.270 "strip_size_kb": 0, 00:25:01.270 "state": "configuring", 00:25:01.270 "raid_level": "raid1", 00:25:01.270 "superblock": true, 00:25:01.270 "num_base_bdevs": 2, 00:25:01.270 "num_base_bdevs_discovered": 1, 00:25:01.270 "num_base_bdevs_operational": 2, 00:25:01.270 "base_bdevs_list": [ 00:25:01.270 { 00:25:01.270 "name": "BaseBdev1", 00:25:01.270 "uuid": "bab85152-db61-440a-88c0-5ac4b529dde8", 00:25:01.270 "is_configured": true, 00:25:01.270 "data_offset": 256, 00:25:01.270 "data_size": 7936 00:25:01.270 }, 00:25:01.270 { 00:25:01.270 "name": "BaseBdev2", 00:25:01.270 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:01.270 
"is_configured": false, 00:25:01.270 "data_offset": 0, 00:25:01.270 "data_size": 0 00:25:01.270 } 00:25:01.270 ] 00:25:01.270 }' 00:25:01.270 08:38:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:01.270 08:38:13 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:01.836 08:38:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2 00:25:02.094 [2024-07-23 08:38:14.462640] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:02.094 [2024-07-23 08:38:14.462857] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000035180 00:25:02.094 [2024-07-23 08:38:14.462873] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:25:02.094 [2024-07-23 08:38:14.463114] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bdf0 00:25:02.094 [2024-07-23 08:38:14.463296] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000035180 00:25:02.094 [2024-07-23 08:38:14.463309] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000035180 00:25:02.094 [2024-07-23 08:38:14.463452] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:02.094 BaseBdev2 00:25:02.094 08:38:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:25:02.094 08:38:14 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:25:02.094 08:38:14 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:25:02.095 08:38:14 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:25:02.095 08:38:14 
bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:25:02.095 08:38:14 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:25:02.095 08:38:14 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:25:02.353 08:38:14 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:25:02.353 [ 00:25:02.353 { 00:25:02.353 "name": "BaseBdev2", 00:25:02.353 "aliases": [ 00:25:02.353 "5f05ab6d-bf19-4832-b9c0-3ced77e49a8b" 00:25:02.353 ], 00:25:02.353 "product_name": "Malloc disk", 00:25:02.353 "block_size": 4096, 00:25:02.353 "num_blocks": 8192, 00:25:02.353 "uuid": "5f05ab6d-bf19-4832-b9c0-3ced77e49a8b", 00:25:02.353 "assigned_rate_limits": { 00:25:02.353 "rw_ios_per_sec": 0, 00:25:02.353 "rw_mbytes_per_sec": 0, 00:25:02.353 "r_mbytes_per_sec": 0, 00:25:02.353 "w_mbytes_per_sec": 0 00:25:02.353 }, 00:25:02.353 "claimed": true, 00:25:02.353 "claim_type": "exclusive_write", 00:25:02.353 "zoned": false, 00:25:02.353 "supported_io_types": { 00:25:02.353 "read": true, 00:25:02.353 "write": true, 00:25:02.353 "unmap": true, 00:25:02.353 "flush": true, 00:25:02.353 "reset": true, 00:25:02.353 "nvme_admin": false, 00:25:02.353 "nvme_io": false, 00:25:02.353 "nvme_io_md": false, 00:25:02.353 "write_zeroes": true, 00:25:02.353 "zcopy": true, 00:25:02.353 "get_zone_info": false, 00:25:02.353 "zone_management": false, 00:25:02.353 "zone_append": false, 00:25:02.353 "compare": false, 00:25:02.353 "compare_and_write": false, 00:25:02.353 "abort": true, 00:25:02.353 "seek_hole": false, 00:25:02.353 "seek_data": false, 00:25:02.353 "copy": true, 00:25:02.353 "nvme_iov_md": false 00:25:02.353 }, 00:25:02.353 
"memory_domains": [ 00:25:02.353 { 00:25:02.353 "dma_device_id": "system", 00:25:02.353 "dma_device_type": 1 00:25:02.353 }, 00:25:02.353 { 00:25:02.353 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:02.353 "dma_device_type": 2 00:25:02.353 } 00:25:02.353 ], 00:25:02.353 "driver_specific": {} 00:25:02.353 } 00:25:02.353 ] 00:25:02.353 08:38:14 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # return 0 00:25:02.353 08:38:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:25:02.353 08:38:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:25:02.353 08:38:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:25:02.353 08:38:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:02.353 08:38:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:02.353 08:38:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:02.353 08:38:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:02.353 08:38:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:02.353 08:38:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:02.353 08:38:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:02.353 08:38:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:02.353 08:38:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:02.354 08:38:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:02.354 08:38:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:02.612 08:38:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:02.612 "name": "Existed_Raid", 00:25:02.612 "uuid": "43d3c39b-9873-4d58-b277-bb711bb73afb", 00:25:02.612 "strip_size_kb": 0, 00:25:02.612 "state": "online", 00:25:02.612 "raid_level": "raid1", 00:25:02.612 "superblock": true, 00:25:02.612 "num_base_bdevs": 2, 00:25:02.612 "num_base_bdevs_discovered": 2, 00:25:02.612 "num_base_bdevs_operational": 2, 00:25:02.612 "base_bdevs_list": [ 00:25:02.612 { 00:25:02.612 "name": "BaseBdev1", 00:25:02.612 "uuid": "bab85152-db61-440a-88c0-5ac4b529dde8", 00:25:02.612 "is_configured": true, 00:25:02.612 "data_offset": 256, 00:25:02.612 "data_size": 7936 00:25:02.612 }, 00:25:02.612 { 00:25:02.612 "name": "BaseBdev2", 00:25:02.612 "uuid": "5f05ab6d-bf19-4832-b9c0-3ced77e49a8b", 00:25:02.612 "is_configured": true, 00:25:02.612 "data_offset": 256, 00:25:02.612 "data_size": 7936 00:25:02.612 } 00:25:02.612 ] 00:25:02.612 }' 00:25:02.612 08:38:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:02.612 08:38:14 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:03.179 08:38:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:25:03.179 08:38:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:25:03.179 08:38:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:25:03.179 08:38:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:25:03.179 08:38:15 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:25:03.179 08:38:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@198 -- # local name 00:25:03.179 08:38:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:25:03.179 08:38:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:25:03.179 [2024-07-23 08:38:15.609944] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:03.179 08:38:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:25:03.179 "name": "Existed_Raid", 00:25:03.179 "aliases": [ 00:25:03.179 "43d3c39b-9873-4d58-b277-bb711bb73afb" 00:25:03.179 ], 00:25:03.179 "product_name": "Raid Volume", 00:25:03.179 "block_size": 4096, 00:25:03.179 "num_blocks": 7936, 00:25:03.179 "uuid": "43d3c39b-9873-4d58-b277-bb711bb73afb", 00:25:03.179 "assigned_rate_limits": { 00:25:03.179 "rw_ios_per_sec": 0, 00:25:03.179 "rw_mbytes_per_sec": 0, 00:25:03.179 "r_mbytes_per_sec": 0, 00:25:03.179 "w_mbytes_per_sec": 0 00:25:03.179 }, 00:25:03.179 "claimed": false, 00:25:03.179 "zoned": false, 00:25:03.179 "supported_io_types": { 00:25:03.179 "read": true, 00:25:03.179 "write": true, 00:25:03.179 "unmap": false, 00:25:03.179 "flush": false, 00:25:03.179 "reset": true, 00:25:03.179 "nvme_admin": false, 00:25:03.179 "nvme_io": false, 00:25:03.179 "nvme_io_md": false, 00:25:03.179 "write_zeroes": true, 00:25:03.179 "zcopy": false, 00:25:03.179 "get_zone_info": false, 00:25:03.179 "zone_management": false, 00:25:03.179 "zone_append": false, 00:25:03.179 "compare": false, 00:25:03.179 "compare_and_write": false, 00:25:03.179 "abort": false, 00:25:03.179 "seek_hole": false, 00:25:03.179 "seek_data": false, 00:25:03.179 "copy": false, 00:25:03.179 "nvme_iov_md": false 00:25:03.179 
}, 00:25:03.179 "memory_domains": [ 00:25:03.179 { 00:25:03.179 "dma_device_id": "system", 00:25:03.179 "dma_device_type": 1 00:25:03.179 }, 00:25:03.179 { 00:25:03.179 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:03.179 "dma_device_type": 2 00:25:03.179 }, 00:25:03.179 { 00:25:03.179 "dma_device_id": "system", 00:25:03.179 "dma_device_type": 1 00:25:03.179 }, 00:25:03.179 { 00:25:03.179 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:03.179 "dma_device_type": 2 00:25:03.179 } 00:25:03.179 ], 00:25:03.179 "driver_specific": { 00:25:03.179 "raid": { 00:25:03.179 "uuid": "43d3c39b-9873-4d58-b277-bb711bb73afb", 00:25:03.179 "strip_size_kb": 0, 00:25:03.179 "state": "online", 00:25:03.179 "raid_level": "raid1", 00:25:03.179 "superblock": true, 00:25:03.179 "num_base_bdevs": 2, 00:25:03.179 "num_base_bdevs_discovered": 2, 00:25:03.179 "num_base_bdevs_operational": 2, 00:25:03.179 "base_bdevs_list": [ 00:25:03.179 { 00:25:03.179 "name": "BaseBdev1", 00:25:03.179 "uuid": "bab85152-db61-440a-88c0-5ac4b529dde8", 00:25:03.179 "is_configured": true, 00:25:03.179 "data_offset": 256, 00:25:03.180 "data_size": 7936 00:25:03.180 }, 00:25:03.180 { 00:25:03.180 "name": "BaseBdev2", 00:25:03.180 "uuid": "5f05ab6d-bf19-4832-b9c0-3ced77e49a8b", 00:25:03.180 "is_configured": true, 00:25:03.180 "data_offset": 256, 00:25:03.180 "data_size": 7936 00:25:03.180 } 00:25:03.180 ] 00:25:03.180 } 00:25:03.180 } 00:25:03.180 }' 00:25:03.180 08:38:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:25:03.180 08:38:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:25:03.180 BaseBdev2' 00:25:03.180 08:38:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:03.180 08:38:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:25:03.180 08:38:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:03.439 08:38:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:03.439 "name": "BaseBdev1", 00:25:03.439 "aliases": [ 00:25:03.439 "bab85152-db61-440a-88c0-5ac4b529dde8" 00:25:03.439 ], 00:25:03.439 "product_name": "Malloc disk", 00:25:03.439 "block_size": 4096, 00:25:03.439 "num_blocks": 8192, 00:25:03.439 "uuid": "bab85152-db61-440a-88c0-5ac4b529dde8", 00:25:03.439 "assigned_rate_limits": { 00:25:03.439 "rw_ios_per_sec": 0, 00:25:03.439 "rw_mbytes_per_sec": 0, 00:25:03.439 "r_mbytes_per_sec": 0, 00:25:03.439 "w_mbytes_per_sec": 0 00:25:03.439 }, 00:25:03.439 "claimed": true, 00:25:03.439 "claim_type": "exclusive_write", 00:25:03.439 "zoned": false, 00:25:03.439 "supported_io_types": { 00:25:03.439 "read": true, 00:25:03.439 "write": true, 00:25:03.439 "unmap": true, 00:25:03.439 "flush": true, 00:25:03.439 "reset": true, 00:25:03.439 "nvme_admin": false, 00:25:03.439 "nvme_io": false, 00:25:03.439 "nvme_io_md": false, 00:25:03.439 "write_zeroes": true, 00:25:03.439 "zcopy": true, 00:25:03.439 "get_zone_info": false, 00:25:03.439 "zone_management": false, 00:25:03.439 "zone_append": false, 00:25:03.439 "compare": false, 00:25:03.439 "compare_and_write": false, 00:25:03.439 "abort": true, 00:25:03.439 "seek_hole": false, 00:25:03.439 "seek_data": false, 00:25:03.439 "copy": true, 00:25:03.439 "nvme_iov_md": false 00:25:03.439 }, 00:25:03.439 "memory_domains": [ 00:25:03.439 { 00:25:03.439 "dma_device_id": "system", 00:25:03.439 "dma_device_type": 1 00:25:03.439 }, 00:25:03.439 { 00:25:03.439 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:03.439 "dma_device_type": 2 00:25:03.439 } 00:25:03.439 ], 00:25:03.439 "driver_specific": {} 00:25:03.439 }' 00:25:03.439 08:38:15 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:03.439 08:38:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:03.439 08:38:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:25:03.439 08:38:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:03.698 08:38:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:03.698 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:03.698 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:03.698 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:03.698 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:03.698 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:03.698 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:03.698 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:03.698 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:03.698 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:25:03.698 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:03.957 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:03.957 "name": "BaseBdev2", 00:25:03.957 "aliases": [ 00:25:03.957 "5f05ab6d-bf19-4832-b9c0-3ced77e49a8b" 00:25:03.957 ], 00:25:03.957 "product_name": "Malloc 
disk", 00:25:03.957 "block_size": 4096, 00:25:03.957 "num_blocks": 8192, 00:25:03.957 "uuid": "5f05ab6d-bf19-4832-b9c0-3ced77e49a8b", 00:25:03.957 "assigned_rate_limits": { 00:25:03.957 "rw_ios_per_sec": 0, 00:25:03.957 "rw_mbytes_per_sec": 0, 00:25:03.957 "r_mbytes_per_sec": 0, 00:25:03.957 "w_mbytes_per_sec": 0 00:25:03.957 }, 00:25:03.957 "claimed": true, 00:25:03.957 "claim_type": "exclusive_write", 00:25:03.957 "zoned": false, 00:25:03.957 "supported_io_types": { 00:25:03.957 "read": true, 00:25:03.957 "write": true, 00:25:03.957 "unmap": true, 00:25:03.957 "flush": true, 00:25:03.957 "reset": true, 00:25:03.957 "nvme_admin": false, 00:25:03.957 "nvme_io": false, 00:25:03.957 "nvme_io_md": false, 00:25:03.957 "write_zeroes": true, 00:25:03.957 "zcopy": true, 00:25:03.957 "get_zone_info": false, 00:25:03.957 "zone_management": false, 00:25:03.957 "zone_append": false, 00:25:03.957 "compare": false, 00:25:03.957 "compare_and_write": false, 00:25:03.957 "abort": true, 00:25:03.957 "seek_hole": false, 00:25:03.957 "seek_data": false, 00:25:03.957 "copy": true, 00:25:03.957 "nvme_iov_md": false 00:25:03.957 }, 00:25:03.957 "memory_domains": [ 00:25:03.957 { 00:25:03.957 "dma_device_id": "system", 00:25:03.957 "dma_device_type": 1 00:25:03.957 }, 00:25:03.957 { 00:25:03.957 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:03.957 "dma_device_type": 2 00:25:03.957 } 00:25:03.957 ], 00:25:03.957 "driver_specific": {} 00:25:03.957 }' 00:25:03.957 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:03.957 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:03.957 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:25:03.957 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:03.957 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:25:04.216 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:04.216 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:04.216 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:04.216 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:04.216 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:04.216 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:04.216 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:04.216 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:25:04.475 [2024-07-23 08:38:16.776830] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:04.475 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@275 -- # local expected_state 00:25:04.475 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:25:04.475 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:25:04.475 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:25:04.475 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:25:04.475 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:25:04.475 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:04.475 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 
-- # local expected_state=online 00:25:04.475 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:04.475 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:04.475 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:04.475 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:04.475 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:04.475 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:04.475 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:04.475 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:04.475 08:38:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:04.734 08:38:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:04.734 "name": "Existed_Raid", 00:25:04.734 "uuid": "43d3c39b-9873-4d58-b277-bb711bb73afb", 00:25:04.734 "strip_size_kb": 0, 00:25:04.734 "state": "online", 00:25:04.734 "raid_level": "raid1", 00:25:04.734 "superblock": true, 00:25:04.734 "num_base_bdevs": 2, 00:25:04.734 "num_base_bdevs_discovered": 1, 00:25:04.734 "num_base_bdevs_operational": 1, 00:25:04.734 "base_bdevs_list": [ 00:25:04.734 { 00:25:04.734 "name": null, 00:25:04.734 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:04.734 "is_configured": false, 00:25:04.734 "data_offset": 256, 00:25:04.734 "data_size": 7936 00:25:04.734 }, 00:25:04.734 { 00:25:04.734 "name": "BaseBdev2", 00:25:04.734 "uuid": 
"5f05ab6d-bf19-4832-b9c0-3ced77e49a8b", 00:25:04.734 "is_configured": true, 00:25:04.734 "data_offset": 256, 00:25:04.734 "data_size": 7936 00:25:04.734 } 00:25:04.734 ] 00:25:04.734 }' 00:25:04.734 08:38:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:04.734 08:38:17 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:04.992 08:38:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:25:04.992 08:38:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:25:04.992 08:38:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:04.992 08:38:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:25:05.251 08:38:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:25:05.251 08:38:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:25:05.251 08:38:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:25:05.509 [2024-07-23 08:38:17.836359] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:25:05.509 [2024-07-23 08:38:17.836460] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:05.509 [2024-07-23 08:38:17.934672] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:05.509 [2024-07-23 08:38:17.934718] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:05.509 [2024-07-23 08:38:17.934730] bdev_raid.c: 378:raid_bdev_cleanup: 
*DEBUG*: raid_bdev_cleanup, 0x616000035180 name Existed_Raid, state offline 00:25:05.509 08:38:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:25:05.509 08:38:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:25:05.509 08:38:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:05.509 08:38:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:25:05.768 08:38:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:25:05.768 08:38:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:25:05.768 08:38:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:25:05.768 08:38:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@341 -- # killprocess 1561733 00:25:05.768 08:38:18 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 1561733 ']' 00:25:05.768 08:38:18 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 1561733 00:25:05.768 08:38:18 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:25:05.768 08:38:18 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:05.768 08:38:18 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1561733 00:25:05.768 08:38:18 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:05.768 08:38:18 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:05.768 08:38:18 bdev_raid.raid_state_function_test_sb_4k -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 1561733' 00:25:05.768 killing process with pid 1561733 00:25:05.768 08:38:18 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@967 -- # kill 1561733 00:25:05.768 [2024-07-23 08:38:18.185997] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:05.768 08:38:18 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@972 -- # wait 1561733 00:25:05.768 [2024-07-23 08:38:18.204722] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:07.161 08:38:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@343 -- # return 0 00:25:07.161 00:25:07.161 real 0m9.399s 00:25:07.161 user 0m15.711s 00:25:07.161 sys 0m1.471s 00:25:07.161 08:38:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:07.161 08:38:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:07.161 ************************************ 00:25:07.161 END TEST raid_state_function_test_sb_4k 00:25:07.161 ************************************ 00:25:07.161 08:38:19 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:07.161 08:38:19 bdev_raid -- bdev/bdev_raid.sh@899 -- # run_test raid_superblock_test_4k raid_superblock_test raid1 2 00:25:07.161 08:38:19 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:25:07.161 08:38:19 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:07.161 08:38:19 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:07.161 ************************************ 00:25:07.161 START TEST raid_superblock_test_4k 00:25:07.161 ************************************ 00:25:07.161 08:38:19 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:25:07.161 08:38:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:25:07.161 08:38:19 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:25:07.161 08:38:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:25:07.161 08:38:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:25:07.161 08:38:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:25:07.161 08:38:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:25:07.161 08:38:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:25:07.161 08:38:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:25:07.161 08:38:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:25:07.161 08:38:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@398 -- # local strip_size 00:25:07.161 08:38:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:25:07.161 08:38:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:25:07.161 08:38:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:25:07.161 08:38:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:25:07.161 08:38:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:25:07.161 08:38:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@411 -- # raid_pid=1563601 00:25:07.161 08:38:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@412 -- # waitforlisten 1563601 /var/tmp/spdk-raid.sock 00:25:07.161 08:38:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:25:07.161 08:38:19 bdev_raid.raid_superblock_test_4k -- 
common/autotest_common.sh@829 -- # '[' -z 1563601 ']' 00:25:07.161 08:38:19 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:07.161 08:38:19 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:07.161 08:38:19 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:07.161 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:07.161 08:38:19 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:07.161 08:38:19 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:25:07.161 [2024-07-23 08:38:19.592922] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:25:07.161 [2024-07-23 08:38:19.593010] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1563601 ] 00:25:07.432 [2024-07-23 08:38:19.715777] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:07.432 [2024-07-23 08:38:19.929399] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:07.690 [2024-07-23 08:38:20.184769] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:07.690 [2024-07-23 08:38:20.184801] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:07.949 08:38:20 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:07.949 08:38:20 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@862 -- # return 0 00:25:07.949 08:38:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:25:07.949 08:38:20 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:25:07.949 08:38:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:25:07.949 08:38:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:25:07.949 08:38:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:25:07.949 08:38:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:25:07.949 08:38:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:25:07.949 08:38:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:25:07.949 08:38:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc1 00:25:08.207 malloc1 00:25:08.207 08:38:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:25:08.207 [2024-07-23 08:38:20.725971] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:25:08.207 [2024-07-23 08:38:20.726023] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:08.207 [2024-07-23 08:38:20.726045] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034880 00:25:08.208 [2024-07-23 08:38:20.726058] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:08.466 [2024-07-23 08:38:20.728051] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:08.466 [2024-07-23 08:38:20.728075] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: pt1 00:25:08.466 pt1 00:25:08.466 08:38:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:25:08.466 08:38:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:25:08.466 08:38:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:25:08.466 08:38:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:25:08.466 08:38:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:25:08.466 08:38:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:25:08.466 08:38:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:25:08.466 08:38:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:25:08.466 08:38:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc2 00:25:08.466 malloc2 00:25:08.466 08:38:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:25:08.725 [2024-07-23 08:38:21.119505] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:25:08.725 [2024-07-23 08:38:21.119560] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:08.725 [2024-07-23 08:38:21.119582] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035480 00:25:08.725 [2024-07-23 08:38:21.119592] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:08.725 [2024-07-23 08:38:21.121603] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:08.725 [2024-07-23 08:38:21.121636] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:25:08.725 pt2 00:25:08.725 08:38:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:25:08.725 08:38:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:25:08.725 08:38:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:25:08.983 [2024-07-23 08:38:21.287962] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:25:08.983 [2024-07-23 08:38:21.289604] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:25:08.983 [2024-07-23 08:38:21.289818] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000035a80 00:25:08.983 [2024-07-23 08:38:21.289832] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:25:08.983 [2024-07-23 08:38:21.290089] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bdf0 00:25:08.983 [2024-07-23 08:38:21.290298] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000035a80 00:25:08.983 [2024-07-23 08:38:21.290311] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000035a80 00:25:08.983 [2024-07-23 08:38:21.290484] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:08.983 08:38:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:08.983 08:38:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:08.983 08:38:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:25:08.983 08:38:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:08.983 08:38:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:08.983 08:38:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:08.983 08:38:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:08.983 08:38:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:08.983 08:38:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:08.983 08:38:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:08.983 08:38:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:08.983 08:38:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:08.983 08:38:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:08.983 "name": "raid_bdev1", 00:25:08.983 "uuid": "eb424fe9-63f0-4380-bb51-05eef1efc5f8", 00:25:08.983 "strip_size_kb": 0, 00:25:08.983 "state": "online", 00:25:08.983 "raid_level": "raid1", 00:25:08.983 "superblock": true, 00:25:08.983 "num_base_bdevs": 2, 00:25:08.983 "num_base_bdevs_discovered": 2, 00:25:08.983 "num_base_bdevs_operational": 2, 00:25:08.983 "base_bdevs_list": [ 00:25:08.983 { 00:25:08.983 "name": "pt1", 00:25:08.983 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:08.983 "is_configured": true, 00:25:08.983 "data_offset": 256, 00:25:08.983 "data_size": 7936 00:25:08.983 }, 00:25:08.983 { 00:25:08.983 "name": "pt2", 00:25:08.983 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:08.983 "is_configured": true, 00:25:08.983 "data_offset": 
256, 00:25:08.983 "data_size": 7936 00:25:08.983 } 00:25:08.983 ] 00:25:08.983 }' 00:25:08.983 08:38:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:08.983 08:38:21 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:25:09.550 08:38:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:25:09.550 08:38:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:25:09.550 08:38:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:25:09.550 08:38:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:25:09.550 08:38:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:25:09.550 08:38:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:25:09.550 08:38:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:09.550 08:38:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:25:09.809 [2024-07-23 08:38:22.130392] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:09.809 08:38:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:25:09.809 "name": "raid_bdev1", 00:25:09.809 "aliases": [ 00:25:09.809 "eb424fe9-63f0-4380-bb51-05eef1efc5f8" 00:25:09.809 ], 00:25:09.809 "product_name": "Raid Volume", 00:25:09.809 "block_size": 4096, 00:25:09.809 "num_blocks": 7936, 00:25:09.809 "uuid": "eb424fe9-63f0-4380-bb51-05eef1efc5f8", 00:25:09.809 "assigned_rate_limits": { 00:25:09.809 "rw_ios_per_sec": 0, 00:25:09.809 "rw_mbytes_per_sec": 0, 00:25:09.809 "r_mbytes_per_sec": 0, 00:25:09.809 "w_mbytes_per_sec": 0 00:25:09.809 }, 00:25:09.809 "claimed": false, 
00:25:09.809 "zoned": false, 00:25:09.809 "supported_io_types": { 00:25:09.809 "read": true, 00:25:09.809 "write": true, 00:25:09.809 "unmap": false, 00:25:09.809 "flush": false, 00:25:09.809 "reset": true, 00:25:09.809 "nvme_admin": false, 00:25:09.809 "nvme_io": false, 00:25:09.809 "nvme_io_md": false, 00:25:09.809 "write_zeroes": true, 00:25:09.809 "zcopy": false, 00:25:09.809 "get_zone_info": false, 00:25:09.809 "zone_management": false, 00:25:09.809 "zone_append": false, 00:25:09.809 "compare": false, 00:25:09.809 "compare_and_write": false, 00:25:09.809 "abort": false, 00:25:09.809 "seek_hole": false, 00:25:09.809 "seek_data": false, 00:25:09.809 "copy": false, 00:25:09.809 "nvme_iov_md": false 00:25:09.809 }, 00:25:09.809 "memory_domains": [ 00:25:09.809 { 00:25:09.809 "dma_device_id": "system", 00:25:09.809 "dma_device_type": 1 00:25:09.809 }, 00:25:09.809 { 00:25:09.809 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:09.809 "dma_device_type": 2 00:25:09.809 }, 00:25:09.809 { 00:25:09.809 "dma_device_id": "system", 00:25:09.809 "dma_device_type": 1 00:25:09.809 }, 00:25:09.809 { 00:25:09.809 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:09.809 "dma_device_type": 2 00:25:09.809 } 00:25:09.809 ], 00:25:09.809 "driver_specific": { 00:25:09.809 "raid": { 00:25:09.809 "uuid": "eb424fe9-63f0-4380-bb51-05eef1efc5f8", 00:25:09.809 "strip_size_kb": 0, 00:25:09.809 "state": "online", 00:25:09.809 "raid_level": "raid1", 00:25:09.809 "superblock": true, 00:25:09.809 "num_base_bdevs": 2, 00:25:09.809 "num_base_bdevs_discovered": 2, 00:25:09.809 "num_base_bdevs_operational": 2, 00:25:09.809 "base_bdevs_list": [ 00:25:09.809 { 00:25:09.809 "name": "pt1", 00:25:09.809 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:09.809 "is_configured": true, 00:25:09.809 "data_offset": 256, 00:25:09.809 "data_size": 7936 00:25:09.809 }, 00:25:09.809 { 00:25:09.809 "name": "pt2", 00:25:09.809 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:09.809 "is_configured": true, 
00:25:09.809 "data_offset": 256, 00:25:09.809 "data_size": 7936 00:25:09.809 } 00:25:09.809 ] 00:25:09.809 } 00:25:09.809 } 00:25:09.809 }' 00:25:09.809 08:38:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:25:09.809 08:38:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:25:09.809 pt2' 00:25:09.809 08:38:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:09.809 08:38:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:25:09.809 08:38:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:10.067 08:38:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:10.067 "name": "pt1", 00:25:10.067 "aliases": [ 00:25:10.067 "00000000-0000-0000-0000-000000000001" 00:25:10.067 ], 00:25:10.067 "product_name": "passthru", 00:25:10.067 "block_size": 4096, 00:25:10.067 "num_blocks": 8192, 00:25:10.067 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:10.067 "assigned_rate_limits": { 00:25:10.067 "rw_ios_per_sec": 0, 00:25:10.067 "rw_mbytes_per_sec": 0, 00:25:10.067 "r_mbytes_per_sec": 0, 00:25:10.067 "w_mbytes_per_sec": 0 00:25:10.067 }, 00:25:10.067 "claimed": true, 00:25:10.067 "claim_type": "exclusive_write", 00:25:10.067 "zoned": false, 00:25:10.067 "supported_io_types": { 00:25:10.067 "read": true, 00:25:10.067 "write": true, 00:25:10.067 "unmap": true, 00:25:10.067 "flush": true, 00:25:10.067 "reset": true, 00:25:10.067 "nvme_admin": false, 00:25:10.067 "nvme_io": false, 00:25:10.067 "nvme_io_md": false, 00:25:10.067 "write_zeroes": true, 00:25:10.067 "zcopy": true, 00:25:10.067 "get_zone_info": false, 00:25:10.067 "zone_management": false, 00:25:10.067 "zone_append": false, 
00:25:10.067 "compare": false, 00:25:10.067 "compare_and_write": false, 00:25:10.067 "abort": true, 00:25:10.067 "seek_hole": false, 00:25:10.067 "seek_data": false, 00:25:10.067 "copy": true, 00:25:10.067 "nvme_iov_md": false 00:25:10.067 }, 00:25:10.067 "memory_domains": [ 00:25:10.067 { 00:25:10.067 "dma_device_id": "system", 00:25:10.068 "dma_device_type": 1 00:25:10.068 }, 00:25:10.068 { 00:25:10.068 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:10.068 "dma_device_type": 2 00:25:10.068 } 00:25:10.068 ], 00:25:10.068 "driver_specific": { 00:25:10.068 "passthru": { 00:25:10.068 "name": "pt1", 00:25:10.068 "base_bdev_name": "malloc1" 00:25:10.068 } 00:25:10.068 } 00:25:10.068 }' 00:25:10.068 08:38:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:10.068 08:38:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:10.068 08:38:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:25:10.068 08:38:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:10.068 08:38:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:10.068 08:38:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:10.068 08:38:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:10.068 08:38:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:10.326 08:38:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:10.326 08:38:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:10.326 08:38:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:10.326 08:38:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:10.326 08:38:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- 
# for name in $base_bdev_names 00:25:10.326 08:38:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:10.326 08:38:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:25:10.585 08:38:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:10.585 "name": "pt2", 00:25:10.585 "aliases": [ 00:25:10.585 "00000000-0000-0000-0000-000000000002" 00:25:10.585 ], 00:25:10.585 "product_name": "passthru", 00:25:10.585 "block_size": 4096, 00:25:10.585 "num_blocks": 8192, 00:25:10.585 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:10.585 "assigned_rate_limits": { 00:25:10.585 "rw_ios_per_sec": 0, 00:25:10.585 "rw_mbytes_per_sec": 0, 00:25:10.585 "r_mbytes_per_sec": 0, 00:25:10.585 "w_mbytes_per_sec": 0 00:25:10.585 }, 00:25:10.585 "claimed": true, 00:25:10.585 "claim_type": "exclusive_write", 00:25:10.585 "zoned": false, 00:25:10.585 "supported_io_types": { 00:25:10.585 "read": true, 00:25:10.585 "write": true, 00:25:10.585 "unmap": true, 00:25:10.585 "flush": true, 00:25:10.585 "reset": true, 00:25:10.585 "nvme_admin": false, 00:25:10.585 "nvme_io": false, 00:25:10.585 "nvme_io_md": false, 00:25:10.585 "write_zeroes": true, 00:25:10.585 "zcopy": true, 00:25:10.585 "get_zone_info": false, 00:25:10.585 "zone_management": false, 00:25:10.585 "zone_append": false, 00:25:10.585 "compare": false, 00:25:10.585 "compare_and_write": false, 00:25:10.585 "abort": true, 00:25:10.585 "seek_hole": false, 00:25:10.585 "seek_data": false, 00:25:10.585 "copy": true, 00:25:10.585 "nvme_iov_md": false 00:25:10.585 }, 00:25:10.585 "memory_domains": [ 00:25:10.585 { 00:25:10.585 "dma_device_id": "system", 00:25:10.585 "dma_device_type": 1 00:25:10.585 }, 00:25:10.585 { 00:25:10.585 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:10.585 "dma_device_type": 2 00:25:10.585 } 00:25:10.585 ], 00:25:10.585 
"driver_specific": { 00:25:10.585 "passthru": { 00:25:10.585 "name": "pt2", 00:25:10.585 "base_bdev_name": "malloc2" 00:25:10.585 } 00:25:10.585 } 00:25:10.585 }' 00:25:10.585 08:38:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:10.585 08:38:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:10.585 08:38:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:25:10.585 08:38:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:10.585 08:38:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:10.585 08:38:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:10.585 08:38:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:10.585 08:38:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:10.585 08:38:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:10.585 08:38:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:10.585 08:38:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:10.844 08:38:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:10.844 08:38:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:10.844 08:38:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:25:10.844 [2024-07-23 08:38:23.293463] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:10.844 08:38:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=eb424fe9-63f0-4380-bb51-05eef1efc5f8 00:25:10.844 08:38:23 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@435 -- # '[' -z eb424fe9-63f0-4380-bb51-05eef1efc5f8 ']' 00:25:10.844 08:38:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:11.101 [2024-07-23 08:38:23.469667] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:11.101 [2024-07-23 08:38:23.469693] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:11.101 [2024-07-23 08:38:23.469770] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:11.101 [2024-07-23 08:38:23.469827] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:11.101 [2024-07-23 08:38:23.469842] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000035a80 name raid_bdev1, state offline 00:25:11.101 08:38:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:11.101 08:38:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:25:11.360 08:38:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:25:11.360 08:38:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:25:11.360 08:38:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:25:11.360 08:38:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:25:11.360 08:38:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:25:11.360 08:38:23 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:25:11.619 08:38:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:25:11.619 08:38:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:25:11.877 08:38:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:25:11.878 08:38:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:25:11.878 08:38:24 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@648 -- # local es=0 00:25:11.878 08:38:24 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:25:11.878 08:38:24 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:11.878 08:38:24 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:11.878 08:38:24 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:11.878 08:38:24 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:11.878 08:38:24 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:11.878 08:38:24 bdev_raid.raid_superblock_test_4k -- 
common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:11.878 08:38:24 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:11.878 08:38:24 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:25:11.878 08:38:24 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:25:11.878 [2024-07-23 08:38:24.319908] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:25:11.878 [2024-07-23 08:38:24.321504] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:25:11.878 [2024-07-23 08:38:24.321568] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:25:11.878 [2024-07-23 08:38:24.321623] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:25:11.878 [2024-07-23 08:38:24.321639] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:11.878 [2024-07-23 08:38:24.321650] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036080 name raid_bdev1, state configuring 00:25:11.878 request: 00:25:11.878 { 00:25:11.878 "name": "raid_bdev1", 00:25:11.878 "raid_level": "raid1", 00:25:11.878 "base_bdevs": [ 00:25:11.878 "malloc1", 00:25:11.878 "malloc2" 00:25:11.878 ], 00:25:11.878 "superblock": false, 00:25:11.878 "method": "bdev_raid_create", 00:25:11.878 "req_id": 1 00:25:11.878 } 00:25:11.878 Got JSON-RPC error response 00:25:11.878 response: 00:25:11.878 { 00:25:11.878 "code": -17, 00:25:11.878 "message": "Failed to create RAID bdev raid_bdev1: File exists" 
00:25:11.878 } 00:25:11.878 08:38:24 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # es=1 00:25:11.878 08:38:24 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:25:11.878 08:38:24 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:25:11.878 08:38:24 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:25:11.878 08:38:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:25:11.878 08:38:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:12.136 08:38:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:25:12.136 08:38:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:25:12.136 08:38:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:25:12.395 [2024-07-23 08:38:24.664726] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:25:12.395 [2024-07-23 08:38:24.664779] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:12.395 [2024-07-23 08:38:24.664796] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036680 00:25:12.395 [2024-07-23 08:38:24.664807] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:12.395 [2024-07-23 08:38:24.666778] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:12.395 [2024-07-23 08:38:24.666807] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:25:12.395 [2024-07-23 08:38:24.666902] bdev_raid.c:3844:raid_bdev_examine_cont: 
*DEBUG*: raid superblock found on bdev pt1 00:25:12.395 [2024-07-23 08:38:24.666976] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:25:12.395 pt1 00:25:12.395 08:38:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:25:12.395 08:38:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:12.395 08:38:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:12.395 08:38:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:12.395 08:38:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:12.395 08:38:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:12.395 08:38:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:12.395 08:38:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:12.395 08:38:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:12.395 08:38:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:12.395 08:38:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:12.395 08:38:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:12.395 08:38:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:12.395 "name": "raid_bdev1", 00:25:12.395 "uuid": "eb424fe9-63f0-4380-bb51-05eef1efc5f8", 00:25:12.395 "strip_size_kb": 0, 00:25:12.395 "state": "configuring", 00:25:12.395 "raid_level": "raid1", 00:25:12.395 "superblock": true, 
00:25:12.395 "num_base_bdevs": 2, 00:25:12.395 "num_base_bdevs_discovered": 1, 00:25:12.395 "num_base_bdevs_operational": 2, 00:25:12.395 "base_bdevs_list": [ 00:25:12.395 { 00:25:12.395 "name": "pt1", 00:25:12.395 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:12.395 "is_configured": true, 00:25:12.395 "data_offset": 256, 00:25:12.395 "data_size": 7936 00:25:12.395 }, 00:25:12.395 { 00:25:12.395 "name": null, 00:25:12.395 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:12.395 "is_configured": false, 00:25:12.395 "data_offset": 256, 00:25:12.395 "data_size": 7936 00:25:12.395 } 00:25:12.395 ] 00:25:12.395 }' 00:25:12.396 08:38:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:12.396 08:38:24 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:25:12.964 08:38:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:25:12.964 08:38:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:25:12.964 08:38:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:25:12.965 08:38:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:25:13.224 [2024-07-23 08:38:25.486915] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:25:13.224 [2024-07-23 08:38:25.486977] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:13.224 [2024-07-23 08:38:25.486996] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036f80 00:25:13.224 [2024-07-23 08:38:25.487006] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:13.224 [2024-07-23 08:38:25.487448] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:25:13.224 [2024-07-23 08:38:25.487469] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:25:13.224 [2024-07-23 08:38:25.487541] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:25:13.224 [2024-07-23 08:38:25.487564] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:25:13.224 [2024-07-23 08:38:25.487711] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000036c80 00:25:13.224 [2024-07-23 08:38:25.487725] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:25:13.224 [2024-07-23 08:38:25.487941] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bec0 00:25:13.224 [2024-07-23 08:38:25.488114] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000036c80 00:25:13.224 [2024-07-23 08:38:25.488123] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000036c80 00:25:13.224 [2024-07-23 08:38:25.488264] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:13.224 pt2 00:25:13.224 08:38:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:25:13.224 08:38:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:25:13.224 08:38:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:13.224 08:38:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:13.224 08:38:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:13.224 08:38:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:13.224 08:38:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:13.224 08:38:25 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:13.224 08:38:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:13.224 08:38:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:13.224 08:38:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:13.224 08:38:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:13.224 08:38:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:13.224 08:38:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:13.224 08:38:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:13.224 "name": "raid_bdev1", 00:25:13.224 "uuid": "eb424fe9-63f0-4380-bb51-05eef1efc5f8", 00:25:13.224 "strip_size_kb": 0, 00:25:13.224 "state": "online", 00:25:13.224 "raid_level": "raid1", 00:25:13.224 "superblock": true, 00:25:13.224 "num_base_bdevs": 2, 00:25:13.224 "num_base_bdevs_discovered": 2, 00:25:13.224 "num_base_bdevs_operational": 2, 00:25:13.224 "base_bdevs_list": [ 00:25:13.224 { 00:25:13.224 "name": "pt1", 00:25:13.224 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:13.224 "is_configured": true, 00:25:13.224 "data_offset": 256, 00:25:13.224 "data_size": 7936 00:25:13.224 }, 00:25:13.224 { 00:25:13.224 "name": "pt2", 00:25:13.224 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:13.224 "is_configured": true, 00:25:13.224 "data_offset": 256, 00:25:13.224 "data_size": 7936 00:25:13.224 } 00:25:13.224 ] 00:25:13.224 }' 00:25:13.224 08:38:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:13.224 08:38:25 bdev_raid.raid_superblock_test_4k -- 
common/autotest_common.sh@10 -- # set +x 00:25:13.792 08:38:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:25:13.792 08:38:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:25:13.792 08:38:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:25:13.792 08:38:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:25:13.792 08:38:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:25:13.792 08:38:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:25:13.792 08:38:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:13.792 08:38:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:25:14.051 [2024-07-23 08:38:26.313324] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:14.051 08:38:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:25:14.051 "name": "raid_bdev1", 00:25:14.051 "aliases": [ 00:25:14.051 "eb424fe9-63f0-4380-bb51-05eef1efc5f8" 00:25:14.051 ], 00:25:14.051 "product_name": "Raid Volume", 00:25:14.051 "block_size": 4096, 00:25:14.051 "num_blocks": 7936, 00:25:14.051 "uuid": "eb424fe9-63f0-4380-bb51-05eef1efc5f8", 00:25:14.051 "assigned_rate_limits": { 00:25:14.051 "rw_ios_per_sec": 0, 00:25:14.051 "rw_mbytes_per_sec": 0, 00:25:14.051 "r_mbytes_per_sec": 0, 00:25:14.051 "w_mbytes_per_sec": 0 00:25:14.051 }, 00:25:14.051 "claimed": false, 00:25:14.051 "zoned": false, 00:25:14.051 "supported_io_types": { 00:25:14.051 "read": true, 00:25:14.051 "write": true, 00:25:14.051 "unmap": false, 00:25:14.051 "flush": false, 00:25:14.051 "reset": true, 00:25:14.051 "nvme_admin": false, 
00:25:14.051 "nvme_io": false, 00:25:14.051 "nvme_io_md": false, 00:25:14.051 "write_zeroes": true, 00:25:14.051 "zcopy": false, 00:25:14.051 "get_zone_info": false, 00:25:14.051 "zone_management": false, 00:25:14.051 "zone_append": false, 00:25:14.051 "compare": false, 00:25:14.051 "compare_and_write": false, 00:25:14.051 "abort": false, 00:25:14.051 "seek_hole": false, 00:25:14.051 "seek_data": false, 00:25:14.051 "copy": false, 00:25:14.051 "nvme_iov_md": false 00:25:14.051 }, 00:25:14.051 "memory_domains": [ 00:25:14.051 { 00:25:14.051 "dma_device_id": "system", 00:25:14.051 "dma_device_type": 1 00:25:14.051 }, 00:25:14.051 { 00:25:14.051 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:14.051 "dma_device_type": 2 00:25:14.051 }, 00:25:14.051 { 00:25:14.051 "dma_device_id": "system", 00:25:14.051 "dma_device_type": 1 00:25:14.051 }, 00:25:14.051 { 00:25:14.051 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:14.051 "dma_device_type": 2 00:25:14.051 } 00:25:14.051 ], 00:25:14.051 "driver_specific": { 00:25:14.051 "raid": { 00:25:14.051 "uuid": "eb424fe9-63f0-4380-bb51-05eef1efc5f8", 00:25:14.051 "strip_size_kb": 0, 00:25:14.051 "state": "online", 00:25:14.051 "raid_level": "raid1", 00:25:14.051 "superblock": true, 00:25:14.051 "num_base_bdevs": 2, 00:25:14.051 "num_base_bdevs_discovered": 2, 00:25:14.051 "num_base_bdevs_operational": 2, 00:25:14.051 "base_bdevs_list": [ 00:25:14.051 { 00:25:14.051 "name": "pt1", 00:25:14.051 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:14.051 "is_configured": true, 00:25:14.051 "data_offset": 256, 00:25:14.051 "data_size": 7936 00:25:14.051 }, 00:25:14.051 { 00:25:14.051 "name": "pt2", 00:25:14.051 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:14.051 "is_configured": true, 00:25:14.051 "data_offset": 256, 00:25:14.051 "data_size": 7936 00:25:14.051 } 00:25:14.051 ] 00:25:14.051 } 00:25:14.051 } 00:25:14.051 }' 00:25:14.051 08:38:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:25:14.051 08:38:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:25:14.051 pt2' 00:25:14.051 08:38:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:14.051 08:38:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:25:14.051 08:38:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:14.051 08:38:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:14.051 "name": "pt1", 00:25:14.051 "aliases": [ 00:25:14.051 "00000000-0000-0000-0000-000000000001" 00:25:14.051 ], 00:25:14.051 "product_name": "passthru", 00:25:14.051 "block_size": 4096, 00:25:14.051 "num_blocks": 8192, 00:25:14.051 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:14.051 "assigned_rate_limits": { 00:25:14.051 "rw_ios_per_sec": 0, 00:25:14.051 "rw_mbytes_per_sec": 0, 00:25:14.051 "r_mbytes_per_sec": 0, 00:25:14.051 "w_mbytes_per_sec": 0 00:25:14.051 }, 00:25:14.051 "claimed": true, 00:25:14.051 "claim_type": "exclusive_write", 00:25:14.051 "zoned": false, 00:25:14.051 "supported_io_types": { 00:25:14.051 "read": true, 00:25:14.051 "write": true, 00:25:14.051 "unmap": true, 00:25:14.051 "flush": true, 00:25:14.051 "reset": true, 00:25:14.051 "nvme_admin": false, 00:25:14.051 "nvme_io": false, 00:25:14.051 "nvme_io_md": false, 00:25:14.051 "write_zeroes": true, 00:25:14.051 "zcopy": true, 00:25:14.051 "get_zone_info": false, 00:25:14.051 "zone_management": false, 00:25:14.051 "zone_append": false, 00:25:14.051 "compare": false, 00:25:14.051 "compare_and_write": false, 00:25:14.051 "abort": true, 00:25:14.051 "seek_hole": false, 00:25:14.051 "seek_data": false, 00:25:14.051 "copy": true, 00:25:14.051 "nvme_iov_md": false 00:25:14.051 
}, 00:25:14.051 "memory_domains": [ 00:25:14.051 { 00:25:14.051 "dma_device_id": "system", 00:25:14.051 "dma_device_type": 1 00:25:14.051 }, 00:25:14.051 { 00:25:14.051 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:14.051 "dma_device_type": 2 00:25:14.051 } 00:25:14.051 ], 00:25:14.051 "driver_specific": { 00:25:14.051 "passthru": { 00:25:14.051 "name": "pt1", 00:25:14.051 "base_bdev_name": "malloc1" 00:25:14.051 } 00:25:14.051 } 00:25:14.051 }' 00:25:14.051 08:38:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:14.309 08:38:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:14.309 08:38:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:25:14.309 08:38:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:14.309 08:38:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:14.309 08:38:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:14.309 08:38:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:14.309 08:38:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:14.309 08:38:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:14.309 08:38:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:14.568 08:38:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:14.568 08:38:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:14.568 08:38:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:14.568 08:38:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 
00:25:14.568 08:38:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:14.568 08:38:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:14.568 "name": "pt2", 00:25:14.568 "aliases": [ 00:25:14.568 "00000000-0000-0000-0000-000000000002" 00:25:14.568 ], 00:25:14.568 "product_name": "passthru", 00:25:14.568 "block_size": 4096, 00:25:14.568 "num_blocks": 8192, 00:25:14.568 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:14.568 "assigned_rate_limits": { 00:25:14.568 "rw_ios_per_sec": 0, 00:25:14.568 "rw_mbytes_per_sec": 0, 00:25:14.568 "r_mbytes_per_sec": 0, 00:25:14.568 "w_mbytes_per_sec": 0 00:25:14.568 }, 00:25:14.568 "claimed": true, 00:25:14.568 "claim_type": "exclusive_write", 00:25:14.568 "zoned": false, 00:25:14.568 "supported_io_types": { 00:25:14.568 "read": true, 00:25:14.568 "write": true, 00:25:14.568 "unmap": true, 00:25:14.568 "flush": true, 00:25:14.568 "reset": true, 00:25:14.568 "nvme_admin": false, 00:25:14.568 "nvme_io": false, 00:25:14.568 "nvme_io_md": false, 00:25:14.568 "write_zeroes": true, 00:25:14.568 "zcopy": true, 00:25:14.568 "get_zone_info": false, 00:25:14.568 "zone_management": false, 00:25:14.568 "zone_append": false, 00:25:14.568 "compare": false, 00:25:14.568 "compare_and_write": false, 00:25:14.568 "abort": true, 00:25:14.568 "seek_hole": false, 00:25:14.568 "seek_data": false, 00:25:14.568 "copy": true, 00:25:14.568 "nvme_iov_md": false 00:25:14.568 }, 00:25:14.568 "memory_domains": [ 00:25:14.568 { 00:25:14.568 "dma_device_id": "system", 00:25:14.568 "dma_device_type": 1 00:25:14.568 }, 00:25:14.568 { 00:25:14.568 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:14.568 "dma_device_type": 2 00:25:14.568 } 00:25:14.568 ], 00:25:14.568 "driver_specific": { 00:25:14.568 "passthru": { 00:25:14.568 "name": "pt2", 00:25:14.568 "base_bdev_name": "malloc2" 00:25:14.568 } 00:25:14.568 } 00:25:14.568 }' 00:25:14.568 08:38:27 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:14.568 08:38:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:14.827 08:38:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:25:14.827 08:38:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:14.827 08:38:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:14.827 08:38:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:14.827 08:38:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:14.827 08:38:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:14.827 08:38:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:14.827 08:38:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:14.827 08:38:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:14.827 08:38:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:15.085 08:38:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:15.085 08:38:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:25:15.085 [2024-07-23 08:38:27.492397] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:15.085 08:38:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # '[' eb424fe9-63f0-4380-bb51-05eef1efc5f8 '!=' eb424fe9-63f0-4380-bb51-05eef1efc5f8 ']' 00:25:15.085 08:38:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:25:15.085 08:38:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:25:15.085 08:38:27 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:25:15.085 08:38:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:25:15.343 [2024-07-23 08:38:27.664646] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:25:15.343 08:38:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:15.343 08:38:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:15.343 08:38:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:15.343 08:38:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:15.343 08:38:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:15.343 08:38:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:15.343 08:38:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:15.343 08:38:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:15.343 08:38:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:15.343 08:38:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:15.343 08:38:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:15.343 08:38:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:15.343 08:38:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:15.343 "name": "raid_bdev1", 00:25:15.343 "uuid": 
"eb424fe9-63f0-4380-bb51-05eef1efc5f8", 00:25:15.343 "strip_size_kb": 0, 00:25:15.343 "state": "online", 00:25:15.343 "raid_level": "raid1", 00:25:15.343 "superblock": true, 00:25:15.343 "num_base_bdevs": 2, 00:25:15.343 "num_base_bdevs_discovered": 1, 00:25:15.343 "num_base_bdevs_operational": 1, 00:25:15.343 "base_bdevs_list": [ 00:25:15.343 { 00:25:15.343 "name": null, 00:25:15.343 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:15.343 "is_configured": false, 00:25:15.343 "data_offset": 256, 00:25:15.343 "data_size": 7936 00:25:15.343 }, 00:25:15.343 { 00:25:15.343 "name": "pt2", 00:25:15.343 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:15.343 "is_configured": true, 00:25:15.343 "data_offset": 256, 00:25:15.343 "data_size": 7936 00:25:15.343 } 00:25:15.343 ] 00:25:15.343 }' 00:25:15.343 08:38:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:15.343 08:38:27 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:25:15.910 08:38:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:16.168 [2024-07-23 08:38:28.490792] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:16.168 [2024-07-23 08:38:28.490826] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:16.168 [2024-07-23 08:38:28.490894] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:16.168 [2024-07-23 08:38:28.490938] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:16.168 [2024-07-23 08:38:28.490949] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036c80 name raid_bdev1, state offline 00:25:16.168 08:38:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:16.168 08:38:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:25:16.168 08:38:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:25:16.168 08:38:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:25:16.168 08:38:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:25:16.426 08:38:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:25:16.426 08:38:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:25:16.426 08:38:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:25:16.426 08:38:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:25:16.426 08:38:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:25:16.426 08:38:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:25:16.426 08:38:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@518 -- # i=1 00:25:16.426 08:38:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:25:16.684 [2024-07-23 08:38:29.000143] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:25:16.684 [2024-07-23 08:38:29.000211] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:16.684 [2024-07-23 08:38:29.000228] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000037280 00:25:16.684 [2024-07-23 08:38:29.000239] 
vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:16.684 [2024-07-23 08:38:29.002204] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:16.684 [2024-07-23 08:38:29.002234] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:25:16.684 [2024-07-23 08:38:29.002317] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:25:16.684 [2024-07-23 08:38:29.002370] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:25:16.684 [2024-07-23 08:38:29.002494] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000037880 00:25:16.684 [2024-07-23 08:38:29.002505] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:25:16.684 [2024-07-23 08:38:29.002738] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bf90 00:25:16.684 [2024-07-23 08:38:29.002922] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000037880 00:25:16.684 [2024-07-23 08:38:29.002932] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000037880 00:25:16.684 [2024-07-23 08:38:29.003075] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:16.684 pt2 00:25:16.684 08:38:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:16.684 08:38:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:16.684 08:38:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:16.684 08:38:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:16.684 08:38:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:16.684 08:38:29 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:16.684 08:38:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:16.684 08:38:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:16.684 08:38:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:16.684 08:38:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:16.684 08:38:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:16.684 08:38:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:16.684 08:38:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:16.684 "name": "raid_bdev1", 00:25:16.684 "uuid": "eb424fe9-63f0-4380-bb51-05eef1efc5f8", 00:25:16.684 "strip_size_kb": 0, 00:25:16.684 "state": "online", 00:25:16.684 "raid_level": "raid1", 00:25:16.684 "superblock": true, 00:25:16.684 "num_base_bdevs": 2, 00:25:16.684 "num_base_bdevs_discovered": 1, 00:25:16.684 "num_base_bdevs_operational": 1, 00:25:16.684 "base_bdevs_list": [ 00:25:16.684 { 00:25:16.684 "name": null, 00:25:16.684 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:16.684 "is_configured": false, 00:25:16.684 "data_offset": 256, 00:25:16.684 "data_size": 7936 00:25:16.684 }, 00:25:16.684 { 00:25:16.684 "name": "pt2", 00:25:16.684 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:16.684 "is_configured": true, 00:25:16.684 "data_offset": 256, 00:25:16.684 "data_size": 7936 00:25:16.684 } 00:25:16.684 ] 00:25:16.684 }' 00:25:16.684 08:38:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:16.685 08:38:29 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:25:17.250 
08:38:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:17.507 [2024-07-23 08:38:29.842391] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:17.507 [2024-07-23 08:38:29.842420] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:17.507 [2024-07-23 08:38:29.842485] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:17.507 [2024-07-23 08:38:29.842534] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:17.507 [2024-07-23 08:38:29.842544] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000037880 name raid_bdev1, state offline 00:25:17.507 08:38:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:17.507 08:38:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:25:17.766 08:38:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:25:17.766 08:38:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:25:17.766 08:38:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:25:17.766 08:38:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:25:17.766 [2024-07-23 08:38:30.171232] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:25:17.766 [2024-07-23 08:38:30.171282] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:17.766 [2024-07-23 08:38:30.171299] 
vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000037b80 00:25:17.766 [2024-07-23 08:38:30.171308] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:17.766 [2024-07-23 08:38:30.173247] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:17.766 [2024-07-23 08:38:30.173276] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:25:17.766 [2024-07-23 08:38:30.173354] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:25:17.766 [2024-07-23 08:38:30.173403] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:25:17.766 [2024-07-23 08:38:30.173563] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:25:17.766 [2024-07-23 08:38:30.173574] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:17.766 [2024-07-23 08:38:30.173590] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000038180 name raid_bdev1, state configuring 00:25:17.766 [2024-07-23 08:38:30.173656] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:25:17.766 [2024-07-23 08:38:30.173728] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000038480 00:25:17.766 [2024-07-23 08:38:30.173737] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:25:17.766 [2024-07-23 08:38:30.173961] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c060 00:25:17.766 [2024-07-23 08:38:30.174132] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000038480 00:25:17.766 [2024-07-23 08:38:30.174143] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000038480 00:25:17.766 [2024-07-23 08:38:30.174298] bdev_raid.c: 
343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:17.766 pt1 00:25:17.766 08:38:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:25:17.766 08:38:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:17.766 08:38:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:17.766 08:38:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:17.766 08:38:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:17.766 08:38:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:17.766 08:38:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:17.766 08:38:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:17.766 08:38:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:17.766 08:38:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:17.766 08:38:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:17.766 08:38:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:17.766 08:38:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:18.025 08:38:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:18.025 "name": "raid_bdev1", 00:25:18.025 "uuid": "eb424fe9-63f0-4380-bb51-05eef1efc5f8", 00:25:18.025 "strip_size_kb": 0, 00:25:18.025 "state": "online", 00:25:18.025 "raid_level": "raid1", 00:25:18.025 "superblock": true, 00:25:18.025 
"num_base_bdevs": 2, 00:25:18.025 "num_base_bdevs_discovered": 1, 00:25:18.025 "num_base_bdevs_operational": 1, 00:25:18.025 "base_bdevs_list": [ 00:25:18.025 { 00:25:18.025 "name": null, 00:25:18.025 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:18.025 "is_configured": false, 00:25:18.025 "data_offset": 256, 00:25:18.025 "data_size": 7936 00:25:18.025 }, 00:25:18.025 { 00:25:18.025 "name": "pt2", 00:25:18.025 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:18.025 "is_configured": true, 00:25:18.025 "data_offset": 256, 00:25:18.025 "data_size": 7936 00:25:18.025 } 00:25:18.025 ] 00:25:18.025 }' 00:25:18.025 08:38:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:18.025 08:38:30 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:25:18.283 08:38:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:25:18.283 08:38:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:25:18.541 08:38:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:25:18.541 08:38:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:18.541 08:38:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:25:18.799 [2024-07-23 08:38:31.113971] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:18.799 08:38:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # '[' eb424fe9-63f0-4380-bb51-05eef1efc5f8 '!=' eb424fe9-63f0-4380-bb51-05eef1efc5f8 ']' 00:25:18.799 08:38:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@562 -- # killprocess 1563601 00:25:18.799 08:38:31 
bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@948 -- # '[' -z 1563601 ']' 00:25:18.799 08:38:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@952 -- # kill -0 1563601 00:25:18.799 08:38:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # uname 00:25:18.799 08:38:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:18.799 08:38:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1563601 00:25:18.799 08:38:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:18.799 08:38:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:18.799 08:38:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1563601' 00:25:18.799 killing process with pid 1563601 00:25:18.799 08:38:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@967 -- # kill 1563601 00:25:18.799 [2024-07-23 08:38:31.172548] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:18.799 08:38:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@972 -- # wait 1563601 00:25:18.799 [2024-07-23 08:38:31.172637] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:18.799 [2024-07-23 08:38:31.172686] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:18.799 [2024-07-23 08:38:31.172698] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000038480 name raid_bdev1, state offline 00:25:18.799 [2024-07-23 08:38:31.313270] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:20.174 08:38:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@564 -- # return 0 00:25:20.174 00:25:20.174 real 0m13.094s 00:25:20.174 user 0m22.769s 00:25:20.174 sys 0m2.033s 
00:25:20.174 08:38:32 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:20.174 08:38:32 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:25:20.174 ************************************ 00:25:20.174 END TEST raid_superblock_test_4k 00:25:20.174 ************************************ 00:25:20.174 08:38:32 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:20.174 08:38:32 bdev_raid -- bdev/bdev_raid.sh@900 -- # '[' true = true ']' 00:25:20.174 08:38:32 bdev_raid -- bdev/bdev_raid.sh@901 -- # run_test raid_rebuild_test_sb_4k raid_rebuild_test raid1 2 true false true 00:25:20.174 08:38:32 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:25:20.174 08:38:32 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:20.174 08:38:32 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:20.174 ************************************ 00:25:20.174 START TEST raid_rebuild_test_sb_4k 00:25:20.174 ************************************ 00:25:20.174 08:38:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:25:20.174 08:38:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:25:20.174 08:38:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:25:20.174 08:38:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:25:20.174 08:38:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:25:20.174 08:38:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@572 -- # local verify=true 00:25:20.174 08:38:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:25:20.174 08:38:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:20.174 08:38:32 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:25:20.174 08:38:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:20.174 08:38:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:20.174 08:38:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:25:20.174 08:38:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:20.174 08:38:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:20.174 08:38:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:25:20.174 08:38:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:25:20.174 08:38:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:25:20.175 08:38:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # local strip_size 00:25:20.175 08:38:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@576 -- # local create_arg 00:25:20.175 08:38:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:25:20.175 08:38:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@578 -- # local data_offset 00:25:20.175 08:38:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:25:20.175 08:38:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:25:20.175 08:38:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:25:20.175 08:38:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:25:20.175 08:38:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@596 -- # raid_pid=1566325 00:25:20.175 08:38:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@597 -- # waitforlisten 1566325 /var/tmp/spdk-raid.sock 00:25:20.175 08:38:32 
bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 1566325 ']' 00:25:20.175 08:38:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:20.175 08:38:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:20.175 08:38:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:20.175 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:20.175 08:38:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:25:20.175 08:38:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:20.175 08:38:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:20.434 [2024-07-23 08:38:32.753373] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:25:20.434 [2024-07-23 08:38:32.753467] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1566325 ] 00:25:20.434 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:20.434 Zero copy mechanism will not be used. 
00:25:20.434 [2024-07-23 08:38:32.876628] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:20.693 [2024-07-23 08:38:33.086773] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:20.959 [2024-07-23 08:38:33.348583] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:20.959 [2024-07-23 08:38:33.348624] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:21.217 08:38:33 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:21.217 08:38:33 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:25:21.217 08:38:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:21.217 08:38:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1_malloc 00:25:21.217 BaseBdev1_malloc 00:25:21.217 08:38:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:21.475 [2024-07-23 08:38:33.880508] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:21.475 [2024-07-23 08:38:33.880566] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:21.475 [2024-07-23 08:38:33.880588] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034880 00:25:21.475 [2024-07-23 08:38:33.880602] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:21.475 [2024-07-23 08:38:33.882621] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:21.475 [2024-07-23 08:38:33.882653] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 
00:25:21.475 BaseBdev1 00:25:21.475 08:38:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:21.475 08:38:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2_malloc 00:25:21.733 BaseBdev2_malloc 00:25:21.733 08:38:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:21.733 [2024-07-23 08:38:34.245502] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:21.733 [2024-07-23 08:38:34.245555] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:21.733 [2024-07-23 08:38:34.245574] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035480 00:25:21.733 [2024-07-23 08:38:34.245590] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:21.733 [2024-07-23 08:38:34.247561] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:21.733 [2024-07-23 08:38:34.247589] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:21.733 BaseBdev2 00:25:22.020 08:38:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b spare_malloc 00:25:22.020 spare_malloc 00:25:22.020 08:38:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:22.279 spare_delay 00:25:22.279 08:38:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:22.279 [2024-07-23 08:38:34.782312] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:22.279 [2024-07-23 08:38:34.782364] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:22.279 [2024-07-23 08:38:34.782385] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036680 00:25:22.279 [2024-07-23 08:38:34.782396] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:22.279 [2024-07-23 08:38:34.784400] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:22.279 [2024-07-23 08:38:34.784432] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:22.279 spare 00:25:22.279 08:38:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:25:22.538 [2024-07-23 08:38:34.950787] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:22.538 [2024-07-23 08:38:34.952376] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:22.538 [2024-07-23 08:38:34.952581] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000036c80 00:25:22.538 [2024-07-23 08:38:34.952598] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:25:22.538 [2024-07-23 08:38:34.952865] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bec0 00:25:22.538 [2024-07-23 08:38:34.953083] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000036c80 00:25:22.538 [2024-07-23 08:38:34.953094] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created 
with name raid_bdev1, raid_bdev 0x616000036c80 00:25:22.538 [2024-07-23 08:38:34.953273] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:22.538 08:38:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:22.538 08:38:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:22.538 08:38:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:22.538 08:38:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:22.538 08:38:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:22.538 08:38:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:22.538 08:38:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:22.538 08:38:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:22.538 08:38:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:22.538 08:38:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:22.538 08:38:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:22.538 08:38:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:22.796 08:38:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:22.796 "name": "raid_bdev1", 00:25:22.796 "uuid": "1573d3a0-0940-43c3-8cff-cc5fba27fbd1", 00:25:22.796 "strip_size_kb": 0, 00:25:22.796 "state": "online", 00:25:22.796 "raid_level": "raid1", 00:25:22.796 "superblock": true, 00:25:22.796 "num_base_bdevs": 2, 
00:25:22.796 "num_base_bdevs_discovered": 2, 00:25:22.796 "num_base_bdevs_operational": 2, 00:25:22.796 "base_bdevs_list": [ 00:25:22.796 { 00:25:22.796 "name": "BaseBdev1", 00:25:22.796 "uuid": "70a47675-3af7-51be-be44-d030b6cec5a4", 00:25:22.796 "is_configured": true, 00:25:22.796 "data_offset": 256, 00:25:22.796 "data_size": 7936 00:25:22.797 }, 00:25:22.797 { 00:25:22.797 "name": "BaseBdev2", 00:25:22.797 "uuid": "709abb1a-a713-531a-8c70-464272cf24f1", 00:25:22.797 "is_configured": true, 00:25:22.797 "data_offset": 256, 00:25:22.797 "data_size": 7936 00:25:22.797 } 00:25:22.797 ] 00:25:22.797 }' 00:25:22.797 08:38:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:22.797 08:38:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:23.363 08:38:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:23.363 08:38:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:25:23.363 [2024-07-23 08:38:35.753120] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:23.363 08:38:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:25:23.363 08:38:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:23.363 08:38:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:23.621 08:38:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:25:23.621 08:38:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:25:23.621 08:38:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 
00:25:23.621 08:38:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:25:23.621 08:38:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:25:23.621 08:38:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:23.621 08:38:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:25:23.621 08:38:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:23.621 08:38:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:25:23.621 08:38:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:23.621 08:38:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:25:23.621 08:38:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:23.621 08:38:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:23.621 08:38:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:25:23.621 [2024-07-23 08:38:36.085790] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c060 00:25:23.621 /dev/nbd0 00:25:23.621 08:38:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:23.621 08:38:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:23.621 08:38:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:23.622 08:38:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:25:23.622 08:38:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:23.622 
08:38:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:23.622 08:38:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:23.880 08:38:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:25:23.880 08:38:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:23.880 08:38:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:23.880 08:38:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:23.880 1+0 records in 00:25:23.880 1+0 records out 00:25:23.880 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000238049 s, 17.2 MB/s 00:25:23.880 08:38:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:23.880 08:38:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:25:23.880 08:38:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:23.880 08:38:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:23.880 08:38:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:25:23.880 08:38:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:23.880 08:38:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:23.880 08:38:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:25:23.880 08:38:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:25:23.880 08:38:36 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:25:24.447 7936+0 records in 00:25:24.447 7936+0 records out 00:25:24.447 32505856 bytes (33 MB, 31 MiB) copied, 0.601942 s, 54.0 MB/s 00:25:24.447 08:38:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:24.447 08:38:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:24.447 08:38:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:24.447 08:38:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:24.447 08:38:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:25:24.447 08:38:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:24.447 08:38:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:24.447 08:38:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:24.447 [2024-07-23 08:38:36.953277] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:24.447 08:38:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:24.447 08:38:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:24.447 08:38:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:24.447 08:38:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:24.447 08:38:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:24.447 08:38:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:25:24.447 08:38:36 bdev_raid.raid_rebuild_test_sb_4k 
-- bdev/nbd_common.sh@45 -- # return 0 00:25:24.447 08:38:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:24.706 [2024-07-23 08:38:37.117779] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:24.706 08:38:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:24.706 08:38:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:24.706 08:38:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:24.706 08:38:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:24.706 08:38:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:24.706 08:38:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:24.706 08:38:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:24.706 08:38:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:24.706 08:38:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:24.706 08:38:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:24.706 08:38:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:24.706 08:38:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:24.965 08:38:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:24.965 "name": "raid_bdev1", 00:25:24.965 "uuid": 
"1573d3a0-0940-43c3-8cff-cc5fba27fbd1", 00:25:24.965 "strip_size_kb": 0, 00:25:24.965 "state": "online", 00:25:24.965 "raid_level": "raid1", 00:25:24.965 "superblock": true, 00:25:24.965 "num_base_bdevs": 2, 00:25:24.965 "num_base_bdevs_discovered": 1, 00:25:24.965 "num_base_bdevs_operational": 1, 00:25:24.965 "base_bdevs_list": [ 00:25:24.965 { 00:25:24.965 "name": null, 00:25:24.965 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:24.965 "is_configured": false, 00:25:24.965 "data_offset": 256, 00:25:24.965 "data_size": 7936 00:25:24.965 }, 00:25:24.965 { 00:25:24.965 "name": "BaseBdev2", 00:25:24.965 "uuid": "709abb1a-a713-531a-8c70-464272cf24f1", 00:25:24.965 "is_configured": true, 00:25:24.965 "data_offset": 256, 00:25:24.965 "data_size": 7936 00:25:24.965 } 00:25:24.965 ] 00:25:24.965 }' 00:25:24.965 08:38:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:24.965 08:38:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:25.532 08:38:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:25.532 [2024-07-23 08:38:37.980073] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:25.532 [2024-07-23 08:38:37.996402] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0001a2c80 00:25:25.532 [2024-07-23 08:38:37.998002] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:25.532 08:38:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@646 -- # sleep 1 00:25:26.910 08:38:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:26.910 08:38:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:26.910 08:38:39 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:26.910 08:38:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:26.910 08:38:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:26.910 08:38:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:26.910 08:38:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:26.910 08:38:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:26.910 "name": "raid_bdev1", 00:25:26.910 "uuid": "1573d3a0-0940-43c3-8cff-cc5fba27fbd1", 00:25:26.910 "strip_size_kb": 0, 00:25:26.910 "state": "online", 00:25:26.910 "raid_level": "raid1", 00:25:26.910 "superblock": true, 00:25:26.910 "num_base_bdevs": 2, 00:25:26.910 "num_base_bdevs_discovered": 2, 00:25:26.910 "num_base_bdevs_operational": 2, 00:25:26.910 "process": { 00:25:26.910 "type": "rebuild", 00:25:26.910 "target": "spare", 00:25:26.910 "progress": { 00:25:26.910 "blocks": 2816, 00:25:26.910 "percent": 35 00:25:26.910 } 00:25:26.910 }, 00:25:26.910 "base_bdevs_list": [ 00:25:26.910 { 00:25:26.910 "name": "spare", 00:25:26.910 "uuid": "2e3b1c90-96da-54d5-a776-ea2fbbe4583f", 00:25:26.910 "is_configured": true, 00:25:26.910 "data_offset": 256, 00:25:26.910 "data_size": 7936 00:25:26.910 }, 00:25:26.910 { 00:25:26.910 "name": "BaseBdev2", 00:25:26.910 "uuid": "709abb1a-a713-531a-8c70-464272cf24f1", 00:25:26.910 "is_configured": true, 00:25:26.910 "data_offset": 256, 00:25:26.910 "data_size": 7936 00:25:26.910 } 00:25:26.910 ] 00:25:26.910 }' 00:25:26.910 08:38:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:26.910 08:38:39 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:26.910 08:38:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:26.910 08:38:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:26.910 08:38:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:26.910 [2024-07-23 08:38:39.427472] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:27.169 [2024-07-23 08:38:39.509623] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:27.169 [2024-07-23 08:38:39.509672] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:27.169 [2024-07-23 08:38:39.509703] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:27.169 [2024-07-23 08:38:39.509716] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:27.169 08:38:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:27.169 08:38:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:27.169 08:38:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:27.169 08:38:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:27.169 08:38:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:27.169 08:38:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:27.169 08:38:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:27.169 08:38:39 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:27.170 08:38:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:27.170 08:38:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:27.170 08:38:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:27.170 08:38:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:27.429 08:38:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:27.429 "name": "raid_bdev1", 00:25:27.429 "uuid": "1573d3a0-0940-43c3-8cff-cc5fba27fbd1", 00:25:27.429 "strip_size_kb": 0, 00:25:27.429 "state": "online", 00:25:27.429 "raid_level": "raid1", 00:25:27.429 "superblock": true, 00:25:27.429 "num_base_bdevs": 2, 00:25:27.429 "num_base_bdevs_discovered": 1, 00:25:27.429 "num_base_bdevs_operational": 1, 00:25:27.429 "base_bdevs_list": [ 00:25:27.429 { 00:25:27.429 "name": null, 00:25:27.429 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:27.429 "is_configured": false, 00:25:27.429 "data_offset": 256, 00:25:27.429 "data_size": 7936 00:25:27.429 }, 00:25:27.429 { 00:25:27.429 "name": "BaseBdev2", 00:25:27.429 "uuid": "709abb1a-a713-531a-8c70-464272cf24f1", 00:25:27.429 "is_configured": true, 00:25:27.429 "data_offset": 256, 00:25:27.429 "data_size": 7936 00:25:27.429 } 00:25:27.429 ] 00:25:27.429 }' 00:25:27.429 08:38:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:27.429 08:38:39 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:27.997 08:38:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:27.997 08:38:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local 
raid_bdev_name=raid_bdev1 00:25:27.997 08:38:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:27.997 08:38:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:27.997 08:38:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:27.997 08:38:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:27.997 08:38:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:27.997 08:38:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:27.997 "name": "raid_bdev1", 00:25:27.997 "uuid": "1573d3a0-0940-43c3-8cff-cc5fba27fbd1", 00:25:27.997 "strip_size_kb": 0, 00:25:27.997 "state": "online", 00:25:27.997 "raid_level": "raid1", 00:25:27.997 "superblock": true, 00:25:27.997 "num_base_bdevs": 2, 00:25:27.997 "num_base_bdevs_discovered": 1, 00:25:27.997 "num_base_bdevs_operational": 1, 00:25:27.997 "base_bdevs_list": [ 00:25:27.997 { 00:25:27.997 "name": null, 00:25:27.997 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:27.997 "is_configured": false, 00:25:27.997 "data_offset": 256, 00:25:27.997 "data_size": 7936 00:25:27.997 }, 00:25:27.997 { 00:25:27.997 "name": "BaseBdev2", 00:25:27.997 "uuid": "709abb1a-a713-531a-8c70-464272cf24f1", 00:25:27.997 "is_configured": true, 00:25:27.997 "data_offset": 256, 00:25:27.997 "data_size": 7936 00:25:27.997 } 00:25:27.997 ] 00:25:27.997 }' 00:25:27.997 08:38:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:27.997 08:38:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:27.997 08:38:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 
00:25:27.997 08:38:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:27.997 08:38:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:28.256 [2024-07-23 08:38:40.639036] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:28.256 [2024-07-23 08:38:40.657841] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0001a2d50 00:25:28.256 [2024-07-23 08:38:40.659446] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:28.256 08:38:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:29.193 08:38:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:29.193 08:38:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:29.193 08:38:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:29.193 08:38:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:29.193 08:38:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:29.193 08:38:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:29.193 08:38:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:29.453 08:38:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:29.453 "name": "raid_bdev1", 00:25:29.453 "uuid": "1573d3a0-0940-43c3-8cff-cc5fba27fbd1", 00:25:29.453 "strip_size_kb": 0, 00:25:29.453 "state": "online", 00:25:29.453 
"raid_level": "raid1", 00:25:29.453 "superblock": true, 00:25:29.453 "num_base_bdevs": 2, 00:25:29.453 "num_base_bdevs_discovered": 2, 00:25:29.453 "num_base_bdevs_operational": 2, 00:25:29.453 "process": { 00:25:29.453 "type": "rebuild", 00:25:29.453 "target": "spare", 00:25:29.453 "progress": { 00:25:29.453 "blocks": 2816, 00:25:29.453 "percent": 35 00:25:29.453 } 00:25:29.453 }, 00:25:29.453 "base_bdevs_list": [ 00:25:29.453 { 00:25:29.453 "name": "spare", 00:25:29.453 "uuid": "2e3b1c90-96da-54d5-a776-ea2fbbe4583f", 00:25:29.453 "is_configured": true, 00:25:29.453 "data_offset": 256, 00:25:29.453 "data_size": 7936 00:25:29.453 }, 00:25:29.453 { 00:25:29.453 "name": "BaseBdev2", 00:25:29.453 "uuid": "709abb1a-a713-531a-8c70-464272cf24f1", 00:25:29.453 "is_configured": true, 00:25:29.453 "data_offset": 256, 00:25:29.453 "data_size": 7936 00:25:29.453 } 00:25:29.453 ] 00:25:29.453 }' 00:25:29.453 08:38:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:29.453 08:38:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:29.453 08:38:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:29.453 08:38:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:29.453 08:38:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:25:29.453 08:38:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:25:29.453 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:25:29.453 08:38:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:25:29.453 08:38:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:25:29.453 08:38:41 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:25:29.453 08:38:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@705 -- # local timeout=874 00:25:29.453 08:38:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:29.453 08:38:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:29.453 08:38:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:29.453 08:38:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:29.453 08:38:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:29.453 08:38:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:29.453 08:38:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:29.453 08:38:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:29.712 08:38:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:29.712 "name": "raid_bdev1", 00:25:29.712 "uuid": "1573d3a0-0940-43c3-8cff-cc5fba27fbd1", 00:25:29.712 "strip_size_kb": 0, 00:25:29.712 "state": "online", 00:25:29.712 "raid_level": "raid1", 00:25:29.712 "superblock": true, 00:25:29.712 "num_base_bdevs": 2, 00:25:29.712 "num_base_bdevs_discovered": 2, 00:25:29.712 "num_base_bdevs_operational": 2, 00:25:29.712 "process": { 00:25:29.712 "type": "rebuild", 00:25:29.712 "target": "spare", 00:25:29.712 "progress": { 00:25:29.712 "blocks": 3584, 00:25:29.712 "percent": 45 00:25:29.712 } 00:25:29.712 }, 00:25:29.712 "base_bdevs_list": [ 00:25:29.712 { 00:25:29.712 "name": "spare", 00:25:29.712 "uuid": "2e3b1c90-96da-54d5-a776-ea2fbbe4583f", 00:25:29.712 "is_configured": 
true, 00:25:29.712 "data_offset": 256, 00:25:29.712 "data_size": 7936 00:25:29.712 }, 00:25:29.712 { 00:25:29.712 "name": "BaseBdev2", 00:25:29.712 "uuid": "709abb1a-a713-531a-8c70-464272cf24f1", 00:25:29.712 "is_configured": true, 00:25:29.712 "data_offset": 256, 00:25:29.712 "data_size": 7936 00:25:29.712 } 00:25:29.712 ] 00:25:29.712 }' 00:25:29.712 08:38:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:29.712 08:38:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:29.712 08:38:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:29.712 08:38:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:29.712 08:38:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:31.091 08:38:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:31.091 08:38:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:31.091 08:38:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:31.091 08:38:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:31.091 08:38:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:31.091 08:38:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:31.091 08:38:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:31.091 08:38:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:31.091 08:38:43 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:31.091 "name": "raid_bdev1", 00:25:31.091 "uuid": "1573d3a0-0940-43c3-8cff-cc5fba27fbd1", 00:25:31.091 "strip_size_kb": 0, 00:25:31.091 "state": "online", 00:25:31.091 "raid_level": "raid1", 00:25:31.091 "superblock": true, 00:25:31.091 "num_base_bdevs": 2, 00:25:31.091 "num_base_bdevs_discovered": 2, 00:25:31.091 "num_base_bdevs_operational": 2, 00:25:31.091 "process": { 00:25:31.091 "type": "rebuild", 00:25:31.091 "target": "spare", 00:25:31.091 "progress": { 00:25:31.091 "blocks": 6656, 00:25:31.091 "percent": 83 00:25:31.091 } 00:25:31.091 }, 00:25:31.091 "base_bdevs_list": [ 00:25:31.091 { 00:25:31.091 "name": "spare", 00:25:31.091 "uuid": "2e3b1c90-96da-54d5-a776-ea2fbbe4583f", 00:25:31.091 "is_configured": true, 00:25:31.091 "data_offset": 256, 00:25:31.091 "data_size": 7936 00:25:31.091 }, 00:25:31.091 { 00:25:31.091 "name": "BaseBdev2", 00:25:31.091 "uuid": "709abb1a-a713-531a-8c70-464272cf24f1", 00:25:31.091 "is_configured": true, 00:25:31.091 "data_offset": 256, 00:25:31.091 "data_size": 7936 00:25:31.091 } 00:25:31.091 ] 00:25:31.091 }' 00:25:31.091 08:38:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:31.091 08:38:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:31.091 08:38:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:31.091 08:38:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:31.091 08:38:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:31.350 [2024-07-23 08:38:43.783658] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:31.350 [2024-07-23 08:38:43.783717] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:31.350 [2024-07-23 08:38:43.783796] 
bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:32.286 08:38:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:32.286 08:38:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:32.286 08:38:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:32.286 08:38:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:32.286 08:38:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:32.286 08:38:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:32.286 08:38:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:32.286 08:38:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:32.286 08:38:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:32.286 "name": "raid_bdev1", 00:25:32.286 "uuid": "1573d3a0-0940-43c3-8cff-cc5fba27fbd1", 00:25:32.286 "strip_size_kb": 0, 00:25:32.286 "state": "online", 00:25:32.286 "raid_level": "raid1", 00:25:32.286 "superblock": true, 00:25:32.286 "num_base_bdevs": 2, 00:25:32.286 "num_base_bdevs_discovered": 2, 00:25:32.286 "num_base_bdevs_operational": 2, 00:25:32.286 "base_bdevs_list": [ 00:25:32.286 { 00:25:32.286 "name": "spare", 00:25:32.286 "uuid": "2e3b1c90-96da-54d5-a776-ea2fbbe4583f", 00:25:32.286 "is_configured": true, 00:25:32.286 "data_offset": 256, 00:25:32.286 "data_size": 7936 00:25:32.286 }, 00:25:32.286 { 00:25:32.286 "name": "BaseBdev2", 00:25:32.286 "uuid": "709abb1a-a713-531a-8c70-464272cf24f1", 00:25:32.286 "is_configured": true, 00:25:32.286 "data_offset": 256, 00:25:32.286 
"data_size": 7936 00:25:32.286 } 00:25:32.287 ] 00:25:32.287 }' 00:25:32.287 08:38:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:32.287 08:38:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:32.287 08:38:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:32.287 08:38:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:32.287 08:38:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@708 -- # break 00:25:32.287 08:38:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:32.287 08:38:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:32.287 08:38:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:32.287 08:38:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:32.287 08:38:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:32.287 08:38:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:32.287 08:38:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:32.546 08:38:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:32.546 "name": "raid_bdev1", 00:25:32.546 "uuid": "1573d3a0-0940-43c3-8cff-cc5fba27fbd1", 00:25:32.546 "strip_size_kb": 0, 00:25:32.546 "state": "online", 00:25:32.546 "raid_level": "raid1", 00:25:32.546 "superblock": true, 00:25:32.546 "num_base_bdevs": 2, 00:25:32.546 "num_base_bdevs_discovered": 2, 00:25:32.546 "num_base_bdevs_operational": 2, 00:25:32.546 
"base_bdevs_list": [ 00:25:32.546 { 00:25:32.546 "name": "spare", 00:25:32.546 "uuid": "2e3b1c90-96da-54d5-a776-ea2fbbe4583f", 00:25:32.546 "is_configured": true, 00:25:32.546 "data_offset": 256, 00:25:32.546 "data_size": 7936 00:25:32.546 }, 00:25:32.546 { 00:25:32.546 "name": "BaseBdev2", 00:25:32.546 "uuid": "709abb1a-a713-531a-8c70-464272cf24f1", 00:25:32.546 "is_configured": true, 00:25:32.546 "data_offset": 256, 00:25:32.546 "data_size": 7936 00:25:32.546 } 00:25:32.546 ] 00:25:32.546 }' 00:25:32.546 08:38:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:32.546 08:38:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:32.546 08:38:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:32.546 08:38:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:32.546 08:38:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:32.546 08:38:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:32.546 08:38:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:32.546 08:38:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:32.546 08:38:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:32.546 08:38:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:32.546 08:38:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:32.546 08:38:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:32.546 08:38:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:25:32.546 08:38:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:32.546 08:38:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:32.546 08:38:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:32.805 08:38:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:32.805 "name": "raid_bdev1", 00:25:32.805 "uuid": "1573d3a0-0940-43c3-8cff-cc5fba27fbd1", 00:25:32.805 "strip_size_kb": 0, 00:25:32.805 "state": "online", 00:25:32.805 "raid_level": "raid1", 00:25:32.805 "superblock": true, 00:25:32.805 "num_base_bdevs": 2, 00:25:32.805 "num_base_bdevs_discovered": 2, 00:25:32.805 "num_base_bdevs_operational": 2, 00:25:32.805 "base_bdevs_list": [ 00:25:32.805 { 00:25:32.805 "name": "spare", 00:25:32.805 "uuid": "2e3b1c90-96da-54d5-a776-ea2fbbe4583f", 00:25:32.805 "is_configured": true, 00:25:32.805 "data_offset": 256, 00:25:32.805 "data_size": 7936 00:25:32.805 }, 00:25:32.805 { 00:25:32.805 "name": "BaseBdev2", 00:25:32.805 "uuid": "709abb1a-a713-531a-8c70-464272cf24f1", 00:25:32.805 "is_configured": true, 00:25:32.805 "data_offset": 256, 00:25:32.805 "data_size": 7936 00:25:32.805 } 00:25:32.805 ] 00:25:32.805 }' 00:25:32.805 08:38:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:32.805 08:38:45 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:33.374 08:38:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:33.374 [2024-07-23 08:38:45.827977] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:33.374 [2024-07-23 08:38:45.828012] bdev_raid.c:1870:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:25:33.374 [2024-07-23 08:38:45.828091] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:33.374 [2024-07-23 08:38:45.828161] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:33.374 [2024-07-23 08:38:45.828177] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036c80 name raid_bdev1, state offline 00:25:33.374 08:38:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:33.374 08:38:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # jq length 00:25:33.633 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:25:33.633 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:25:33.633 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:25:33.633 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:25:33.633 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:33.633 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:25:33.633 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:33.633 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:33.633 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:33.633 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:25:33.633 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # 
(( i = 0 )) 00:25:33.633 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:33.633 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:25:33.893 /dev/nbd0 00:25:33.893 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:33.893 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:33.893 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:33.893 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:25:33.893 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:33.893 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:33.893 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:33.893 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:25:33.893 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:33.893 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:33.893 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:33.893 1+0 records in 00:25:33.893 1+0 records out 00:25:33.893 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000219556 s, 18.7 MB/s 00:25:33.893 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:33.893 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@884 -- # size=4096 00:25:33.893 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:33.893 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:33.893 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:25:33.893 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:33.893 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:33.893 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:25:34.153 /dev/nbd1 00:25:34.153 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:34.153 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:34.153 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:34.153 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:25:34.153 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:34.153 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:34.153 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:34.153 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:25:34.153 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:34.153 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:34.153 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:34.153 1+0 records in 00:25:34.153 1+0 records out 00:25:34.153 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000242938 s, 16.9 MB/s 00:25:34.153 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:34.153 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:25:34.153 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:34.153 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:34.153 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:25:34.153 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:34.153 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:34.153 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:25:34.153 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:25:34.153 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:34.153 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:34.153 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:34.153 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:25:34.153 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:34.153 08:38:46 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:34.412 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:34.412 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:34.412 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:34.412 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:34.412 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:34.412 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:34.412 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:25:34.413 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:25:34.413 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:34.413 08:38:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:34.672 08:38:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:34.672 08:38:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:34.672 08:38:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:34.672 08:38:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:34.672 08:38:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:34.672 08:38:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:34.672 08:38:47 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:25:34.672 08:38:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:25:34.672 08:38:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:25:34.672 08:38:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:34.931 08:38:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:34.931 [2024-07-23 08:38:47.354911] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:34.931 [2024-07-23 08:38:47.354971] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:34.931 [2024-07-23 08:38:47.354996] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000038480 00:25:34.931 [2024-07-23 08:38:47.355008] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:34.931 [2024-07-23 08:38:47.357346] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:34.931 [2024-07-23 08:38:47.357378] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:34.931 [2024-07-23 08:38:47.357481] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:34.931 [2024-07-23 08:38:47.357536] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:34.931 [2024-07-23 08:38:47.357747] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:34.931 spare 00:25:34.931 08:38:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:34.931 08:38:47 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:34.931 08:38:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:34.931 08:38:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:34.931 08:38:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:34.931 08:38:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:34.931 08:38:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:34.931 08:38:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:34.931 08:38:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:34.931 08:38:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:34.931 08:38:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:34.931 08:38:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:35.197 [2024-07-23 08:38:47.458092] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000038a80 00:25:35.197 [2024-07-23 08:38:47.458133] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:25:35.197 [2024-07-23 08:38:47.458415] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0001c7c00 00:25:35.197 [2024-07-23 08:38:47.458634] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000038a80 00:25:35.197 [2024-07-23 08:38:47.458645] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000038a80 00:25:35.197 [2024-07-23 
08:38:47.458818] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:35.197 08:38:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:35.197 "name": "raid_bdev1", 00:25:35.197 "uuid": "1573d3a0-0940-43c3-8cff-cc5fba27fbd1", 00:25:35.197 "strip_size_kb": 0, 00:25:35.197 "state": "online", 00:25:35.197 "raid_level": "raid1", 00:25:35.197 "superblock": true, 00:25:35.197 "num_base_bdevs": 2, 00:25:35.197 "num_base_bdevs_discovered": 2, 00:25:35.197 "num_base_bdevs_operational": 2, 00:25:35.197 "base_bdevs_list": [ 00:25:35.197 { 00:25:35.197 "name": "spare", 00:25:35.197 "uuid": "2e3b1c90-96da-54d5-a776-ea2fbbe4583f", 00:25:35.197 "is_configured": true, 00:25:35.197 "data_offset": 256, 00:25:35.197 "data_size": 7936 00:25:35.197 }, 00:25:35.197 { 00:25:35.197 "name": "BaseBdev2", 00:25:35.197 "uuid": "709abb1a-a713-531a-8c70-464272cf24f1", 00:25:35.197 "is_configured": true, 00:25:35.197 "data_offset": 256, 00:25:35.197 "data_size": 7936 00:25:35.197 } 00:25:35.197 ] 00:25:35.197 }' 00:25:35.197 08:38:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:35.197 08:38:47 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:35.765 08:38:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:35.765 08:38:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:35.765 08:38:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:35.765 08:38:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:35.765 08:38:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:35.765 08:38:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:35.765 08:38:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:35.765 08:38:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:35.765 "name": "raid_bdev1", 00:25:35.765 "uuid": "1573d3a0-0940-43c3-8cff-cc5fba27fbd1", 00:25:35.765 "strip_size_kb": 0, 00:25:35.765 "state": "online", 00:25:35.766 "raid_level": "raid1", 00:25:35.766 "superblock": true, 00:25:35.766 "num_base_bdevs": 2, 00:25:35.766 "num_base_bdevs_discovered": 2, 00:25:35.766 "num_base_bdevs_operational": 2, 00:25:35.766 "base_bdevs_list": [ 00:25:35.766 { 00:25:35.766 "name": "spare", 00:25:35.766 "uuid": "2e3b1c90-96da-54d5-a776-ea2fbbe4583f", 00:25:35.766 "is_configured": true, 00:25:35.766 "data_offset": 256, 00:25:35.766 "data_size": 7936 00:25:35.766 }, 00:25:35.766 { 00:25:35.766 "name": "BaseBdev2", 00:25:35.766 "uuid": "709abb1a-a713-531a-8c70-464272cf24f1", 00:25:35.766 "is_configured": true, 00:25:35.766 "data_offset": 256, 00:25:35.766 "data_size": 7936 00:25:35.766 } 00:25:35.766 ] 00:25:35.766 }' 00:25:35.766 08:38:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:35.766 08:38:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:35.766 08:38:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:36.024 08:38:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:36.024 08:38:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:36.024 08:38:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:25:36.024 08:38:48 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:25:36.024 08:38:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:36.283 [2024-07-23 08:38:48.630351] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:36.283 08:38:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:36.283 08:38:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:36.283 08:38:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:36.283 08:38:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:36.283 08:38:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:36.283 08:38:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:36.283 08:38:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:36.283 08:38:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:36.283 08:38:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:36.283 08:38:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:36.283 08:38:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:36.283 08:38:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:36.542 08:38:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:36.542 "name": "raid_bdev1", 00:25:36.542 "uuid": 
"1573d3a0-0940-43c3-8cff-cc5fba27fbd1", 00:25:36.542 "strip_size_kb": 0, 00:25:36.542 "state": "online", 00:25:36.542 "raid_level": "raid1", 00:25:36.542 "superblock": true, 00:25:36.542 "num_base_bdevs": 2, 00:25:36.542 "num_base_bdevs_discovered": 1, 00:25:36.542 "num_base_bdevs_operational": 1, 00:25:36.542 "base_bdevs_list": [ 00:25:36.542 { 00:25:36.542 "name": null, 00:25:36.542 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:36.542 "is_configured": false, 00:25:36.542 "data_offset": 256, 00:25:36.542 "data_size": 7936 00:25:36.542 }, 00:25:36.542 { 00:25:36.542 "name": "BaseBdev2", 00:25:36.542 "uuid": "709abb1a-a713-531a-8c70-464272cf24f1", 00:25:36.542 "is_configured": true, 00:25:36.542 "data_offset": 256, 00:25:36.542 "data_size": 7936 00:25:36.542 } 00:25:36.542 ] 00:25:36.542 }' 00:25:36.542 08:38:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:36.542 08:38:48 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:36.803 08:38:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:37.093 [2024-07-23 08:38:49.412437] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:37.093 [2024-07-23 08:38:49.412632] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:25:37.093 [2024-07-23 08:38:49.412651] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:25:37.093 [2024-07-23 08:38:49.412678] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:37.093 [2024-07-23 08:38:49.429516] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0001c7cd0 00:25:37.093 [2024-07-23 08:38:49.431107] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:37.093 08:38:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@755 -- # sleep 1 00:25:38.028 08:38:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:38.028 08:38:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:38.028 08:38:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:38.028 08:38:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:38.028 08:38:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:38.028 08:38:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:38.028 08:38:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:38.287 08:38:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:38.287 "name": "raid_bdev1", 00:25:38.287 "uuid": "1573d3a0-0940-43c3-8cff-cc5fba27fbd1", 00:25:38.287 "strip_size_kb": 0, 00:25:38.287 "state": "online", 00:25:38.287 "raid_level": "raid1", 00:25:38.287 "superblock": true, 00:25:38.287 "num_base_bdevs": 2, 00:25:38.287 "num_base_bdevs_discovered": 2, 00:25:38.287 "num_base_bdevs_operational": 2, 00:25:38.287 "process": { 00:25:38.287 "type": "rebuild", 00:25:38.287 "target": "spare", 00:25:38.287 "progress": { 00:25:38.287 "blocks": 2816, 
00:25:38.287 "percent": 35 00:25:38.287 } 00:25:38.287 }, 00:25:38.287 "base_bdevs_list": [ 00:25:38.287 { 00:25:38.287 "name": "spare", 00:25:38.287 "uuid": "2e3b1c90-96da-54d5-a776-ea2fbbe4583f", 00:25:38.287 "is_configured": true, 00:25:38.287 "data_offset": 256, 00:25:38.287 "data_size": 7936 00:25:38.287 }, 00:25:38.287 { 00:25:38.287 "name": "BaseBdev2", 00:25:38.287 "uuid": "709abb1a-a713-531a-8c70-464272cf24f1", 00:25:38.287 "is_configured": true, 00:25:38.287 "data_offset": 256, 00:25:38.287 "data_size": 7936 00:25:38.287 } 00:25:38.287 ] 00:25:38.287 }' 00:25:38.287 08:38:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:38.287 08:38:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:38.287 08:38:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:38.287 08:38:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:38.287 08:38:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:38.546 [2024-07-23 08:38:50.840701] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:38.546 [2024-07-23 08:38:50.842167] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:38.546 [2024-07-23 08:38:50.842216] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:38.546 [2024-07-23 08:38:50.842231] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:38.546 [2024-07-23 08:38:50.842240] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:38.546 08:38:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online 
raid1 0 1 00:25:38.546 08:38:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:38.546 08:38:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:38.546 08:38:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:38.546 08:38:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:38.546 08:38:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:38.546 08:38:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:38.546 08:38:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:38.547 08:38:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:38.547 08:38:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:38.547 08:38:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:38.547 08:38:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:38.805 08:38:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:38.806 "name": "raid_bdev1", 00:25:38.806 "uuid": "1573d3a0-0940-43c3-8cff-cc5fba27fbd1", 00:25:38.806 "strip_size_kb": 0, 00:25:38.806 "state": "online", 00:25:38.806 "raid_level": "raid1", 00:25:38.806 "superblock": true, 00:25:38.806 "num_base_bdevs": 2, 00:25:38.806 "num_base_bdevs_discovered": 1, 00:25:38.806 "num_base_bdevs_operational": 1, 00:25:38.806 "base_bdevs_list": [ 00:25:38.806 { 00:25:38.806 "name": null, 00:25:38.806 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:38.806 "is_configured": false, 00:25:38.806 "data_offset": 
256, 00:25:38.806 "data_size": 7936 00:25:38.806 }, 00:25:38.806 { 00:25:38.806 "name": "BaseBdev2", 00:25:38.806 "uuid": "709abb1a-a713-531a-8c70-464272cf24f1", 00:25:38.806 "is_configured": true, 00:25:38.806 "data_offset": 256, 00:25:38.806 "data_size": 7936 00:25:38.806 } 00:25:38.806 ] 00:25:38.806 }' 00:25:38.806 08:38:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:38.806 08:38:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:39.373 08:38:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:39.373 [2024-07-23 08:38:51.737118] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:39.373 [2024-07-23 08:38:51.737180] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:39.373 [2024-07-23 08:38:51.737202] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000039080 00:25:39.373 [2024-07-23 08:38:51.737214] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:39.373 [2024-07-23 08:38:51.737695] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:39.373 [2024-07-23 08:38:51.737719] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:39.373 [2024-07-23 08:38:51.737804] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:39.373 [2024-07-23 08:38:51.737819] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:25:39.373 [2024-07-23 08:38:51.737829] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:25:39.373 [2024-07-23 08:38:51.737850] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:39.373 [2024-07-23 08:38:51.755987] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0001c7da0 00:25:39.373 spare 00:25:39.373 [2024-07-23 08:38:51.757600] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:39.373 08:38:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@762 -- # sleep 1 00:25:40.309 08:38:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:40.309 08:38:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:40.309 08:38:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:40.309 08:38:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:40.309 08:38:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:40.309 08:38:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:40.309 08:38:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:40.568 08:38:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:40.568 "name": "raid_bdev1", 00:25:40.568 "uuid": "1573d3a0-0940-43c3-8cff-cc5fba27fbd1", 00:25:40.568 "strip_size_kb": 0, 00:25:40.568 "state": "online", 00:25:40.568 "raid_level": "raid1", 00:25:40.568 "superblock": true, 00:25:40.568 "num_base_bdevs": 2, 00:25:40.568 "num_base_bdevs_discovered": 2, 00:25:40.568 "num_base_bdevs_operational": 2, 00:25:40.568 "process": { 00:25:40.568 "type": "rebuild", 00:25:40.568 "target": "spare", 00:25:40.568 "progress": { 00:25:40.568 
"blocks": 2816, 00:25:40.568 "percent": 35 00:25:40.568 } 00:25:40.569 }, 00:25:40.569 "base_bdevs_list": [ 00:25:40.569 { 00:25:40.569 "name": "spare", 00:25:40.569 "uuid": "2e3b1c90-96da-54d5-a776-ea2fbbe4583f", 00:25:40.569 "is_configured": true, 00:25:40.569 "data_offset": 256, 00:25:40.569 "data_size": 7936 00:25:40.569 }, 00:25:40.569 { 00:25:40.569 "name": "BaseBdev2", 00:25:40.569 "uuid": "709abb1a-a713-531a-8c70-464272cf24f1", 00:25:40.569 "is_configured": true, 00:25:40.569 "data_offset": 256, 00:25:40.569 "data_size": 7936 00:25:40.569 } 00:25:40.569 ] 00:25:40.569 }' 00:25:40.569 08:38:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:40.569 08:38:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:40.569 08:38:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:40.569 08:38:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:40.569 08:38:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:40.828 [2024-07-23 08:38:53.199465] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:40.828 [2024-07-23 08:38:53.269464] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:40.828 [2024-07-23 08:38:53.269508] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:40.828 [2024-07-23 08:38:53.269540] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:40.828 [2024-07-23 08:38:53.269547] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:40.828 08:38:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state 
raid_bdev1 online raid1 0 1 00:25:40.828 08:38:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:40.828 08:38:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:40.828 08:38:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:40.828 08:38:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:40.828 08:38:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:40.828 08:38:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:40.828 08:38:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:40.828 08:38:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:40.828 08:38:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:40.828 08:38:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:40.828 08:38:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:41.087 08:38:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:41.087 "name": "raid_bdev1", 00:25:41.087 "uuid": "1573d3a0-0940-43c3-8cff-cc5fba27fbd1", 00:25:41.087 "strip_size_kb": 0, 00:25:41.087 "state": "online", 00:25:41.087 "raid_level": "raid1", 00:25:41.087 "superblock": true, 00:25:41.087 "num_base_bdevs": 2, 00:25:41.087 "num_base_bdevs_discovered": 1, 00:25:41.087 "num_base_bdevs_operational": 1, 00:25:41.087 "base_bdevs_list": [ 00:25:41.087 { 00:25:41.087 "name": null, 00:25:41.087 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:41.087 "is_configured": false, 00:25:41.087 
"data_offset": 256, 00:25:41.087 "data_size": 7936 00:25:41.087 }, 00:25:41.087 { 00:25:41.087 "name": "BaseBdev2", 00:25:41.087 "uuid": "709abb1a-a713-531a-8c70-464272cf24f1", 00:25:41.087 "is_configured": true, 00:25:41.087 "data_offset": 256, 00:25:41.087 "data_size": 7936 00:25:41.087 } 00:25:41.087 ] 00:25:41.087 }' 00:25:41.087 08:38:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:41.087 08:38:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:41.653 08:38:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:41.653 08:38:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:41.653 08:38:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:41.653 08:38:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:41.653 08:38:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:41.653 08:38:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:41.653 08:38:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:41.653 08:38:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:41.653 "name": "raid_bdev1", 00:25:41.653 "uuid": "1573d3a0-0940-43c3-8cff-cc5fba27fbd1", 00:25:41.653 "strip_size_kb": 0, 00:25:41.653 "state": "online", 00:25:41.653 "raid_level": "raid1", 00:25:41.653 "superblock": true, 00:25:41.653 "num_base_bdevs": 2, 00:25:41.653 "num_base_bdevs_discovered": 1, 00:25:41.653 "num_base_bdevs_operational": 1, 00:25:41.653 "base_bdevs_list": [ 00:25:41.653 { 00:25:41.653 "name": null, 00:25:41.653 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:25:41.653 "is_configured": false, 00:25:41.653 "data_offset": 256, 00:25:41.653 "data_size": 7936 00:25:41.653 }, 00:25:41.653 { 00:25:41.653 "name": "BaseBdev2", 00:25:41.653 "uuid": "709abb1a-a713-531a-8c70-464272cf24f1", 00:25:41.653 "is_configured": true, 00:25:41.653 "data_offset": 256, 00:25:41.653 "data_size": 7936 00:25:41.653 } 00:25:41.653 ] 00:25:41.653 }' 00:25:41.653 08:38:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:41.912 08:38:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:41.912 08:38:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:41.912 08:38:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:41.912 08:38:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:25:41.912 08:38:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:42.171 [2024-07-23 08:38:54.560645] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:42.171 [2024-07-23 08:38:54.560695] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:42.171 [2024-07-23 08:38:54.560716] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000039680 00:25:42.171 [2024-07-23 08:38:54.560726] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:42.171 [2024-07-23 08:38:54.561184] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:42.171 [2024-07-23 08:38:54.561201] vbdev_passthru.c: 
710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:42.171 [2024-07-23 08:38:54.561279] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:25:42.171 [2024-07-23 08:38:54.561294] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:25:42.171 [2024-07-23 08:38:54.561306] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:42.171 BaseBdev1 00:25:42.171 08:38:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@773 -- # sleep 1 00:25:43.106 08:38:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:43.106 08:38:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:43.106 08:38:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:43.106 08:38:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:43.106 08:38:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:43.106 08:38:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:43.106 08:38:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:43.106 08:38:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:43.106 08:38:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:43.106 08:38:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:43.106 08:38:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:43.106 08:38:55 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:43.365 08:38:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:43.365 "name": "raid_bdev1", 00:25:43.365 "uuid": "1573d3a0-0940-43c3-8cff-cc5fba27fbd1", 00:25:43.365 "strip_size_kb": 0, 00:25:43.365 "state": "online", 00:25:43.365 "raid_level": "raid1", 00:25:43.365 "superblock": true, 00:25:43.365 "num_base_bdevs": 2, 00:25:43.365 "num_base_bdevs_discovered": 1, 00:25:43.365 "num_base_bdevs_operational": 1, 00:25:43.365 "base_bdevs_list": [ 00:25:43.365 { 00:25:43.365 "name": null, 00:25:43.365 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:43.365 "is_configured": false, 00:25:43.365 "data_offset": 256, 00:25:43.365 "data_size": 7936 00:25:43.365 }, 00:25:43.365 { 00:25:43.365 "name": "BaseBdev2", 00:25:43.365 "uuid": "709abb1a-a713-531a-8c70-464272cf24f1", 00:25:43.365 "is_configured": true, 00:25:43.365 "data_offset": 256, 00:25:43.365 "data_size": 7936 00:25:43.365 } 00:25:43.365 ] 00:25:43.365 }' 00:25:43.365 08:38:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:43.365 08:38:55 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:43.932 08:38:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:43.932 08:38:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:43.932 08:38:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:43.932 08:38:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:43.932 08:38:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:43.932 08:38:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:43.932 08:38:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:43.932 08:38:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:43.932 "name": "raid_bdev1", 00:25:43.932 "uuid": "1573d3a0-0940-43c3-8cff-cc5fba27fbd1", 00:25:43.932 "strip_size_kb": 0, 00:25:43.932 "state": "online", 00:25:43.932 "raid_level": "raid1", 00:25:43.932 "superblock": true, 00:25:43.932 "num_base_bdevs": 2, 00:25:43.932 "num_base_bdevs_discovered": 1, 00:25:43.932 "num_base_bdevs_operational": 1, 00:25:43.932 "base_bdevs_list": [ 00:25:43.932 { 00:25:43.932 "name": null, 00:25:43.932 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:43.932 "is_configured": false, 00:25:43.932 "data_offset": 256, 00:25:43.932 "data_size": 7936 00:25:43.932 }, 00:25:43.932 { 00:25:43.932 "name": "BaseBdev2", 00:25:43.932 "uuid": "709abb1a-a713-531a-8c70-464272cf24f1", 00:25:43.932 "is_configured": true, 00:25:43.932 "data_offset": 256, 00:25:43.932 "data_size": 7936 00:25:43.932 } 00:25:43.932 ] 00:25:43.932 }' 00:25:43.932 08:38:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:43.932 08:38:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:43.932 08:38:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:44.191 08:38:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:44.191 08:38:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:44.191 08:38:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@648 -- # local es=0 00:25:44.191 08:38:56 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:44.191 08:38:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:44.191 08:38:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:44.191 08:38:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:44.191 08:38:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:44.191 08:38:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:44.191 08:38:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:44.191 08:38:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:44.191 08:38:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:25:44.191 08:38:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:44.191 [2024-07-23 08:38:56.630133] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:44.191 [2024-07-23 08:38:56.630289] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:25:44.191 [2024-07-23 08:38:56.630306] bdev_raid.c:3673:raid_bdev_examine_sb: 
*DEBUG*: raid superblock does not contain this bdev's uuid 00:25:44.191 request: 00:25:44.191 { 00:25:44.191 "base_bdev": "BaseBdev1", 00:25:44.191 "raid_bdev": "raid_bdev1", 00:25:44.191 "method": "bdev_raid_add_base_bdev", 00:25:44.191 "req_id": 1 00:25:44.191 } 00:25:44.191 Got JSON-RPC error response 00:25:44.191 response: 00:25:44.191 { 00:25:44.191 "code": -22, 00:25:44.191 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:25:44.191 } 00:25:44.191 08:38:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # es=1 00:25:44.191 08:38:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:25:44.191 08:38:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:25:44.191 08:38:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:25:44.191 08:38:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@777 -- # sleep 1 00:25:45.570 08:38:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:45.570 08:38:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:45.570 08:38:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:45.570 08:38:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:45.570 08:38:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:45.570 08:38:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:45.570 08:38:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:45.570 08:38:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:45.570 08:38:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:25:45.570 08:38:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:45.570 08:38:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:45.570 08:38:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:45.570 08:38:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:45.570 "name": "raid_bdev1", 00:25:45.570 "uuid": "1573d3a0-0940-43c3-8cff-cc5fba27fbd1", 00:25:45.570 "strip_size_kb": 0, 00:25:45.570 "state": "online", 00:25:45.570 "raid_level": "raid1", 00:25:45.570 "superblock": true, 00:25:45.570 "num_base_bdevs": 2, 00:25:45.570 "num_base_bdevs_discovered": 1, 00:25:45.570 "num_base_bdevs_operational": 1, 00:25:45.570 "base_bdevs_list": [ 00:25:45.570 { 00:25:45.570 "name": null, 00:25:45.570 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:45.570 "is_configured": false, 00:25:45.570 "data_offset": 256, 00:25:45.570 "data_size": 7936 00:25:45.570 }, 00:25:45.570 { 00:25:45.570 "name": "BaseBdev2", 00:25:45.570 "uuid": "709abb1a-a713-531a-8c70-464272cf24f1", 00:25:45.570 "is_configured": true, 00:25:45.570 "data_offset": 256, 00:25:45.570 "data_size": 7936 00:25:45.570 } 00:25:45.570 ] 00:25:45.570 }' 00:25:45.570 08:38:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:45.570 08:38:57 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:45.829 08:38:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:45.829 08:38:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:45.829 08:38:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:45.829 
08:38:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:45.829 08:38:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:45.829 08:38:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:45.829 08:38:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:46.088 08:38:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:46.088 "name": "raid_bdev1", 00:25:46.088 "uuid": "1573d3a0-0940-43c3-8cff-cc5fba27fbd1", 00:25:46.088 "strip_size_kb": 0, 00:25:46.088 "state": "online", 00:25:46.088 "raid_level": "raid1", 00:25:46.088 "superblock": true, 00:25:46.088 "num_base_bdevs": 2, 00:25:46.088 "num_base_bdevs_discovered": 1, 00:25:46.088 "num_base_bdevs_operational": 1, 00:25:46.088 "base_bdevs_list": [ 00:25:46.088 { 00:25:46.088 "name": null, 00:25:46.088 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:46.088 "is_configured": false, 00:25:46.088 "data_offset": 256, 00:25:46.088 "data_size": 7936 00:25:46.088 }, 00:25:46.088 { 00:25:46.088 "name": "BaseBdev2", 00:25:46.088 "uuid": "709abb1a-a713-531a-8c70-464272cf24f1", 00:25:46.088 "is_configured": true, 00:25:46.088 "data_offset": 256, 00:25:46.088 "data_size": 7936 00:25:46.088 } 00:25:46.088 ] 00:25:46.088 }' 00:25:46.088 08:38:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:46.088 08:38:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:46.088 08:38:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:46.088 08:38:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:46.088 08:38:58 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@782 -- # killprocess 1566325 00:25:46.088 08:38:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 1566325 ']' 00:25:46.088 08:38:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 1566325 00:25:46.088 08:38:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:25:46.347 08:38:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:46.347 08:38:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1566325 00:25:46.347 08:38:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:46.348 08:38:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:46.348 08:38:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1566325' 00:25:46.348 killing process with pid 1566325 00:25:46.348 08:38:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@967 -- # kill 1566325 00:25:46.348 Received shutdown signal, test time was about 60.000000 seconds 00:25:46.348 00:25:46.348 Latency(us) 00:25:46.348 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:46.348 =================================================================================================================== 00:25:46.348 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:25:46.348 [2024-07-23 08:38:58.645307] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:46.348 [2024-07-23 08:38:58.645427] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:46.348 08:38:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@972 -- # wait 1566325 00:25:46.348 [2024-07-23 08:38:58.645481] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: 
*DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:46.348 [2024-07-23 08:38:58.645493] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000038a80 name raid_bdev1, state offline 00:25:46.607 [2024-07-23 08:38:58.889356] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:47.985 08:39:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@784 -- # return 0 00:25:47.985 00:25:47.985 real 0m27.511s 00:25:47.985 user 0m41.020s 00:25:47.985 sys 0m3.420s 00:25:47.985 08:39:00 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:47.985 08:39:00 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:47.985 ************************************ 00:25:47.985 END TEST raid_rebuild_test_sb_4k 00:25:47.985 ************************************ 00:25:47.985 08:39:00 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:47.985 08:39:00 bdev_raid -- bdev/bdev_raid.sh@904 -- # base_malloc_params='-m 32' 00:25:47.985 08:39:00 bdev_raid -- bdev/bdev_raid.sh@905 -- # run_test raid_state_function_test_sb_md_separate raid_state_function_test raid1 2 true 00:25:47.985 08:39:00 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:25:47.985 08:39:00 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:47.985 08:39:00 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:47.985 ************************************ 00:25:47.985 START TEST raid_state_function_test_sb_md_separate 00:25:47.985 ************************************ 00:25:47.985 08:39:00 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:25:47.985 08:39:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:25:47.985 08:39:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 
00:25:47.985 08:39:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:25:47.985 08:39:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:25:47.985 08:39:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:25:47.985 08:39:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:25:47.985 08:39:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:25:47.985 08:39:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:25:47.985 08:39:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:25:47.985 08:39:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:25:47.985 08:39:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:25:47.985 08:39:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:25:47.985 08:39:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:25:47.985 08:39:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:25:47.985 08:39:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:25:47.985 08:39:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # local strip_size 00:25:47.985 08:39:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:25:47.985 08:39:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:25:47.985 08:39:00 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:25:47.985 08:39:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:25:47.985 08:39:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:25:47.985 08:39:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:25:47.985 08:39:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@244 -- # raid_pid=1571829 00:25:47.985 08:39:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1571829' 00:25:47.985 Process raid pid: 1571829 00:25:47.985 08:39:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:25:47.985 08:39:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@246 -- # waitforlisten 1571829 /var/tmp/spdk-raid.sock 00:25:47.985 08:39:00 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 1571829 ']' 00:25:47.985 08:39:00 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:47.985 08:39:00 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:47.985 08:39:00 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:47.985 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:25:47.985 08:39:00 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:47.985 08:39:00 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:47.985 [2024-07-23 08:39:00.337639] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:25:47.985 [2024-07-23 08:39:00.337733] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:25:47.985 [2024-07-23 08:39:00.462485] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:48.245 [2024-07-23 08:39:00.691773] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:48.503 [2024-07-23 08:39:00.982603] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:48.503 [2024-07-23 08:39:00.982645] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:48.762 08:39:01 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:48.762 08:39:01 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:25:48.762 08:39:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:25:49.021 [2024-07-23 08:39:01.292912] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:25:49.021 [2024-07-23 08:39:01.292960] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:25:49.021 [2024-07-23 08:39:01.292970] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:25:49.021 
[2024-07-23 08:39:01.292980] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:25:49.021 08:39:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:25:49.021 08:39:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:49.021 08:39:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:49.021 08:39:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:49.021 08:39:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:49.021 08:39:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:49.021 08:39:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:49.021 08:39:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:49.021 08:39:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:49.021 08:39:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:49.021 08:39:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:49.021 08:39:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:49.021 08:39:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:49.021 "name": "Existed_Raid", 00:25:49.021 "uuid": 
"3a270318-0b77-4a8b-8b82-64c9c3a413f5", 00:25:49.021 "strip_size_kb": 0, 00:25:49.021 "state": "configuring", 00:25:49.021 "raid_level": "raid1", 00:25:49.021 "superblock": true, 00:25:49.021 "num_base_bdevs": 2, 00:25:49.021 "num_base_bdevs_discovered": 0, 00:25:49.021 "num_base_bdevs_operational": 2, 00:25:49.021 "base_bdevs_list": [ 00:25:49.021 { 00:25:49.021 "name": "BaseBdev1", 00:25:49.021 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:49.021 "is_configured": false, 00:25:49.021 "data_offset": 0, 00:25:49.021 "data_size": 0 00:25:49.021 }, 00:25:49.021 { 00:25:49.021 "name": "BaseBdev2", 00:25:49.021 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:49.021 "is_configured": false, 00:25:49.021 "data_offset": 0, 00:25:49.021 "data_size": 0 00:25:49.021 } 00:25:49.021 ] 00:25:49.021 }' 00:25:49.021 08:39:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:49.021 08:39:01 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:49.589 08:39:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:25:49.848 [2024-07-23 08:39:02.143021] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:25:49.848 [2024-07-23 08:39:02.143056] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034580 name Existed_Raid, state configuring 00:25:49.848 08:39:02 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:25:49.848 [2024-07-23 08:39:02.323532] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:25:49.848 [2024-07-23 08:39:02.323576] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:25:49.848 [2024-07-23 08:39:02.323586] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:25:49.848 [2024-07-23 08:39:02.323598] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:25:49.848 08:39:02 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1 00:25:50.106 [2024-07-23 08:39:02.560437] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:50.106 BaseBdev1 00:25:50.106 08:39:02 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:25:50.106 08:39:02 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:25:50.106 08:39:02 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:25:50.106 08:39:02 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:25:50.106 08:39:02 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:25:50.106 08:39:02 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:25:50.106 08:39:02 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:25:50.365 08:39:02 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:25:50.623 [ 00:25:50.623 { 00:25:50.623 "name": 
"BaseBdev1", 00:25:50.623 "aliases": [ 00:25:50.623 "7d9a2fa7-debe-4bed-ac59-9ef8d5c2ff90" 00:25:50.623 ], 00:25:50.623 "product_name": "Malloc disk", 00:25:50.623 "block_size": 4096, 00:25:50.623 "num_blocks": 8192, 00:25:50.623 "uuid": "7d9a2fa7-debe-4bed-ac59-9ef8d5c2ff90", 00:25:50.623 "md_size": 32, 00:25:50.623 "md_interleave": false, 00:25:50.623 "dif_type": 0, 00:25:50.623 "assigned_rate_limits": { 00:25:50.623 "rw_ios_per_sec": 0, 00:25:50.623 "rw_mbytes_per_sec": 0, 00:25:50.623 "r_mbytes_per_sec": 0, 00:25:50.623 "w_mbytes_per_sec": 0 00:25:50.623 }, 00:25:50.623 "claimed": true, 00:25:50.623 "claim_type": "exclusive_write", 00:25:50.623 "zoned": false, 00:25:50.623 "supported_io_types": { 00:25:50.623 "read": true, 00:25:50.623 "write": true, 00:25:50.623 "unmap": true, 00:25:50.623 "flush": true, 00:25:50.623 "reset": true, 00:25:50.623 "nvme_admin": false, 00:25:50.623 "nvme_io": false, 00:25:50.623 "nvme_io_md": false, 00:25:50.623 "write_zeroes": true, 00:25:50.623 "zcopy": true, 00:25:50.623 "get_zone_info": false, 00:25:50.623 "zone_management": false, 00:25:50.623 "zone_append": false, 00:25:50.623 "compare": false, 00:25:50.623 "compare_and_write": false, 00:25:50.623 "abort": true, 00:25:50.623 "seek_hole": false, 00:25:50.623 "seek_data": false, 00:25:50.623 "copy": true, 00:25:50.623 "nvme_iov_md": false 00:25:50.623 }, 00:25:50.623 "memory_domains": [ 00:25:50.623 { 00:25:50.623 "dma_device_id": "system", 00:25:50.623 "dma_device_type": 1 00:25:50.623 }, 00:25:50.623 { 00:25:50.623 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:50.623 "dma_device_type": 2 00:25:50.623 } 00:25:50.623 ], 00:25:50.623 "driver_specific": {} 00:25:50.623 } 00:25:50.623 ] 00:25:50.623 08:39:02 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:25:50.623 08:39:02 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:25:50.623 
08:39:02 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:50.623 08:39:02 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:50.624 08:39:02 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:50.624 08:39:02 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:50.624 08:39:02 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:50.624 08:39:02 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:50.624 08:39:02 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:50.624 08:39:02 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:50.624 08:39:02 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:50.624 08:39:02 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:50.624 08:39:02 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:50.624 08:39:03 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:50.624 "name": "Existed_Raid", 00:25:50.624 "uuid": "6c6251be-28d0-4458-b716-3a76bde5fd72", 00:25:50.624 "strip_size_kb": 0, 00:25:50.624 "state": "configuring", 00:25:50.624 "raid_level": "raid1", 00:25:50.624 "superblock": true, 00:25:50.624 "num_base_bdevs": 2, 00:25:50.624 "num_base_bdevs_discovered": 1, 00:25:50.624 "num_base_bdevs_operational": 2, 00:25:50.624 
"base_bdevs_list": [ 00:25:50.624 { 00:25:50.624 "name": "BaseBdev1", 00:25:50.624 "uuid": "7d9a2fa7-debe-4bed-ac59-9ef8d5c2ff90", 00:25:50.624 "is_configured": true, 00:25:50.624 "data_offset": 256, 00:25:50.624 "data_size": 7936 00:25:50.624 }, 00:25:50.624 { 00:25:50.624 "name": "BaseBdev2", 00:25:50.624 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:50.624 "is_configured": false, 00:25:50.624 "data_offset": 0, 00:25:50.624 "data_size": 0 00:25:50.624 } 00:25:50.624 ] 00:25:50.624 }' 00:25:50.624 08:39:03 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:50.624 08:39:03 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:51.190 08:39:03 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:25:51.448 [2024-07-23 08:39:03.759676] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:25:51.448 [2024-07-23 08:39:03.759735] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034880 name Existed_Raid, state configuring 00:25:51.448 08:39:03 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:25:51.448 [2024-07-23 08:39:03.928166] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:51.448 [2024-07-23 08:39:03.929788] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:25:51.448 [2024-07-23 08:39:03.929826] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:25:51.448 08:39:03 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 
00:25:51.448 08:39:03 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:25:51.448 08:39:03 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:25:51.448 08:39:03 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:51.448 08:39:03 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:51.448 08:39:03 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:51.448 08:39:03 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:51.448 08:39:03 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:51.448 08:39:03 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:51.448 08:39:03 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:51.448 08:39:03 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:51.448 08:39:03 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:51.448 08:39:03 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:51.448 08:39:03 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:51.707 08:39:04 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:51.707 "name": "Existed_Raid", 00:25:51.707 "uuid": 
"d0897014-621c-4cdf-b4e6-5c4e3c3e0bb2", 00:25:51.707 "strip_size_kb": 0, 00:25:51.707 "state": "configuring", 00:25:51.707 "raid_level": "raid1", 00:25:51.707 "superblock": true, 00:25:51.707 "num_base_bdevs": 2, 00:25:51.707 "num_base_bdevs_discovered": 1, 00:25:51.707 "num_base_bdevs_operational": 2, 00:25:51.707 "base_bdevs_list": [ 00:25:51.707 { 00:25:51.707 "name": "BaseBdev1", 00:25:51.707 "uuid": "7d9a2fa7-debe-4bed-ac59-9ef8d5c2ff90", 00:25:51.707 "is_configured": true, 00:25:51.707 "data_offset": 256, 00:25:51.707 "data_size": 7936 00:25:51.707 }, 00:25:51.707 { 00:25:51.707 "name": "BaseBdev2", 00:25:51.707 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:51.707 "is_configured": false, 00:25:51.707 "data_offset": 0, 00:25:51.707 "data_size": 0 00:25:51.707 } 00:25:51.707 ] 00:25:51.707 }' 00:25:51.707 08:39:04 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:51.707 08:39:04 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:52.289 08:39:04 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2 00:25:52.566 [2024-07-23 08:39:04.805662] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:52.566 [2024-07-23 08:39:04.805880] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000035180 00:25:52.566 [2024-07-23 08:39:04.805896] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:25:52.566 [2024-07-23 08:39:04.805985] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bdf0 00:25:52.567 [2024-07-23 08:39:04.806144] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000035180 00:25:52.567 [2024-07-23 08:39:04.806159] bdev_raid.c:1751:raid_bdev_configure_cont: 
*DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000035180 00:25:52.567 [2024-07-23 08:39:04.806276] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:52.567 BaseBdev2 00:25:52.567 08:39:04 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:25:52.567 08:39:04 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:25:52.567 08:39:04 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:25:52.567 08:39:04 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:25:52.567 08:39:04 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:25:52.567 08:39:04 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:25:52.567 08:39:04 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:25:52.567 08:39:04 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:25:52.825 [ 00:25:52.825 { 00:25:52.825 "name": "BaseBdev2", 00:25:52.825 "aliases": [ 00:25:52.825 "e35fded1-b595-4df7-8795-1a5dd82a8871" 00:25:52.825 ], 00:25:52.825 "product_name": "Malloc disk", 00:25:52.825 "block_size": 4096, 00:25:52.825 "num_blocks": 8192, 00:25:52.825 "uuid": "e35fded1-b595-4df7-8795-1a5dd82a8871", 00:25:52.825 "md_size": 32, 00:25:52.825 "md_interleave": false, 00:25:52.825 "dif_type": 0, 00:25:52.825 "assigned_rate_limits": { 00:25:52.825 "rw_ios_per_sec": 0, 00:25:52.825 "rw_mbytes_per_sec": 0, 00:25:52.825 "r_mbytes_per_sec": 0, 
00:25:52.825 "w_mbytes_per_sec": 0 00:25:52.825 }, 00:25:52.825 "claimed": true, 00:25:52.825 "claim_type": "exclusive_write", 00:25:52.825 "zoned": false, 00:25:52.825 "supported_io_types": { 00:25:52.825 "read": true, 00:25:52.825 "write": true, 00:25:52.825 "unmap": true, 00:25:52.825 "flush": true, 00:25:52.825 "reset": true, 00:25:52.825 "nvme_admin": false, 00:25:52.825 "nvme_io": false, 00:25:52.825 "nvme_io_md": false, 00:25:52.825 "write_zeroes": true, 00:25:52.825 "zcopy": true, 00:25:52.825 "get_zone_info": false, 00:25:52.825 "zone_management": false, 00:25:52.825 "zone_append": false, 00:25:52.825 "compare": false, 00:25:52.825 "compare_and_write": false, 00:25:52.825 "abort": true, 00:25:52.825 "seek_hole": false, 00:25:52.825 "seek_data": false, 00:25:52.825 "copy": true, 00:25:52.825 "nvme_iov_md": false 00:25:52.825 }, 00:25:52.825 "memory_domains": [ 00:25:52.825 { 00:25:52.825 "dma_device_id": "system", 00:25:52.825 "dma_device_type": 1 00:25:52.825 }, 00:25:52.825 { 00:25:52.825 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:52.825 "dma_device_type": 2 00:25:52.825 } 00:25:52.825 ], 00:25:52.825 "driver_specific": {} 00:25:52.825 } 00:25:52.825 ] 00:25:52.825 08:39:05 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:25:52.825 08:39:05 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:25:52.825 08:39:05 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:25:52.825 08:39:05 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:25:52.825 08:39:05 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:52.825 08:39:05 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:52.825 08:39:05 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:52.825 08:39:05 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:52.825 08:39:05 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:52.825 08:39:05 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:52.825 08:39:05 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:52.825 08:39:05 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:52.825 08:39:05 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:52.825 08:39:05 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:52.825 08:39:05 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:52.825 08:39:05 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:52.825 "name": "Existed_Raid", 00:25:52.825 "uuid": "d0897014-621c-4cdf-b4e6-5c4e3c3e0bb2", 00:25:52.825 "strip_size_kb": 0, 00:25:52.825 "state": "online", 00:25:52.825 "raid_level": "raid1", 00:25:52.825 "superblock": true, 00:25:52.825 "num_base_bdevs": 2, 00:25:52.825 "num_base_bdevs_discovered": 2, 00:25:52.825 "num_base_bdevs_operational": 2, 00:25:52.825 "base_bdevs_list": [ 00:25:52.825 { 00:25:52.825 "name": "BaseBdev1", 00:25:52.825 "uuid": "7d9a2fa7-debe-4bed-ac59-9ef8d5c2ff90", 00:25:52.825 "is_configured": true, 00:25:52.825 "data_offset": 256, 00:25:52.825 "data_size": 7936 00:25:52.826 }, 00:25:52.826 { 00:25:52.826 "name": 
"BaseBdev2", 00:25:52.826 "uuid": "e35fded1-b595-4df7-8795-1a5dd82a8871", 00:25:52.826 "is_configured": true, 00:25:52.826 "data_offset": 256, 00:25:52.826 "data_size": 7936 00:25:52.826 } 00:25:52.826 ] 00:25:52.826 }' 00:25:52.826 08:39:05 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:52.826 08:39:05 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:53.392 08:39:05 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:25:53.392 08:39:05 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:25:53.392 08:39:05 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:25:53.392 08:39:05 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:25:53.392 08:39:05 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:25:53.392 08:39:05 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:25:53.392 08:39:05 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:25:53.392 08:39:05 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:25:53.651 [2024-07-23 08:39:05.985154] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:53.651 08:39:05 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:25:53.651 "name": "Existed_Raid", 00:25:53.651 "aliases": [ 00:25:53.651 "d0897014-621c-4cdf-b4e6-5c4e3c3e0bb2" 00:25:53.651 ], 00:25:53.651 "product_name": "Raid Volume", 00:25:53.651 "block_size": 4096, 
00:25:53.651 "num_blocks": 7936, 00:25:53.651 "uuid": "d0897014-621c-4cdf-b4e6-5c4e3c3e0bb2", 00:25:53.651 "md_size": 32, 00:25:53.651 "md_interleave": false, 00:25:53.651 "dif_type": 0, 00:25:53.651 "assigned_rate_limits": { 00:25:53.651 "rw_ios_per_sec": 0, 00:25:53.651 "rw_mbytes_per_sec": 0, 00:25:53.651 "r_mbytes_per_sec": 0, 00:25:53.651 "w_mbytes_per_sec": 0 00:25:53.651 }, 00:25:53.651 "claimed": false, 00:25:53.651 "zoned": false, 00:25:53.651 "supported_io_types": { 00:25:53.651 "read": true, 00:25:53.651 "write": true, 00:25:53.651 "unmap": false, 00:25:53.651 "flush": false, 00:25:53.651 "reset": true, 00:25:53.651 "nvme_admin": false, 00:25:53.651 "nvme_io": false, 00:25:53.651 "nvme_io_md": false, 00:25:53.651 "write_zeroes": true, 00:25:53.651 "zcopy": false, 00:25:53.651 "get_zone_info": false, 00:25:53.651 "zone_management": false, 00:25:53.651 "zone_append": false, 00:25:53.651 "compare": false, 00:25:53.651 "compare_and_write": false, 00:25:53.651 "abort": false, 00:25:53.651 "seek_hole": false, 00:25:53.651 "seek_data": false, 00:25:53.651 "copy": false, 00:25:53.651 "nvme_iov_md": false 00:25:53.651 }, 00:25:53.651 "memory_domains": [ 00:25:53.651 { 00:25:53.651 "dma_device_id": "system", 00:25:53.651 "dma_device_type": 1 00:25:53.651 }, 00:25:53.651 { 00:25:53.651 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:53.651 "dma_device_type": 2 00:25:53.651 }, 00:25:53.651 { 00:25:53.651 "dma_device_id": "system", 00:25:53.651 "dma_device_type": 1 00:25:53.651 }, 00:25:53.651 { 00:25:53.651 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:53.651 "dma_device_type": 2 00:25:53.651 } 00:25:53.651 ], 00:25:53.651 "driver_specific": { 00:25:53.651 "raid": { 00:25:53.651 "uuid": "d0897014-621c-4cdf-b4e6-5c4e3c3e0bb2", 00:25:53.651 "strip_size_kb": 0, 00:25:53.651 "state": "online", 00:25:53.651 "raid_level": "raid1", 00:25:53.651 "superblock": true, 00:25:53.651 "num_base_bdevs": 2, 00:25:53.651 "num_base_bdevs_discovered": 2, 00:25:53.651 
"num_base_bdevs_operational": 2, 00:25:53.651 "base_bdevs_list": [ 00:25:53.651 { 00:25:53.651 "name": "BaseBdev1", 00:25:53.651 "uuid": "7d9a2fa7-debe-4bed-ac59-9ef8d5c2ff90", 00:25:53.651 "is_configured": true, 00:25:53.651 "data_offset": 256, 00:25:53.651 "data_size": 7936 00:25:53.651 }, 00:25:53.651 { 00:25:53.651 "name": "BaseBdev2", 00:25:53.651 "uuid": "e35fded1-b595-4df7-8795-1a5dd82a8871", 00:25:53.651 "is_configured": true, 00:25:53.651 "data_offset": 256, 00:25:53.651 "data_size": 7936 00:25:53.651 } 00:25:53.651 ] 00:25:53.651 } 00:25:53.651 } 00:25:53.651 }' 00:25:53.651 08:39:06 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:25:53.651 08:39:06 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:25:53.651 BaseBdev2' 00:25:53.651 08:39:06 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:53.651 08:39:06 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:25:53.651 08:39:06 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:53.909 08:39:06 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:53.909 "name": "BaseBdev1", 00:25:53.909 "aliases": [ 00:25:53.909 "7d9a2fa7-debe-4bed-ac59-9ef8d5c2ff90" 00:25:53.909 ], 00:25:53.909 "product_name": "Malloc disk", 00:25:53.909 "block_size": 4096, 00:25:53.909 "num_blocks": 8192, 00:25:53.909 "uuid": "7d9a2fa7-debe-4bed-ac59-9ef8d5c2ff90", 00:25:53.909 "md_size": 32, 00:25:53.909 "md_interleave": false, 00:25:53.909 "dif_type": 0, 00:25:53.909 "assigned_rate_limits": { 00:25:53.909 "rw_ios_per_sec": 0, 00:25:53.909 
"rw_mbytes_per_sec": 0, 00:25:53.909 "r_mbytes_per_sec": 0, 00:25:53.909 "w_mbytes_per_sec": 0 00:25:53.909 }, 00:25:53.909 "claimed": true, 00:25:53.909 "claim_type": "exclusive_write", 00:25:53.909 "zoned": false, 00:25:53.909 "supported_io_types": { 00:25:53.909 "read": true, 00:25:53.909 "write": true, 00:25:53.909 "unmap": true, 00:25:53.909 "flush": true, 00:25:53.909 "reset": true, 00:25:53.909 "nvme_admin": false, 00:25:53.909 "nvme_io": false, 00:25:53.909 "nvme_io_md": false, 00:25:53.909 "write_zeroes": true, 00:25:53.909 "zcopy": true, 00:25:53.909 "get_zone_info": false, 00:25:53.909 "zone_management": false, 00:25:53.909 "zone_append": false, 00:25:53.909 "compare": false, 00:25:53.909 "compare_and_write": false, 00:25:53.909 "abort": true, 00:25:53.909 "seek_hole": false, 00:25:53.909 "seek_data": false, 00:25:53.909 "copy": true, 00:25:53.909 "nvme_iov_md": false 00:25:53.909 }, 00:25:53.909 "memory_domains": [ 00:25:53.909 { 00:25:53.909 "dma_device_id": "system", 00:25:53.909 "dma_device_type": 1 00:25:53.910 }, 00:25:53.910 { 00:25:53.910 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:53.910 "dma_device_type": 2 00:25:53.910 } 00:25:53.910 ], 00:25:53.910 "driver_specific": {} 00:25:53.910 }' 00:25:53.910 08:39:06 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:53.910 08:39:06 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:53.910 08:39:06 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:25:53.910 08:39:06 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:53.910 08:39:06 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:53.910 08:39:06 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:25:53.910 08:39:06 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:53.910 08:39:06 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:54.168 08:39:06 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:25:54.168 08:39:06 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:54.168 08:39:06 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:54.168 08:39:06 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:25:54.168 08:39:06 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:54.168 08:39:06 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:54.168 08:39:06 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:25:54.427 08:39:06 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:54.427 "name": "BaseBdev2", 00:25:54.427 "aliases": [ 00:25:54.427 "e35fded1-b595-4df7-8795-1a5dd82a8871" 00:25:54.427 ], 00:25:54.427 "product_name": "Malloc disk", 00:25:54.427 "block_size": 4096, 00:25:54.427 "num_blocks": 8192, 00:25:54.427 "uuid": "e35fded1-b595-4df7-8795-1a5dd82a8871", 00:25:54.427 "md_size": 32, 00:25:54.427 "md_interleave": false, 00:25:54.427 "dif_type": 0, 00:25:54.427 "assigned_rate_limits": { 00:25:54.427 "rw_ios_per_sec": 0, 00:25:54.427 "rw_mbytes_per_sec": 0, 00:25:54.427 "r_mbytes_per_sec": 0, 00:25:54.427 "w_mbytes_per_sec": 0 00:25:54.427 }, 00:25:54.427 "claimed": true, 00:25:54.427 "claim_type": "exclusive_write", 00:25:54.427 "zoned": false, 00:25:54.427 "supported_io_types": { 
00:25:54.427 "read": true, 00:25:54.427 "write": true, 00:25:54.427 "unmap": true, 00:25:54.427 "flush": true, 00:25:54.427 "reset": true, 00:25:54.427 "nvme_admin": false, 00:25:54.427 "nvme_io": false, 00:25:54.427 "nvme_io_md": false, 00:25:54.427 "write_zeroes": true, 00:25:54.427 "zcopy": true, 00:25:54.427 "get_zone_info": false, 00:25:54.427 "zone_management": false, 00:25:54.427 "zone_append": false, 00:25:54.427 "compare": false, 00:25:54.427 "compare_and_write": false, 00:25:54.427 "abort": true, 00:25:54.427 "seek_hole": false, 00:25:54.427 "seek_data": false, 00:25:54.427 "copy": true, 00:25:54.427 "nvme_iov_md": false 00:25:54.427 }, 00:25:54.427 "memory_domains": [ 00:25:54.427 { 00:25:54.427 "dma_device_id": "system", 00:25:54.427 "dma_device_type": 1 00:25:54.427 }, 00:25:54.427 { 00:25:54.427 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:54.427 "dma_device_type": 2 00:25:54.427 } 00:25:54.427 ], 00:25:54.427 "driver_specific": {} 00:25:54.427 }' 00:25:54.427 08:39:06 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:54.427 08:39:06 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:54.427 08:39:06 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:25:54.427 08:39:06 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:54.427 08:39:06 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:54.427 08:39:06 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:25:54.427 08:39:06 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:54.427 08:39:06 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:54.686 08:39:06 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:25:54.686 08:39:06 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:54.686 08:39:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:54.686 08:39:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:25:54.686 08:39:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:25:54.686 [2024-07-23 08:39:07.188356] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:54.946 08:39:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@275 -- # local expected_state 00:25:54.946 08:39:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:25:54.946 08:39:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:25:54.946 08:39:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:25:54.946 08:39:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:25:54.946 08:39:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:25:54.946 08:39:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:54.946 08:39:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:54.946 08:39:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:54.946 08:39:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:54.946 
08:39:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:54.946 08:39:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:54.946 08:39:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:54.946 08:39:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:54.946 08:39:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:54.946 08:39:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:54.946 08:39:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:55.205 08:39:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:55.205 "name": "Existed_Raid", 00:25:55.205 "uuid": "d0897014-621c-4cdf-b4e6-5c4e3c3e0bb2", 00:25:55.205 "strip_size_kb": 0, 00:25:55.205 "state": "online", 00:25:55.205 "raid_level": "raid1", 00:25:55.205 "superblock": true, 00:25:55.205 "num_base_bdevs": 2, 00:25:55.205 "num_base_bdevs_discovered": 1, 00:25:55.205 "num_base_bdevs_operational": 1, 00:25:55.205 "base_bdevs_list": [ 00:25:55.205 { 00:25:55.205 "name": null, 00:25:55.205 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:55.205 "is_configured": false, 00:25:55.205 "data_offset": 256, 00:25:55.205 "data_size": 7936 00:25:55.205 }, 00:25:55.205 { 00:25:55.205 "name": "BaseBdev2", 00:25:55.205 "uuid": "e35fded1-b595-4df7-8795-1a5dd82a8871", 00:25:55.205 "is_configured": true, 00:25:55.205 "data_offset": 256, 00:25:55.205 "data_size": 7936 00:25:55.205 } 00:25:55.205 ] 00:25:55.205 }' 00:25:55.205 08:39:07 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:55.205 08:39:07 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:55.773 08:39:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:25:55.773 08:39:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:25:55.773 08:39:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:55.773 08:39:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:25:55.773 08:39:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:25:55.773 08:39:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:25:55.773 08:39:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:25:56.032 [2024-07-23 08:39:08.308215] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:25:56.032 [2024-07-23 08:39:08.308324] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:56.032 [2024-07-23 08:39:08.411836] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:56.032 [2024-07-23 08:39:08.411889] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:56.032 [2024-07-23 08:39:08.411902] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000035180 name Existed_Raid, state offline 00:25:56.032 08:39:08 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:25:56.032 08:39:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:25:56.032 08:39:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:56.032 08:39:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:25:56.291 08:39:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:25:56.292 08:39:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:25:56.292 08:39:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:25:56.292 08:39:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@341 -- # killprocess 1571829 00:25:56.292 08:39:08 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 1571829 ']' 00:25:56.292 08:39:08 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 1571829 00:25:56.292 08:39:08 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:25:56.292 08:39:08 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:56.292 08:39:08 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1571829 00:25:56.292 08:39:08 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:56.292 08:39:08 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:56.292 08:39:08 
bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1571829' 00:25:56.292 killing process with pid 1571829 00:25:56.292 08:39:08 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 1571829 00:25:56.292 [2024-07-23 08:39:08.642409] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:56.292 08:39:08 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 1571829 00:25:56.292 [2024-07-23 08:39:08.659124] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:57.673 08:39:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@343 -- # return 0 00:25:57.673 00:25:57.673 real 0m9.669s 00:25:57.673 user 0m16.028s 00:25:57.673 sys 0m1.525s 00:25:57.673 08:39:09 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:57.673 08:39:09 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:57.673 ************************************ 00:25:57.673 END TEST raid_state_function_test_sb_md_separate 00:25:57.673 ************************************ 00:25:57.673 08:39:09 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:57.673 08:39:09 bdev_raid -- bdev/bdev_raid.sh@906 -- # run_test raid_superblock_test_md_separate raid_superblock_test raid1 2 00:25:57.673 08:39:09 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:25:57.673 08:39:09 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:57.673 08:39:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:57.673 ************************************ 00:25:57.673 START TEST raid_superblock_test_md_separate 00:25:57.673 ************************************ 00:25:57.673 08:39:09 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 
00:25:57.673 08:39:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:25:57.673 08:39:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:25:57.673 08:39:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:25:57.673 08:39:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:25:57.673 08:39:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:25:57.673 08:39:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:25:57.673 08:39:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:25:57.673 08:39:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:25:57.673 08:39:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:25:57.673 08:39:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@398 -- # local strip_size 00:25:57.673 08:39:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:25:57.673 08:39:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:25:57.673 08:39:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:25:57.673 08:39:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:25:57.673 08:39:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:25:57.673 08:39:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@411 -- # raid_pid=1573817 00:25:57.673 08:39:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@412 -- # waitforlisten 1573817 
/var/tmp/spdk-raid.sock 00:25:57.673 08:39:09 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@829 -- # '[' -z 1573817 ']' 00:25:57.673 08:39:09 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:57.673 08:39:09 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:57.673 08:39:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:25:57.673 08:39:09 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:57.673 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:57.673 08:39:09 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:57.673 08:39:09 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:57.673 [2024-07-23 08:39:10.068199] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:25:57.673 [2024-07-23 08:39:10.068307] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1573817 ] 00:25:57.673 [2024-07-23 08:39:10.190391] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:57.932 [2024-07-23 08:39:10.402822] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:58.191 [2024-07-23 08:39:10.656503] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:58.191 [2024-07-23 08:39:10.656537] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:58.450 08:39:10 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:58.450 08:39:10 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@862 -- # return 0 00:25:58.450 08:39:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:25:58.450 08:39:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:25:58.450 08:39:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:25:58.450 08:39:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:25:58.450 08:39:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:25:58.450 08:39:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:25:58.450 08:39:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:25:58.450 08:39:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:25:58.450 08:39:10 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc1 00:25:58.710 malloc1 00:25:58.710 08:39:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:25:58.710 [2024-07-23 08:39:11.225340] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:25:58.710 [2024-07-23 08:39:11.225398] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:58.710 [2024-07-23 08:39:11.225426] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034880 00:25:58.710 [2024-07-23 08:39:11.225436] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:58.710 [2024-07-23 08:39:11.227205] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:58.710 [2024-07-23 08:39:11.227233] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:25:58.969 pt1 00:25:58.969 08:39:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:25:58.969 08:39:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:25:58.969 08:39:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:25:58.969 08:39:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:25:58.969 08:39:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:25:58.969 08:39:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:25:58.969 08:39:11 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:25:58.969 08:39:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:25:58.969 08:39:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc2 00:25:58.969 malloc2 00:25:58.969 08:39:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:25:59.228 [2024-07-23 08:39:11.607518] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:25:59.228 [2024-07-23 08:39:11.607568] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:59.228 [2024-07-23 08:39:11.607604] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035480 00:25:59.228 [2024-07-23 08:39:11.607619] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:59.228 [2024-07-23 08:39:11.609442] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:59.228 [2024-07-23 08:39:11.609466] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:25:59.228 pt2 00:25:59.228 08:39:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:25:59.228 08:39:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:25:59.228 08:39:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:25:59.488 [2024-07-23 08:39:11.775987] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:25:59.488 [2024-07-23 08:39:11.777595] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:25:59.488 [2024-07-23 08:39:11.777836] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000035a80 00:25:59.488 [2024-07-23 08:39:11.777851] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:25:59.488 [2024-07-23 08:39:11.777942] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bdf0 00:25:59.488 [2024-07-23 08:39:11.778096] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000035a80 00:25:59.488 [2024-07-23 08:39:11.778109] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000035a80 00:25:59.488 [2024-07-23 08:39:11.778215] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:59.488 08:39:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:59.488 08:39:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:59.488 08:39:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:59.488 08:39:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:59.488 08:39:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:59.488 08:39:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:59.488 08:39:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:59.488 08:39:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:59.488 08:39:11 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:59.488 08:39:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:59.488 08:39:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:59.488 08:39:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:59.488 08:39:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:59.488 "name": "raid_bdev1", 00:25:59.488 "uuid": "0dd889d7-33d3-4936-996b-f104b7310a2d", 00:25:59.488 "strip_size_kb": 0, 00:25:59.488 "state": "online", 00:25:59.488 "raid_level": "raid1", 00:25:59.488 "superblock": true, 00:25:59.488 "num_base_bdevs": 2, 00:25:59.488 "num_base_bdevs_discovered": 2, 00:25:59.488 "num_base_bdevs_operational": 2, 00:25:59.488 "base_bdevs_list": [ 00:25:59.488 { 00:25:59.488 "name": "pt1", 00:25:59.488 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:59.488 "is_configured": true, 00:25:59.488 "data_offset": 256, 00:25:59.488 "data_size": 7936 00:25:59.488 }, 00:25:59.488 { 00:25:59.488 "name": "pt2", 00:25:59.488 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:59.488 "is_configured": true, 00:25:59.488 "data_offset": 256, 00:25:59.488 "data_size": 7936 00:25:59.488 } 00:25:59.488 ] 00:25:59.488 }' 00:25:59.488 08:39:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:59.488 08:39:11 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:00.056 08:39:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:26:00.056 08:39:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local 
raid_bdev_name=raid_bdev1 00:26:00.056 08:39:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:00.056 08:39:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:00.056 08:39:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:00.056 08:39:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:26:00.056 08:39:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:00.056 08:39:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:00.315 [2024-07-23 08:39:12.582326] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:00.315 08:39:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:00.315 "name": "raid_bdev1", 00:26:00.315 "aliases": [ 00:26:00.315 "0dd889d7-33d3-4936-996b-f104b7310a2d" 00:26:00.315 ], 00:26:00.315 "product_name": "Raid Volume", 00:26:00.315 "block_size": 4096, 00:26:00.315 "num_blocks": 7936, 00:26:00.315 "uuid": "0dd889d7-33d3-4936-996b-f104b7310a2d", 00:26:00.315 "md_size": 32, 00:26:00.315 "md_interleave": false, 00:26:00.315 "dif_type": 0, 00:26:00.315 "assigned_rate_limits": { 00:26:00.315 "rw_ios_per_sec": 0, 00:26:00.315 "rw_mbytes_per_sec": 0, 00:26:00.315 "r_mbytes_per_sec": 0, 00:26:00.315 "w_mbytes_per_sec": 0 00:26:00.315 }, 00:26:00.315 "claimed": false, 00:26:00.315 "zoned": false, 00:26:00.315 "supported_io_types": { 00:26:00.315 "read": true, 00:26:00.315 "write": true, 00:26:00.315 "unmap": false, 00:26:00.315 "flush": false, 00:26:00.315 "reset": true, 00:26:00.315 "nvme_admin": false, 00:26:00.315 "nvme_io": false, 00:26:00.315 "nvme_io_md": false, 00:26:00.315 "write_zeroes": true, 
00:26:00.315 "zcopy": false, 00:26:00.315 "get_zone_info": false, 00:26:00.315 "zone_management": false, 00:26:00.315 "zone_append": false, 00:26:00.315 "compare": false, 00:26:00.315 "compare_and_write": false, 00:26:00.315 "abort": false, 00:26:00.315 "seek_hole": false, 00:26:00.315 "seek_data": false, 00:26:00.315 "copy": false, 00:26:00.315 "nvme_iov_md": false 00:26:00.315 }, 00:26:00.315 "memory_domains": [ 00:26:00.315 { 00:26:00.315 "dma_device_id": "system", 00:26:00.315 "dma_device_type": 1 00:26:00.315 }, 00:26:00.315 { 00:26:00.315 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:00.315 "dma_device_type": 2 00:26:00.315 }, 00:26:00.315 { 00:26:00.315 "dma_device_id": "system", 00:26:00.315 "dma_device_type": 1 00:26:00.315 }, 00:26:00.315 { 00:26:00.315 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:00.315 "dma_device_type": 2 00:26:00.315 } 00:26:00.315 ], 00:26:00.315 "driver_specific": { 00:26:00.315 "raid": { 00:26:00.315 "uuid": "0dd889d7-33d3-4936-996b-f104b7310a2d", 00:26:00.315 "strip_size_kb": 0, 00:26:00.315 "state": "online", 00:26:00.315 "raid_level": "raid1", 00:26:00.315 "superblock": true, 00:26:00.315 "num_base_bdevs": 2, 00:26:00.315 "num_base_bdevs_discovered": 2, 00:26:00.315 "num_base_bdevs_operational": 2, 00:26:00.315 "base_bdevs_list": [ 00:26:00.315 { 00:26:00.315 "name": "pt1", 00:26:00.315 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:00.315 "is_configured": true, 00:26:00.315 "data_offset": 256, 00:26:00.315 "data_size": 7936 00:26:00.315 }, 00:26:00.315 { 00:26:00.315 "name": "pt2", 00:26:00.315 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:00.315 "is_configured": true, 00:26:00.315 "data_offset": 256, 00:26:00.315 "data_size": 7936 00:26:00.315 } 00:26:00.315 ] 00:26:00.315 } 00:26:00.315 } 00:26:00.315 }' 00:26:00.315 08:39:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:00.315 08:39:12 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:26:00.315 pt2' 00:26:00.315 08:39:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:00.315 08:39:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:00.315 08:39:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:26:00.315 08:39:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:00.315 "name": "pt1", 00:26:00.315 "aliases": [ 00:26:00.315 "00000000-0000-0000-0000-000000000001" 00:26:00.315 ], 00:26:00.315 "product_name": "passthru", 00:26:00.315 "block_size": 4096, 00:26:00.315 "num_blocks": 8192, 00:26:00.315 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:00.315 "md_size": 32, 00:26:00.315 "md_interleave": false, 00:26:00.315 "dif_type": 0, 00:26:00.315 "assigned_rate_limits": { 00:26:00.315 "rw_ios_per_sec": 0, 00:26:00.315 "rw_mbytes_per_sec": 0, 00:26:00.315 "r_mbytes_per_sec": 0, 00:26:00.315 "w_mbytes_per_sec": 0 00:26:00.315 }, 00:26:00.315 "claimed": true, 00:26:00.315 "claim_type": "exclusive_write", 00:26:00.315 "zoned": false, 00:26:00.315 "supported_io_types": { 00:26:00.315 "read": true, 00:26:00.315 "write": true, 00:26:00.315 "unmap": true, 00:26:00.315 "flush": true, 00:26:00.315 "reset": true, 00:26:00.315 "nvme_admin": false, 00:26:00.315 "nvme_io": false, 00:26:00.315 "nvme_io_md": false, 00:26:00.315 "write_zeroes": true, 00:26:00.315 "zcopy": true, 00:26:00.315 "get_zone_info": false, 00:26:00.315 "zone_management": false, 00:26:00.315 "zone_append": false, 00:26:00.315 "compare": false, 00:26:00.315 "compare_and_write": false, 00:26:00.315 "abort": true, 00:26:00.315 "seek_hole": false, 00:26:00.315 "seek_data": false, 00:26:00.315 "copy": true, 00:26:00.315 
"nvme_iov_md": false 00:26:00.315 }, 00:26:00.315 "memory_domains": [ 00:26:00.315 { 00:26:00.315 "dma_device_id": "system", 00:26:00.315 "dma_device_type": 1 00:26:00.315 }, 00:26:00.315 { 00:26:00.315 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:00.315 "dma_device_type": 2 00:26:00.315 } 00:26:00.315 ], 00:26:00.315 "driver_specific": { 00:26:00.315 "passthru": { 00:26:00.315 "name": "pt1", 00:26:00.315 "base_bdev_name": "malloc1" 00:26:00.315 } 00:26:00.315 } 00:26:00.315 }' 00:26:00.315 08:39:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:00.573 08:39:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:00.573 08:39:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:00.573 08:39:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:00.573 08:39:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:00.573 08:39:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:26:00.573 08:39:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:00.573 08:39:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:00.573 08:39:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:26:00.573 08:39:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:00.573 08:39:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:00.831 08:39:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:26:00.831 08:39:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:00.831 08:39:13 bdev_raid.raid_superblock_test_md_separate 
-- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:26:00.831 08:39:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:00.831 08:39:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:00.831 "name": "pt2", 00:26:00.831 "aliases": [ 00:26:00.831 "00000000-0000-0000-0000-000000000002" 00:26:00.831 ], 00:26:00.831 "product_name": "passthru", 00:26:00.831 "block_size": 4096, 00:26:00.831 "num_blocks": 8192, 00:26:00.831 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:00.831 "md_size": 32, 00:26:00.831 "md_interleave": false, 00:26:00.831 "dif_type": 0, 00:26:00.831 "assigned_rate_limits": { 00:26:00.831 "rw_ios_per_sec": 0, 00:26:00.831 "rw_mbytes_per_sec": 0, 00:26:00.831 "r_mbytes_per_sec": 0, 00:26:00.831 "w_mbytes_per_sec": 0 00:26:00.831 }, 00:26:00.831 "claimed": true, 00:26:00.832 "claim_type": "exclusive_write", 00:26:00.832 "zoned": false, 00:26:00.832 "supported_io_types": { 00:26:00.832 "read": true, 00:26:00.832 "write": true, 00:26:00.832 "unmap": true, 00:26:00.832 "flush": true, 00:26:00.832 "reset": true, 00:26:00.832 "nvme_admin": false, 00:26:00.832 "nvme_io": false, 00:26:00.832 "nvme_io_md": false, 00:26:00.832 "write_zeroes": true, 00:26:00.832 "zcopy": true, 00:26:00.832 "get_zone_info": false, 00:26:00.832 "zone_management": false, 00:26:00.832 "zone_append": false, 00:26:00.832 "compare": false, 00:26:00.832 "compare_and_write": false, 00:26:00.832 "abort": true, 00:26:00.832 "seek_hole": false, 00:26:00.832 "seek_data": false, 00:26:00.832 "copy": true, 00:26:00.832 "nvme_iov_md": false 00:26:00.832 }, 00:26:00.832 "memory_domains": [ 00:26:00.832 { 00:26:00.832 "dma_device_id": "system", 00:26:00.832 "dma_device_type": 1 00:26:00.832 }, 00:26:00.832 { 00:26:00.832 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:00.832 "dma_device_type": 2 00:26:00.832 } 
00:26:00.832 ], 00:26:00.832 "driver_specific": { 00:26:00.832 "passthru": { 00:26:00.832 "name": "pt2", 00:26:00.832 "base_bdev_name": "malloc2" 00:26:00.832 } 00:26:00.832 } 00:26:00.832 }' 00:26:00.832 08:39:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:00.832 08:39:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:01.090 08:39:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:01.090 08:39:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:01.090 08:39:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:01.090 08:39:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:26:01.090 08:39:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:01.090 08:39:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:01.090 08:39:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:26:01.090 08:39:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:01.090 08:39:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:01.090 08:39:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:26:01.090 08:39:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:01.091 08:39:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:26:01.348 [2024-07-23 08:39:13.749420] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:01.348 08:39:13 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=0dd889d7-33d3-4936-996b-f104b7310a2d 00:26:01.348 08:39:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@435 -- # '[' -z 0dd889d7-33d3-4936-996b-f104b7310a2d ']' 00:26:01.348 08:39:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:01.607 [2024-07-23 08:39:13.917584] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:01.607 [2024-07-23 08:39:13.917616] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:01.607 [2024-07-23 08:39:13.917696] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:01.607 [2024-07-23 08:39:13.917754] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:01.607 [2024-07-23 08:39:13.917769] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000035a80 name raid_bdev1, state offline 00:26:01.607 08:39:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:01.607 08:39:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:26:01.607 08:39:14 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:26:01.607 08:39:14 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:26:01.607 08:39:14 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:26:01.607 08:39:14 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt1 00:26:01.867 08:39:14 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:26:01.867 08:39:14 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:26:02.126 08:39:14 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:26:02.126 08:39:14 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:26:02.126 08:39:14 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:26:02.126 08:39:14 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:26:02.126 08:39:14 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:26:02.126 08:39:14 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:26:02.126 08:39:14 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:02.126 08:39:14 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:02.126 08:39:14 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:02.126 08:39:14 
bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:02.126 08:39:14 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:02.126 08:39:14 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:02.126 08:39:14 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:02.126 08:39:14 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:26:02.126 08:39:14 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:26:02.385 [2024-07-23 08:39:14.759816] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:26:02.385 [2024-07-23 08:39:14.761429] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:26:02.385 [2024-07-23 08:39:14.761495] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:26:02.385 [2024-07-23 08:39:14.761539] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:26:02.385 [2024-07-23 08:39:14.761553] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:02.385 [2024-07-23 08:39:14.761567] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036080 name raid_bdev1, state configuring 00:26:02.385 request: 00:26:02.385 { 00:26:02.385 "name": "raid_bdev1", 00:26:02.385 "raid_level": "raid1", 00:26:02.385 
"base_bdevs": [ 00:26:02.385 "malloc1", 00:26:02.385 "malloc2" 00:26:02.385 ], 00:26:02.385 "superblock": false, 00:26:02.385 "method": "bdev_raid_create", 00:26:02.385 "req_id": 1 00:26:02.385 } 00:26:02.385 Got JSON-RPC error response 00:26:02.385 response: 00:26:02.385 { 00:26:02.385 "code": -17, 00:26:02.385 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:26:02.385 } 00:26:02.385 08:39:14 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # es=1 00:26:02.385 08:39:14 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:26:02.385 08:39:14 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:26:02.385 08:39:14 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:26:02.385 08:39:14 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:02.386 08:39:14 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:26:02.645 08:39:14 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:26:02.645 08:39:14 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:26:02.645 08:39:14 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:26:02.645 [2024-07-23 08:39:15.128731] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:26:02.645 [2024-07-23 08:39:15.128785] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:02.645 [2024-07-23 08:39:15.128818] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 
0x0x616000036680 00:26:02.645 [2024-07-23 08:39:15.128829] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:02.645 [2024-07-23 08:39:15.130564] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:02.645 [2024-07-23 08:39:15.130594] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:26:02.645 [2024-07-23 08:39:15.130670] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:26:02.645 [2024-07-23 08:39:15.130735] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:26:02.645 pt1 00:26:02.645 08:39:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:26:02.645 08:39:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:02.645 08:39:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:02.645 08:39:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:02.645 08:39:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:02.645 08:39:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:02.645 08:39:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:02.645 08:39:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:02.645 08:39:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:02.645 08:39:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:02.645 08:39:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:02.645 08:39:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:02.904 08:39:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:02.904 "name": "raid_bdev1", 00:26:02.904 "uuid": "0dd889d7-33d3-4936-996b-f104b7310a2d", 00:26:02.904 "strip_size_kb": 0, 00:26:02.904 "state": "configuring", 00:26:02.904 "raid_level": "raid1", 00:26:02.904 "superblock": true, 00:26:02.904 "num_base_bdevs": 2, 00:26:02.904 "num_base_bdevs_discovered": 1, 00:26:02.904 "num_base_bdevs_operational": 2, 00:26:02.904 "base_bdevs_list": [ 00:26:02.904 { 00:26:02.904 "name": "pt1", 00:26:02.904 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:02.904 "is_configured": true, 00:26:02.904 "data_offset": 256, 00:26:02.904 "data_size": 7936 00:26:02.904 }, 00:26:02.904 { 00:26:02.904 "name": null, 00:26:02.904 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:02.904 "is_configured": false, 00:26:02.904 "data_offset": 256, 00:26:02.904 "data_size": 7936 00:26:02.904 } 00:26:02.904 ] 00:26:02.904 }' 00:26:02.904 08:39:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:02.904 08:39:15 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:03.473 08:39:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:26:03.473 08:39:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:26:03.473 08:39:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:26:03.473 08:39:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b 
malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:26:03.473 [2024-07-23 08:39:15.974961] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:26:03.473 [2024-07-23 08:39:15.975023] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:03.473 [2024-07-23 08:39:15.975042] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036f80 00:26:03.473 [2024-07-23 08:39:15.975053] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:03.473 [2024-07-23 08:39:15.975309] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:03.473 [2024-07-23 08:39:15.975324] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:26:03.473 [2024-07-23 08:39:15.975372] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:26:03.473 [2024-07-23 08:39:15.975395] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:03.473 [2024-07-23 08:39:15.975532] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000036c80 00:26:03.473 [2024-07-23 08:39:15.975545] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:03.473 [2024-07-23 08:39:15.975619] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bec0 00:26:03.473 [2024-07-23 08:39:15.975760] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000036c80 00:26:03.473 [2024-07-23 08:39:15.975769] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000036c80 00:26:03.473 [2024-07-23 08:39:15.975864] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:03.473 pt2 00:26:03.473 08:39:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:26:03.473 08:39:15 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:26:03.473 08:39:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:03.473 08:39:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:03.473 08:39:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:03.473 08:39:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:03.473 08:39:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:03.473 08:39:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:03.473 08:39:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:03.473 08:39:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:03.473 08:39:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:03.473 08:39:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:03.732 08:39:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:03.732 08:39:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:03.732 08:39:16 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:03.732 "name": "raid_bdev1", 00:26:03.732 "uuid": "0dd889d7-33d3-4936-996b-f104b7310a2d", 00:26:03.732 "strip_size_kb": 0, 00:26:03.732 "state": "online", 00:26:03.732 "raid_level": "raid1", 00:26:03.732 "superblock": true, 00:26:03.732 "num_base_bdevs": 2, 
00:26:03.732 "num_base_bdevs_discovered": 2, 00:26:03.732 "num_base_bdevs_operational": 2, 00:26:03.732 "base_bdevs_list": [ 00:26:03.732 { 00:26:03.732 "name": "pt1", 00:26:03.732 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:03.732 "is_configured": true, 00:26:03.732 "data_offset": 256, 00:26:03.732 "data_size": 7936 00:26:03.732 }, 00:26:03.732 { 00:26:03.732 "name": "pt2", 00:26:03.732 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:03.732 "is_configured": true, 00:26:03.732 "data_offset": 256, 00:26:03.732 "data_size": 7936 00:26:03.732 } 00:26:03.732 ] 00:26:03.732 }' 00:26:03.732 08:39:16 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:03.732 08:39:16 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:04.300 08:39:16 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:26:04.300 08:39:16 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:26:04.300 08:39:16 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:04.300 08:39:16 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:04.300 08:39:16 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:04.300 08:39:16 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:26:04.300 08:39:16 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:04.300 08:39:16 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:04.300 [2024-07-23 08:39:16.817453] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:04.559 08:39:16 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:04.559 "name": "raid_bdev1", 00:26:04.559 "aliases": [ 00:26:04.559 "0dd889d7-33d3-4936-996b-f104b7310a2d" 00:26:04.559 ], 00:26:04.559 "product_name": "Raid Volume", 00:26:04.559 "block_size": 4096, 00:26:04.559 "num_blocks": 7936, 00:26:04.559 "uuid": "0dd889d7-33d3-4936-996b-f104b7310a2d", 00:26:04.559 "md_size": 32, 00:26:04.559 "md_interleave": false, 00:26:04.559 "dif_type": 0, 00:26:04.559 "assigned_rate_limits": { 00:26:04.559 "rw_ios_per_sec": 0, 00:26:04.559 "rw_mbytes_per_sec": 0, 00:26:04.559 "r_mbytes_per_sec": 0, 00:26:04.559 "w_mbytes_per_sec": 0 00:26:04.559 }, 00:26:04.559 "claimed": false, 00:26:04.559 "zoned": false, 00:26:04.559 "supported_io_types": { 00:26:04.559 "read": true, 00:26:04.559 "write": true, 00:26:04.559 "unmap": false, 00:26:04.559 "flush": false, 00:26:04.559 "reset": true, 00:26:04.559 "nvme_admin": false, 00:26:04.559 "nvme_io": false, 00:26:04.559 "nvme_io_md": false, 00:26:04.559 "write_zeroes": true, 00:26:04.559 "zcopy": false, 00:26:04.559 "get_zone_info": false, 00:26:04.559 "zone_management": false, 00:26:04.559 "zone_append": false, 00:26:04.559 "compare": false, 00:26:04.559 "compare_and_write": false, 00:26:04.559 "abort": false, 00:26:04.559 "seek_hole": false, 00:26:04.559 "seek_data": false, 00:26:04.559 "copy": false, 00:26:04.559 "nvme_iov_md": false 00:26:04.559 }, 00:26:04.560 "memory_domains": [ 00:26:04.560 { 00:26:04.560 "dma_device_id": "system", 00:26:04.560 "dma_device_type": 1 00:26:04.560 }, 00:26:04.560 { 00:26:04.560 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:04.560 "dma_device_type": 2 00:26:04.560 }, 00:26:04.560 { 00:26:04.560 "dma_device_id": "system", 00:26:04.560 "dma_device_type": 1 00:26:04.560 }, 00:26:04.560 { 00:26:04.560 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:04.560 "dma_device_type": 2 00:26:04.560 } 00:26:04.560 ], 00:26:04.560 "driver_specific": { 00:26:04.560 "raid": { 
00:26:04.560 "uuid": "0dd889d7-33d3-4936-996b-f104b7310a2d", 00:26:04.560 "strip_size_kb": 0, 00:26:04.560 "state": "online", 00:26:04.560 "raid_level": "raid1", 00:26:04.560 "superblock": true, 00:26:04.560 "num_base_bdevs": 2, 00:26:04.560 "num_base_bdevs_discovered": 2, 00:26:04.560 "num_base_bdevs_operational": 2, 00:26:04.560 "base_bdevs_list": [ 00:26:04.560 { 00:26:04.560 "name": "pt1", 00:26:04.560 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:04.560 "is_configured": true, 00:26:04.560 "data_offset": 256, 00:26:04.560 "data_size": 7936 00:26:04.560 }, 00:26:04.560 { 00:26:04.560 "name": "pt2", 00:26:04.560 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:04.560 "is_configured": true, 00:26:04.560 "data_offset": 256, 00:26:04.560 "data_size": 7936 00:26:04.560 } 00:26:04.560 ] 00:26:04.560 } 00:26:04.560 } 00:26:04.560 }' 00:26:04.560 08:39:16 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:04.560 08:39:16 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:26:04.560 pt2' 00:26:04.560 08:39:16 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:04.560 08:39:16 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:26:04.560 08:39:16 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:04.560 08:39:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:04.560 "name": "pt1", 00:26:04.560 "aliases": [ 00:26:04.560 "00000000-0000-0000-0000-000000000001" 00:26:04.560 ], 00:26:04.560 "product_name": "passthru", 00:26:04.560 "block_size": 4096, 00:26:04.560 "num_blocks": 8192, 00:26:04.560 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:26:04.560 "md_size": 32, 00:26:04.560 "md_interleave": false, 00:26:04.560 "dif_type": 0, 00:26:04.560 "assigned_rate_limits": { 00:26:04.560 "rw_ios_per_sec": 0, 00:26:04.560 "rw_mbytes_per_sec": 0, 00:26:04.560 "r_mbytes_per_sec": 0, 00:26:04.560 "w_mbytes_per_sec": 0 00:26:04.560 }, 00:26:04.560 "claimed": true, 00:26:04.560 "claim_type": "exclusive_write", 00:26:04.560 "zoned": false, 00:26:04.560 "supported_io_types": { 00:26:04.560 "read": true, 00:26:04.560 "write": true, 00:26:04.560 "unmap": true, 00:26:04.560 "flush": true, 00:26:04.560 "reset": true, 00:26:04.560 "nvme_admin": false, 00:26:04.560 "nvme_io": false, 00:26:04.560 "nvme_io_md": false, 00:26:04.560 "write_zeroes": true, 00:26:04.560 "zcopy": true, 00:26:04.560 "get_zone_info": false, 00:26:04.560 "zone_management": false, 00:26:04.560 "zone_append": false, 00:26:04.560 "compare": false, 00:26:04.560 "compare_and_write": false, 00:26:04.560 "abort": true, 00:26:04.560 "seek_hole": false, 00:26:04.560 "seek_data": false, 00:26:04.560 "copy": true, 00:26:04.560 "nvme_iov_md": false 00:26:04.560 }, 00:26:04.560 "memory_domains": [ 00:26:04.560 { 00:26:04.560 "dma_device_id": "system", 00:26:04.560 "dma_device_type": 1 00:26:04.560 }, 00:26:04.560 { 00:26:04.560 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:04.560 "dma_device_type": 2 00:26:04.560 } 00:26:04.560 ], 00:26:04.560 "driver_specific": { 00:26:04.560 "passthru": { 00:26:04.560 "name": "pt1", 00:26:04.560 "base_bdev_name": "malloc1" 00:26:04.560 } 00:26:04.560 } 00:26:04.560 }' 00:26:04.560 08:39:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:04.560 08:39:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:04.819 08:39:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:04.819 08:39:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 
-- # jq .md_size 00:26:04.819 08:39:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:04.819 08:39:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:26:04.819 08:39:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:04.819 08:39:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:04.819 08:39:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:26:04.819 08:39:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:04.819 08:39:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:04.819 08:39:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:26:04.820 08:39:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:04.820 08:39:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:26:04.820 08:39:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:05.079 08:39:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:05.079 "name": "pt2", 00:26:05.079 "aliases": [ 00:26:05.079 "00000000-0000-0000-0000-000000000002" 00:26:05.079 ], 00:26:05.079 "product_name": "passthru", 00:26:05.079 "block_size": 4096, 00:26:05.079 "num_blocks": 8192, 00:26:05.079 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:05.079 "md_size": 32, 00:26:05.079 "md_interleave": false, 00:26:05.079 "dif_type": 0, 00:26:05.079 "assigned_rate_limits": { 00:26:05.079 "rw_ios_per_sec": 0, 00:26:05.079 "rw_mbytes_per_sec": 0, 00:26:05.079 "r_mbytes_per_sec": 0, 00:26:05.079 
"w_mbytes_per_sec": 0 00:26:05.079 }, 00:26:05.079 "claimed": true, 00:26:05.079 "claim_type": "exclusive_write", 00:26:05.079 "zoned": false, 00:26:05.079 "supported_io_types": { 00:26:05.079 "read": true, 00:26:05.079 "write": true, 00:26:05.079 "unmap": true, 00:26:05.079 "flush": true, 00:26:05.079 "reset": true, 00:26:05.079 "nvme_admin": false, 00:26:05.079 "nvme_io": false, 00:26:05.079 "nvme_io_md": false, 00:26:05.079 "write_zeroes": true, 00:26:05.079 "zcopy": true, 00:26:05.079 "get_zone_info": false, 00:26:05.079 "zone_management": false, 00:26:05.079 "zone_append": false, 00:26:05.079 "compare": false, 00:26:05.079 "compare_and_write": false, 00:26:05.079 "abort": true, 00:26:05.079 "seek_hole": false, 00:26:05.079 "seek_data": false, 00:26:05.079 "copy": true, 00:26:05.079 "nvme_iov_md": false 00:26:05.079 }, 00:26:05.079 "memory_domains": [ 00:26:05.079 { 00:26:05.079 "dma_device_id": "system", 00:26:05.079 "dma_device_type": 1 00:26:05.079 }, 00:26:05.079 { 00:26:05.079 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:05.079 "dma_device_type": 2 00:26:05.079 } 00:26:05.079 ], 00:26:05.079 "driver_specific": { 00:26:05.079 "passthru": { 00:26:05.079 "name": "pt2", 00:26:05.079 "base_bdev_name": "malloc2" 00:26:05.079 } 00:26:05.079 } 00:26:05.079 }' 00:26:05.079 08:39:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:05.079 08:39:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:05.079 08:39:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:05.079 08:39:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:05.079 08:39:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:05.339 08:39:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:26:05.339 08:39:17 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:05.339 08:39:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:05.339 08:39:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:26:05.339 08:39:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:05.339 08:39:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:05.339 08:39:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:26:05.339 08:39:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:05.339 08:39:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:26:05.599 [2024-07-23 08:39:17.900293] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:05.599 08:39:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # '[' 0dd889d7-33d3-4936-996b-f104b7310a2d '!=' 0dd889d7-33d3-4936-996b-f104b7310a2d ']' 00:26:05.599 08:39:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:26:05.599 08:39:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:26:05.599 08:39:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:26:05.599 08:39:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:26:05.599 [2024-07-23 08:39:18.068504] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:26:05.599 08:39:18 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:05.599 08:39:18 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:05.599 08:39:18 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:05.599 08:39:18 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:05.599 08:39:18 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:05.599 08:39:18 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:05.599 08:39:18 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:05.599 08:39:18 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:05.599 08:39:18 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:05.599 08:39:18 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:05.599 08:39:18 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:05.599 08:39:18 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:05.858 08:39:18 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:05.858 "name": "raid_bdev1", 00:26:05.858 "uuid": "0dd889d7-33d3-4936-996b-f104b7310a2d", 00:26:05.858 "strip_size_kb": 0, 00:26:05.858 "state": "online", 00:26:05.858 "raid_level": "raid1", 00:26:05.858 "superblock": true, 00:26:05.858 "num_base_bdevs": 2, 00:26:05.858 "num_base_bdevs_discovered": 1, 00:26:05.858 "num_base_bdevs_operational": 1, 00:26:05.858 
"base_bdevs_list": [ 00:26:05.858 { 00:26:05.858 "name": null, 00:26:05.858 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:05.858 "is_configured": false, 00:26:05.858 "data_offset": 256, 00:26:05.858 "data_size": 7936 00:26:05.858 }, 00:26:05.858 { 00:26:05.858 "name": "pt2", 00:26:05.858 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:05.858 "is_configured": true, 00:26:05.858 "data_offset": 256, 00:26:05.858 "data_size": 7936 00:26:05.858 } 00:26:05.858 ] 00:26:05.858 }' 00:26:05.858 08:39:18 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:05.858 08:39:18 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:06.426 08:39:18 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:06.426 [2024-07-23 08:39:18.858541] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:06.426 [2024-07-23 08:39:18.858569] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:06.426 [2024-07-23 08:39:18.858643] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:06.426 [2024-07-23 08:39:18.858689] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:06.426 [2024-07-23 08:39:18.858701] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036c80 name raid_bdev1, state offline 00:26:06.426 08:39:18 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:26:06.426 08:39:18 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:06.720 08:39:19 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@499 -- # raid_bdev= 00:26:06.720 08:39:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:26:06.720 08:39:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:26:06.720 08:39:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:26:06.720 08:39:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:26:06.720 08:39:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:26:06.720 08:39:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:26:06.720 08:39:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:26:06.721 08:39:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:26:06.721 08:39:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@518 -- # i=1 00:26:06.721 08:39:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:26:06.980 [2024-07-23 08:39:19.391963] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:26:06.980 [2024-07-23 08:39:19.392024] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:06.980 [2024-07-23 08:39:19.392041] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000037280 00:26:06.980 [2024-07-23 08:39:19.392052] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:06.980 [2024-07-23 08:39:19.393808] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev 
registered 00:26:06.980 [2024-07-23 08:39:19.393837] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:26:06.980 [2024-07-23 08:39:19.393886] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:26:06.980 [2024-07-23 08:39:19.393933] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:06.980 [2024-07-23 08:39:19.394052] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000037880 00:26:06.980 [2024-07-23 08:39:19.394063] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:06.980 [2024-07-23 08:39:19.394129] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bf90 00:26:06.980 [2024-07-23 08:39:19.394274] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000037880 00:26:06.980 [2024-07-23 08:39:19.394283] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000037880 00:26:06.980 [2024-07-23 08:39:19.394388] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:06.980 pt2 00:26:06.980 08:39:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:06.980 08:39:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:06.980 08:39:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:06.980 08:39:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:06.980 08:39:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:06.980 08:39:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:06.980 08:39:19 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:06.980 08:39:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:06.980 08:39:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:06.980 08:39:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:06.980 08:39:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:06.980 08:39:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:07.239 08:39:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:07.239 "name": "raid_bdev1", 00:26:07.239 "uuid": "0dd889d7-33d3-4936-996b-f104b7310a2d", 00:26:07.239 "strip_size_kb": 0, 00:26:07.239 "state": "online", 00:26:07.239 "raid_level": "raid1", 00:26:07.239 "superblock": true, 00:26:07.239 "num_base_bdevs": 2, 00:26:07.239 "num_base_bdevs_discovered": 1, 00:26:07.239 "num_base_bdevs_operational": 1, 00:26:07.239 "base_bdevs_list": [ 00:26:07.239 { 00:26:07.239 "name": null, 00:26:07.239 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:07.239 "is_configured": false, 00:26:07.239 "data_offset": 256, 00:26:07.239 "data_size": 7936 00:26:07.239 }, 00:26:07.239 { 00:26:07.239 "name": "pt2", 00:26:07.239 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:07.239 "is_configured": true, 00:26:07.239 "data_offset": 256, 00:26:07.239 "data_size": 7936 00:26:07.239 } 00:26:07.239 ] 00:26:07.239 }' 00:26:07.239 08:39:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:07.239 08:39:19 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:07.807 08:39:20 bdev_raid.raid_superblock_test_md_separate 
-- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:07.807 [2024-07-23 08:39:20.210134] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:07.807 [2024-07-23 08:39:20.210174] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:07.807 [2024-07-23 08:39:20.210240] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:07.807 [2024-07-23 08:39:20.210290] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:07.807 [2024-07-23 08:39:20.210301] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000037880 name raid_bdev1, state offline 00:26:07.807 08:39:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:07.807 08:39:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:26:08.065 08:39:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:26:08.065 08:39:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:26:08.065 08:39:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:26:08.065 08:39:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:26:08.065 [2024-07-23 08:39:20.567045] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:26:08.065 [2024-07-23 08:39:20.567100] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:08.065 [2024-07-23 
08:39:20.567120] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000037b80 00:26:08.065 [2024-07-23 08:39:20.567130] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:08.065 [2024-07-23 08:39:20.569053] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:08.065 [2024-07-23 08:39:20.569081] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:26:08.065 [2024-07-23 08:39:20.569135] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:26:08.065 [2024-07-23 08:39:20.569185] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:26:08.066 [2024-07-23 08:39:20.569356] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:26:08.066 [2024-07-23 08:39:20.569371] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:08.066 [2024-07-23 08:39:20.569392] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000038180 name raid_bdev1, state configuring 00:26:08.066 [2024-07-23 08:39:20.569471] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:08.066 [2024-07-23 08:39:20.569546] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000038480 00:26:08.066 [2024-07-23 08:39:20.569556] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:08.066 [2024-07-23 08:39:20.569633] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c060 00:26:08.066 [2024-07-23 08:39:20.569792] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000038480 00:26:08.066 [2024-07-23 08:39:20.569804] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000038480 00:26:08.066 [2024-07-23 08:39:20.569913] 
bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:08.066 pt1 00:26:08.324 08:39:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:26:08.324 08:39:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:08.324 08:39:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:08.324 08:39:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:08.324 08:39:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:08.324 08:39:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:08.324 08:39:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:08.324 08:39:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:08.324 08:39:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:08.324 08:39:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:08.324 08:39:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:08.324 08:39:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:08.324 08:39:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:08.324 08:39:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:08.324 "name": "raid_bdev1", 00:26:08.324 "uuid": "0dd889d7-33d3-4936-996b-f104b7310a2d", 00:26:08.324 
"strip_size_kb": 0, 00:26:08.324 "state": "online", 00:26:08.324 "raid_level": "raid1", 00:26:08.324 "superblock": true, 00:26:08.324 "num_base_bdevs": 2, 00:26:08.324 "num_base_bdevs_discovered": 1, 00:26:08.324 "num_base_bdevs_operational": 1, 00:26:08.324 "base_bdevs_list": [ 00:26:08.324 { 00:26:08.324 "name": null, 00:26:08.324 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:08.324 "is_configured": false, 00:26:08.324 "data_offset": 256, 00:26:08.324 "data_size": 7936 00:26:08.324 }, 00:26:08.324 { 00:26:08.324 "name": "pt2", 00:26:08.324 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:08.324 "is_configured": true, 00:26:08.324 "data_offset": 256, 00:26:08.324 "data_size": 7936 00:26:08.324 } 00:26:08.324 ] 00:26:08.324 }' 00:26:08.324 08:39:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:08.324 08:39:20 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:08.892 08:39:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:26:08.892 08:39:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:26:09.150 08:39:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:26:09.150 08:39:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:09.150 08:39:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:26:09.150 [2024-07-23 08:39:21.589972] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:09.150 08:39:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # '[' 
0dd889d7-33d3-4936-996b-f104b7310a2d '!=' 0dd889d7-33d3-4936-996b-f104b7310a2d ']' 00:26:09.150 08:39:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@562 -- # killprocess 1573817 00:26:09.150 08:39:21 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@948 -- # '[' -z 1573817 ']' 00:26:09.150 08:39:21 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@952 -- # kill -0 1573817 00:26:09.150 08:39:21 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # uname 00:26:09.150 08:39:21 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:09.150 08:39:21 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1573817 00:26:09.150 08:39:21 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:09.150 08:39:21 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:09.150 08:39:21 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1573817' 00:26:09.150 killing process with pid 1573817 00:26:09.150 08:39:21 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@967 -- # kill 1573817 00:26:09.150 [2024-07-23 08:39:21.655501] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:09.150 [2024-07-23 08:39:21.655594] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:09.150 [2024-07-23 08:39:21.655652] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:09.150 [2024-07-23 08:39:21.655665] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000038480 name raid_bdev1, state offline 00:26:09.150 08:39:21 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@972 -- 
# wait 1573817 00:26:09.409 [2024-07-23 08:39:21.884560] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:10.784 08:39:23 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@564 -- # return 0 00:26:10.784 00:26:10.784 real 0m13.156s 00:26:10.784 user 0m22.826s 00:26:10.784 sys 0m2.026s 00:26:10.784 08:39:23 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:10.784 08:39:23 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:10.784 ************************************ 00:26:10.784 END TEST raid_superblock_test_md_separate 00:26:10.784 ************************************ 00:26:10.784 08:39:23 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:10.784 08:39:23 bdev_raid -- bdev/bdev_raid.sh@907 -- # '[' true = true ']' 00:26:10.784 08:39:23 bdev_raid -- bdev/bdev_raid.sh@908 -- # run_test raid_rebuild_test_sb_md_separate raid_rebuild_test raid1 2 true false true 00:26:10.784 08:39:23 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:26:10.784 08:39:23 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:10.784 08:39:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:10.784 ************************************ 00:26:10.784 START TEST raid_rebuild_test_sb_md_separate 00:26:10.784 ************************************ 00:26:10.784 08:39:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:26:10.784 08:39:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:26:10.784 08:39:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:26:10.784 08:39:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:26:10.784 08:39:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@571 
-- # local background_io=false 00:26:10.784 08:39:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@572 -- # local verify=true 00:26:10.784 08:39:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:26:10.784 08:39:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:10.784 08:39:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:26:10.784 08:39:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:10.784 08:39:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:10.784 08:39:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:26:10.784 08:39:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:10.784 08:39:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:10.784 08:39:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:26:10.784 08:39:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:26:10.784 08:39:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:26:10.784 08:39:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # local strip_size 00:26:10.784 08:39:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@576 -- # local create_arg 00:26:10.784 08:39:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:26:10.784 08:39:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@578 -- # local data_offset 00:26:10.784 08:39:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 
00:26:10.784 08:39:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:26:10.784 08:39:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:26:10.784 08:39:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:26:10.784 08:39:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@596 -- # raid_pid=1576572 00:26:10.784 08:39:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@597 -- # waitforlisten 1576572 /var/tmp/spdk-raid.sock 00:26:10.784 08:39:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:26:10.784 08:39:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 1576572 ']' 00:26:10.784 08:39:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:10.784 08:39:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:10.784 08:39:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:10.784 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:10.784 08:39:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:10.784 08:39:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:10.784 [2024-07-23 08:39:23.296307] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:26:10.784 [2024-07-23 08:39:23.296398] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1576572 ] 00:26:10.784 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:10.784 Zero copy mechanism will not be used. 00:26:11.042 [2024-07-23 08:39:23.415952] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:11.299 [2024-07-23 08:39:23.625921] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:11.556 [2024-07-23 08:39:23.868856] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:11.556 [2024-07-23 08:39:23.868890] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:11.557 08:39:24 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:11.557 08:39:24 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:26:11.557 08:39:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:11.557 08:39:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1_malloc 00:26:11.815 BaseBdev1_malloc 00:26:11.815 08:39:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:12.073 [2024-07-23 08:39:24.435460] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:12.073 [2024-07-23 08:39:24.435519] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:12.073 [2024-07-23 
08:39:24.435546] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034880 00:26:12.073 [2024-07-23 08:39:24.435557] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:12.073 [2024-07-23 08:39:24.437341] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:12.073 [2024-07-23 08:39:24.437374] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:12.073 BaseBdev1 00:26:12.073 08:39:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:12.073 08:39:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2_malloc 00:26:12.331 BaseBdev2_malloc 00:26:12.331 08:39:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:26:12.331 [2024-07-23 08:39:24.828788] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:26:12.331 [2024-07-23 08:39:24.828841] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:12.331 [2024-07-23 08:39:24.828878] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035480 00:26:12.331 [2024-07-23 08:39:24.828890] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:12.331 [2024-07-23 08:39:24.830669] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:12.331 [2024-07-23 08:39:24.830698] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:26:12.331 BaseBdev2 00:26:12.331 08:39:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@606 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b spare_malloc 00:26:12.589 spare_malloc 00:26:12.589 08:39:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:26:12.847 spare_delay 00:26:12.847 08:39:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:13.106 [2024-07-23 08:39:25.385034] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:13.106 [2024-07-23 08:39:25.385085] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:13.106 [2024-07-23 08:39:25.385108] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036680 00:26:13.106 [2024-07-23 08:39:25.385118] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:13.106 [2024-07-23 08:39:25.386878] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:13.106 [2024-07-23 08:39:25.386907] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:13.106 spare 00:26:13.106 08:39:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:26:13.106 [2024-07-23 08:39:25.553517] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:13.106 [2024-07-23 08:39:25.555130] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:13.106 [2024-07-23 08:39:25.555311] 
bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000036c80 00:26:13.106 [2024-07-23 08:39:25.555325] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:13.106 [2024-07-23 08:39:25.555407] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bec0 00:26:13.106 [2024-07-23 08:39:25.555578] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000036c80 00:26:13.106 [2024-07-23 08:39:25.555586] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000036c80 00:26:13.106 [2024-07-23 08:39:25.555704] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:13.106 08:39:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:13.106 08:39:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:13.106 08:39:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:13.106 08:39:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:13.106 08:39:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:13.106 08:39:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:13.106 08:39:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:13.106 08:39:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:13.106 08:39:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:13.106 08:39:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:13.106 08:39:25 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:13.106 08:39:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:13.364 08:39:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:13.364 "name": "raid_bdev1", 00:26:13.364 "uuid": "3b1c94b4-3435-44e3-a79e-9644ee4cf86a", 00:26:13.364 "strip_size_kb": 0, 00:26:13.364 "state": "online", 00:26:13.364 "raid_level": "raid1", 00:26:13.364 "superblock": true, 00:26:13.364 "num_base_bdevs": 2, 00:26:13.364 "num_base_bdevs_discovered": 2, 00:26:13.364 "num_base_bdevs_operational": 2, 00:26:13.364 "base_bdevs_list": [ 00:26:13.364 { 00:26:13.365 "name": "BaseBdev1", 00:26:13.365 "uuid": "4d07e425-1bcf-5261-8d06-0cabb2e3b060", 00:26:13.365 "is_configured": true, 00:26:13.365 "data_offset": 256, 00:26:13.365 "data_size": 7936 00:26:13.365 }, 00:26:13.365 { 00:26:13.365 "name": "BaseBdev2", 00:26:13.365 "uuid": "eb45995f-913f-5f3c-90cc-25a72d6f564f", 00:26:13.365 "is_configured": true, 00:26:13.365 "data_offset": 256, 00:26:13.365 "data_size": 7936 00:26:13.365 } 00:26:13.365 ] 00:26:13.365 }' 00:26:13.365 08:39:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:13.365 08:39:25 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:13.930 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:13.930 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:26:13.930 [2024-07-23 08:39:26.355857] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:13.930 
08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:26:13.930 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:13.930 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:26:14.188 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:26:14.188 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:26:14.188 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:26:14.188 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:26:14.188 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:26:14.188 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:14.188 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:26:14.188 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:14.188 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:26:14.188 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:14.188 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:26:14.188 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:14.188 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:14.188 08:39:26 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:26:14.188 [2024-07-23 08:39:26.688517] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c060 00:26:14.446 /dev/nbd0 00:26:14.446 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:14.446 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:14.446 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:26:14.446 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:26:14.446 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:14.446 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:14.446 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:26:14.446 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:26:14.446 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:14.446 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:14.446 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:14.446 1+0 records in 00:26:14.446 1+0 records out 00:26:14.446 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000228454 s, 17.9 MB/s 00:26:14.446 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:14.446 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:26:14.446 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:14.446 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:14.446 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:26:14.446 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:14.446 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:14.446 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:26:14.446 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:26:14.446 08:39:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:26:15.013 7936+0 records in 00:26:15.013 7936+0 records out 00:26:15.013 32505856 bytes (33 MB, 31 MiB) copied, 0.563352 s, 57.7 MB/s 00:26:15.013 08:39:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:26:15.013 08:39:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:15.013 08:39:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:26:15.013 08:39:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:15.013 08:39:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:26:15.013 08:39:27 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:15.013 08:39:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:15.013 08:39:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:15.013 08:39:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:15.013 08:39:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:15.013 08:39:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:15.013 08:39:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:15.013 08:39:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:15.013 [2024-07-23 08:39:27.527805] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:15.013 08:39:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:26:15.013 08:39:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:26:15.013 08:39:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:26:15.272 [2024-07-23 08:39:27.680296] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:15.272 08:39:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:15.272 08:39:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:15.272 08:39:27 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:15.272 08:39:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:15.272 08:39:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:15.272 08:39:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:15.272 08:39:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:15.272 08:39:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:15.272 08:39:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:15.272 08:39:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:15.272 08:39:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:15.272 08:39:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:15.530 08:39:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:15.530 "name": "raid_bdev1", 00:26:15.530 "uuid": "3b1c94b4-3435-44e3-a79e-9644ee4cf86a", 00:26:15.530 "strip_size_kb": 0, 00:26:15.530 "state": "online", 00:26:15.530 "raid_level": "raid1", 00:26:15.530 "superblock": true, 00:26:15.530 "num_base_bdevs": 2, 00:26:15.530 "num_base_bdevs_discovered": 1, 00:26:15.530 "num_base_bdevs_operational": 1, 00:26:15.530 "base_bdevs_list": [ 00:26:15.530 { 00:26:15.530 "name": null, 00:26:15.530 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:15.530 "is_configured": false, 00:26:15.530 "data_offset": 256, 00:26:15.530 "data_size": 7936 00:26:15.530 }, 00:26:15.530 { 00:26:15.530 "name": "BaseBdev2", 
00:26:15.530 "uuid": "eb45995f-913f-5f3c-90cc-25a72d6f564f", 00:26:15.530 "is_configured": true, 00:26:15.530 "data_offset": 256, 00:26:15.530 "data_size": 7936 00:26:15.530 } 00:26:15.530 ] 00:26:15.530 }' 00:26:15.530 08:39:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:15.530 08:39:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:16.097 08:39:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:16.097 [2024-07-23 08:39:28.522528] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:16.097 [2024-07-23 08:39:28.539789] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0001a2c80 00:26:16.097 [2024-07-23 08:39:28.541457] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:16.097 08:39:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@646 -- # sleep 1 00:26:17.472 08:39:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:17.472 08:39:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:17.472 08:39:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:17.472 08:39:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:17.472 08:39:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:17.472 08:39:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:17.472 08:39:29 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:17.472 08:39:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:17.472 "name": "raid_bdev1", 00:26:17.472 "uuid": "3b1c94b4-3435-44e3-a79e-9644ee4cf86a", 00:26:17.472 "strip_size_kb": 0, 00:26:17.472 "state": "online", 00:26:17.472 "raid_level": "raid1", 00:26:17.472 "superblock": true, 00:26:17.472 "num_base_bdevs": 2, 00:26:17.472 "num_base_bdevs_discovered": 2, 00:26:17.472 "num_base_bdevs_operational": 2, 00:26:17.472 "process": { 00:26:17.472 "type": "rebuild", 00:26:17.472 "target": "spare", 00:26:17.472 "progress": { 00:26:17.472 "blocks": 2816, 00:26:17.472 "percent": 35 00:26:17.472 } 00:26:17.472 }, 00:26:17.472 "base_bdevs_list": [ 00:26:17.472 { 00:26:17.472 "name": "spare", 00:26:17.472 "uuid": "088bc0da-e60a-5638-8519-8b3c5f7669a9", 00:26:17.472 "is_configured": true, 00:26:17.472 "data_offset": 256, 00:26:17.472 "data_size": 7936 00:26:17.472 }, 00:26:17.472 { 00:26:17.472 "name": "BaseBdev2", 00:26:17.472 "uuid": "eb45995f-913f-5f3c-90cc-25a72d6f564f", 00:26:17.472 "is_configured": true, 00:26:17.472 "data_offset": 256, 00:26:17.472 "data_size": 7936 00:26:17.472 } 00:26:17.472 ] 00:26:17.472 }' 00:26:17.472 08:39:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:17.472 08:39:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:17.472 08:39:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:17.472 08:39:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:17.472 08:39:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev spare 00:26:17.472 [2024-07-23 08:39:29.974948] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:17.730 [2024-07-23 08:39:30.054390] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:17.731 [2024-07-23 08:39:30.054450] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:17.731 [2024-07-23 08:39:30.054466] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:17.731 [2024-07-23 08:39:30.054477] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:17.731 08:39:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:17.731 08:39:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:17.731 08:39:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:17.731 08:39:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:17.731 08:39:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:17.731 08:39:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:17.731 08:39:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:17.731 08:39:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:17.731 08:39:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:17.731 08:39:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:17.731 08:39:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:17.731 08:39:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:17.989 08:39:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:17.989 "name": "raid_bdev1", 00:26:17.989 "uuid": "3b1c94b4-3435-44e3-a79e-9644ee4cf86a", 00:26:17.989 "strip_size_kb": 0, 00:26:17.989 "state": "online", 00:26:17.989 "raid_level": "raid1", 00:26:17.989 "superblock": true, 00:26:17.989 "num_base_bdevs": 2, 00:26:17.989 "num_base_bdevs_discovered": 1, 00:26:17.989 "num_base_bdevs_operational": 1, 00:26:17.989 "base_bdevs_list": [ 00:26:17.989 { 00:26:17.989 "name": null, 00:26:17.989 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:17.989 "is_configured": false, 00:26:17.989 "data_offset": 256, 00:26:17.989 "data_size": 7936 00:26:17.989 }, 00:26:17.989 { 00:26:17.989 "name": "BaseBdev2", 00:26:17.989 "uuid": "eb45995f-913f-5f3c-90cc-25a72d6f564f", 00:26:17.989 "is_configured": true, 00:26:17.989 "data_offset": 256, 00:26:17.989 "data_size": 7936 00:26:17.989 } 00:26:17.989 ] 00:26:17.989 }' 00:26:17.989 08:39:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:17.989 08:39:30 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:18.556 08:39:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:18.556 08:39:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:18.556 08:39:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:18.556 08:39:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:18.556 08:39:30 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:18.556 08:39:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:18.556 08:39:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:18.556 08:39:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:18.556 "name": "raid_bdev1", 00:26:18.556 "uuid": "3b1c94b4-3435-44e3-a79e-9644ee4cf86a", 00:26:18.556 "strip_size_kb": 0, 00:26:18.556 "state": "online", 00:26:18.556 "raid_level": "raid1", 00:26:18.556 "superblock": true, 00:26:18.556 "num_base_bdevs": 2, 00:26:18.556 "num_base_bdevs_discovered": 1, 00:26:18.556 "num_base_bdevs_operational": 1, 00:26:18.556 "base_bdevs_list": [ 00:26:18.556 { 00:26:18.556 "name": null, 00:26:18.556 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:18.556 "is_configured": false, 00:26:18.556 "data_offset": 256, 00:26:18.556 "data_size": 7936 00:26:18.556 }, 00:26:18.556 { 00:26:18.556 "name": "BaseBdev2", 00:26:18.556 "uuid": "eb45995f-913f-5f3c-90cc-25a72d6f564f", 00:26:18.556 "is_configured": true, 00:26:18.556 "data_offset": 256, 00:26:18.556 "data_size": 7936 00:26:18.556 } 00:26:18.556 ] 00:26:18.556 }' 00:26:18.556 08:39:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:18.556 08:39:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:18.556 08:39:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:18.556 08:39:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:18.556 08:39:31 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:18.815 [2024-07-23 08:39:31.182552] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:18.815 [2024-07-23 08:39:31.197911] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0001a2d50 00:26:18.815 [2024-07-23 08:39:31.199517] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:18.815 08:39:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@662 -- # sleep 1 00:26:19.750 08:39:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:19.750 08:39:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:19.750 08:39:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:19.750 08:39:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:19.750 08:39:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:19.750 08:39:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:19.750 08:39:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:20.008 08:39:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:20.008 "name": "raid_bdev1", 00:26:20.008 "uuid": "3b1c94b4-3435-44e3-a79e-9644ee4cf86a", 00:26:20.008 "strip_size_kb": 0, 00:26:20.008 "state": "online", 00:26:20.008 "raid_level": "raid1", 00:26:20.008 "superblock": true, 00:26:20.008 "num_base_bdevs": 2, 
00:26:20.008 "num_base_bdevs_discovered": 2, 00:26:20.008 "num_base_bdevs_operational": 2, 00:26:20.008 "process": { 00:26:20.008 "type": "rebuild", 00:26:20.008 "target": "spare", 00:26:20.008 "progress": { 00:26:20.008 "blocks": 2816, 00:26:20.008 "percent": 35 00:26:20.008 } 00:26:20.008 }, 00:26:20.008 "base_bdevs_list": [ 00:26:20.008 { 00:26:20.008 "name": "spare", 00:26:20.008 "uuid": "088bc0da-e60a-5638-8519-8b3c5f7669a9", 00:26:20.008 "is_configured": true, 00:26:20.008 "data_offset": 256, 00:26:20.008 "data_size": 7936 00:26:20.008 }, 00:26:20.008 { 00:26:20.008 "name": "BaseBdev2", 00:26:20.008 "uuid": "eb45995f-913f-5f3c-90cc-25a72d6f564f", 00:26:20.008 "is_configured": true, 00:26:20.008 "data_offset": 256, 00:26:20.008 "data_size": 7936 00:26:20.008 } 00:26:20.008 ] 00:26:20.008 }' 00:26:20.009 08:39:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:20.009 08:39:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:20.009 08:39:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:20.009 08:39:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:20.009 08:39:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:26:20.009 08:39:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:26:20.009 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:26:20.009 08:39:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:26:20.009 08:39:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:26:20.009 08:39:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 
-- # '[' 2 -gt 2 ']' 00:26:20.009 08:39:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@705 -- # local timeout=925 00:26:20.009 08:39:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:20.009 08:39:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:20.009 08:39:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:20.009 08:39:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:20.009 08:39:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:20.009 08:39:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:20.009 08:39:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:20.009 08:39:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:20.267 08:39:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:20.267 "name": "raid_bdev1", 00:26:20.267 "uuid": "3b1c94b4-3435-44e3-a79e-9644ee4cf86a", 00:26:20.267 "strip_size_kb": 0, 00:26:20.267 "state": "online", 00:26:20.267 "raid_level": "raid1", 00:26:20.267 "superblock": true, 00:26:20.267 "num_base_bdevs": 2, 00:26:20.267 "num_base_bdevs_discovered": 2, 00:26:20.267 "num_base_bdevs_operational": 2, 00:26:20.267 "process": { 00:26:20.267 "type": "rebuild", 00:26:20.267 "target": "spare", 00:26:20.267 "progress": { 00:26:20.267 "blocks": 3584, 00:26:20.267 "percent": 45 00:26:20.267 } 00:26:20.267 }, 00:26:20.267 "base_bdevs_list": [ 00:26:20.267 { 00:26:20.267 "name": "spare", 00:26:20.267 "uuid": 
"088bc0da-e60a-5638-8519-8b3c5f7669a9", 00:26:20.267 "is_configured": true, 00:26:20.267 "data_offset": 256, 00:26:20.267 "data_size": 7936 00:26:20.267 }, 00:26:20.267 { 00:26:20.267 "name": "BaseBdev2", 00:26:20.267 "uuid": "eb45995f-913f-5f3c-90cc-25a72d6f564f", 00:26:20.267 "is_configured": true, 00:26:20.267 "data_offset": 256, 00:26:20.267 "data_size": 7936 00:26:20.267 } 00:26:20.267 ] 00:26:20.267 }' 00:26:20.267 08:39:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:20.267 08:39:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:20.267 08:39:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:20.267 08:39:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:20.267 08:39:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:21.642 08:39:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:21.642 08:39:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:21.642 08:39:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:21.642 08:39:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:21.642 08:39:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:21.642 08:39:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:21.642 08:39:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:21.642 08:39:33 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:21.642 08:39:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:21.642 "name": "raid_bdev1", 00:26:21.642 "uuid": "3b1c94b4-3435-44e3-a79e-9644ee4cf86a", 00:26:21.642 "strip_size_kb": 0, 00:26:21.642 "state": "online", 00:26:21.642 "raid_level": "raid1", 00:26:21.642 "superblock": true, 00:26:21.642 "num_base_bdevs": 2, 00:26:21.642 "num_base_bdevs_discovered": 2, 00:26:21.642 "num_base_bdevs_operational": 2, 00:26:21.642 "process": { 00:26:21.642 "type": "rebuild", 00:26:21.642 "target": "spare", 00:26:21.642 "progress": { 00:26:21.642 "blocks": 6656, 00:26:21.642 "percent": 83 00:26:21.642 } 00:26:21.642 }, 00:26:21.642 "base_bdevs_list": [ 00:26:21.642 { 00:26:21.642 "name": "spare", 00:26:21.642 "uuid": "088bc0da-e60a-5638-8519-8b3c5f7669a9", 00:26:21.642 "is_configured": true, 00:26:21.642 "data_offset": 256, 00:26:21.642 "data_size": 7936 00:26:21.642 }, 00:26:21.642 { 00:26:21.642 "name": "BaseBdev2", 00:26:21.642 "uuid": "eb45995f-913f-5f3c-90cc-25a72d6f564f", 00:26:21.642 "is_configured": true, 00:26:21.642 "data_offset": 256, 00:26:21.642 "data_size": 7936 00:26:21.642 } 00:26:21.642 ] 00:26:21.642 }' 00:26:21.642 08:39:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:21.642 08:39:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:21.642 08:39:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:21.642 08:39:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:21.642 08:39:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:21.910 [2024-07-23 08:39:34.324039] bdev_raid.c:2870:raid_bdev_process_thread_run: 
*DEBUG*: process completed on raid_bdev1 00:26:21.910 [2024-07-23 08:39:34.324097] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:26:21.910 [2024-07-23 08:39:34.324180] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:22.521 08:39:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:22.521 08:39:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:22.521 08:39:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:22.521 08:39:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:22.521 08:39:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:22.521 08:39:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:22.521 08:39:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:22.521 08:39:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:22.778 08:39:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:22.778 "name": "raid_bdev1", 00:26:22.778 "uuid": "3b1c94b4-3435-44e3-a79e-9644ee4cf86a", 00:26:22.778 "strip_size_kb": 0, 00:26:22.778 "state": "online", 00:26:22.778 "raid_level": "raid1", 00:26:22.778 "superblock": true, 00:26:22.778 "num_base_bdevs": 2, 00:26:22.778 "num_base_bdevs_discovered": 2, 00:26:22.778 "num_base_bdevs_operational": 2, 00:26:22.778 "base_bdevs_list": [ 00:26:22.778 { 00:26:22.778 "name": "spare", 00:26:22.778 "uuid": "088bc0da-e60a-5638-8519-8b3c5f7669a9", 
00:26:22.778 "is_configured": true, 00:26:22.778 "data_offset": 256, 00:26:22.778 "data_size": 7936 00:26:22.778 }, 00:26:22.778 { 00:26:22.778 "name": "BaseBdev2", 00:26:22.778 "uuid": "eb45995f-913f-5f3c-90cc-25a72d6f564f", 00:26:22.778 "is_configured": true, 00:26:22.778 "data_offset": 256, 00:26:22.778 "data_size": 7936 00:26:22.778 } 00:26:22.778 ] 00:26:22.778 }' 00:26:22.778 08:39:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:22.778 08:39:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:26:22.778 08:39:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:22.778 08:39:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:26:22.778 08:39:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@708 -- # break 00:26:22.778 08:39:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:22.778 08:39:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:22.778 08:39:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:22.778 08:39:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:22.778 08:39:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:22.778 08:39:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:22.778 08:39:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:23.036 08:39:35 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:23.036 "name": "raid_bdev1", 00:26:23.036 "uuid": "3b1c94b4-3435-44e3-a79e-9644ee4cf86a", 00:26:23.036 "strip_size_kb": 0, 00:26:23.036 "state": "online", 00:26:23.036 "raid_level": "raid1", 00:26:23.036 "superblock": true, 00:26:23.036 "num_base_bdevs": 2, 00:26:23.036 "num_base_bdevs_discovered": 2, 00:26:23.036 "num_base_bdevs_operational": 2, 00:26:23.036 "base_bdevs_list": [ 00:26:23.036 { 00:26:23.036 "name": "spare", 00:26:23.036 "uuid": "088bc0da-e60a-5638-8519-8b3c5f7669a9", 00:26:23.036 "is_configured": true, 00:26:23.036 "data_offset": 256, 00:26:23.036 "data_size": 7936 00:26:23.036 }, 00:26:23.036 { 00:26:23.036 "name": "BaseBdev2", 00:26:23.036 "uuid": "eb45995f-913f-5f3c-90cc-25a72d6f564f", 00:26:23.036 "is_configured": true, 00:26:23.036 "data_offset": 256, 00:26:23.036 "data_size": 7936 00:26:23.036 } 00:26:23.036 ] 00:26:23.036 }' 00:26:23.036 08:39:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:23.037 08:39:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:23.037 08:39:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:23.037 08:39:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:23.037 08:39:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:23.037 08:39:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:23.037 08:39:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:23.037 08:39:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:23.037 08:39:35 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:23.037 08:39:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:23.037 08:39:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:23.037 08:39:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:23.037 08:39:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:23.037 08:39:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:23.037 08:39:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:23.037 08:39:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:23.295 08:39:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:23.295 "name": "raid_bdev1", 00:26:23.295 "uuid": "3b1c94b4-3435-44e3-a79e-9644ee4cf86a", 00:26:23.295 "strip_size_kb": 0, 00:26:23.295 "state": "online", 00:26:23.295 "raid_level": "raid1", 00:26:23.295 "superblock": true, 00:26:23.295 "num_base_bdevs": 2, 00:26:23.295 "num_base_bdevs_discovered": 2, 00:26:23.295 "num_base_bdevs_operational": 2, 00:26:23.295 "base_bdevs_list": [ 00:26:23.295 { 00:26:23.295 "name": "spare", 00:26:23.295 "uuid": "088bc0da-e60a-5638-8519-8b3c5f7669a9", 00:26:23.295 "is_configured": true, 00:26:23.295 "data_offset": 256, 00:26:23.295 "data_size": 7936 00:26:23.295 }, 00:26:23.295 { 00:26:23.295 "name": "BaseBdev2", 00:26:23.295 "uuid": "eb45995f-913f-5f3c-90cc-25a72d6f564f", 00:26:23.295 "is_configured": true, 00:26:23.295 "data_offset": 256, 00:26:23.295 "data_size": 7936 00:26:23.295 } 00:26:23.295 ] 
00:26:23.295 }' 00:26:23.295 08:39:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:23.295 08:39:35 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:23.858 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:23.858 [2024-07-23 08:39:36.284967] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:23.858 [2024-07-23 08:39:36.284995] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:23.858 [2024-07-23 08:39:36.285067] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:23.858 [2024-07-23 08:39:36.285134] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:23.858 [2024-07-23 08:39:36.285145] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036c80 name raid_bdev1, state offline 00:26:23.858 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:23.858 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # jq length 00:26:24.143 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:26:24.143 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:26:24.143 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:26:24.143 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:26:24.143 08:39:36 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:24.143 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:26:24.143 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:24.143 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:24.143 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:24.143 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:26:24.143 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:24.143 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:24.143 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:26:24.143 /dev/nbd0 00:26:24.143 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:24.143 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:24.143 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:26:24.143 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:26:24.401 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:24.401 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:24.401 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:26:24.401 08:39:36 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:26:24.401 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:24.401 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:24.401 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:24.401 1+0 records in 00:26:24.401 1+0 records out 00:26:24.401 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00021681 s, 18.9 MB/s 00:26:24.401 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:24.401 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:26:24.401 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:24.401 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:24.401 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:26:24.401 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:24.401 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:24.401 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:26:24.401 /dev/nbd1 00:26:24.401 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:24.401 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate 
-- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:24.401 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:24.401 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:26:24.401 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:24.401 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:24.401 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:24.401 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:26:24.401 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:24.401 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:24.401 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:24.401 1+0 records in 00:26:24.401 1+0 records out 00:26:24.401 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000241399 s, 17.0 MB/s 00:26:24.401 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:24.401 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:26:24.401 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:24.401 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:24.401 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- 
common/autotest_common.sh@887 -- # return 0 00:26:24.401 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:24.401 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:24.401 08:39:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:26:24.660 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:26:24.660 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:24.660 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:24.660 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:24.660 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:26:24.660 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:24.660 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:24.918 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:24.918 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:24.918 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:24.918 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:24.918 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:24.918 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:24.918 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:26:24.918 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:26:24.918 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:24.918 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:25.176 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:25.176 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:25.176 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:25.176 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:25.176 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:25.176 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:25.176 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:26:25.176 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:26:25.176 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:26:25.176 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:25.176 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:25.434 [2024-07-23 08:39:37.784863] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:25.434 [2024-07-23 08:39:37.784916] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:25.434 [2024-07-23 08:39:37.784939] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000038480 00:26:25.434 [2024-07-23 08:39:37.784949] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:25.434 [2024-07-23 08:39:37.786764] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:25.434 [2024-07-23 08:39:37.786792] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:25.434 [2024-07-23 08:39:37.786857] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:25.434 [2024-07-23 08:39:37.786924] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:25.434 [2024-07-23 08:39:37.787095] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:25.434 spare 00:26:25.434 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:25.434 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:25.434 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:25.434 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:25.434 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:25.434 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:25.434 08:39:37 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:25.434 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:25.434 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:25.434 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:25.434 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:25.434 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:25.434 [2024-07-23 08:39:37.887417] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000038a80 00:26:25.434 [2024-07-23 08:39:37.887447] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:25.434 [2024-07-23 08:39:37.887551] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0001c7c00 00:26:25.434 [2024-07-23 08:39:37.887760] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000038a80 00:26:25.434 [2024-07-23 08:39:37.887772] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000038a80 00:26:25.434 [2024-07-23 08:39:37.887892] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:25.692 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:25.692 "name": "raid_bdev1", 00:26:25.692 "uuid": "3b1c94b4-3435-44e3-a79e-9644ee4cf86a", 00:26:25.692 "strip_size_kb": 0, 00:26:25.692 "state": "online", 00:26:25.692 "raid_level": "raid1", 00:26:25.692 "superblock": true, 00:26:25.692 "num_base_bdevs": 2, 00:26:25.692 "num_base_bdevs_discovered": 2, 
00:26:25.692 "num_base_bdevs_operational": 2, 00:26:25.692 "base_bdevs_list": [ 00:26:25.692 { 00:26:25.692 "name": "spare", 00:26:25.692 "uuid": "088bc0da-e60a-5638-8519-8b3c5f7669a9", 00:26:25.692 "is_configured": true, 00:26:25.693 "data_offset": 256, 00:26:25.693 "data_size": 7936 00:26:25.693 }, 00:26:25.693 { 00:26:25.693 "name": "BaseBdev2", 00:26:25.693 "uuid": "eb45995f-913f-5f3c-90cc-25a72d6f564f", 00:26:25.693 "is_configured": true, 00:26:25.693 "data_offset": 256, 00:26:25.693 "data_size": 7936 00:26:25.693 } 00:26:25.693 ] 00:26:25.693 }' 00:26:25.693 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:25.693 08:39:37 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:25.950 08:39:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:25.950 08:39:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:25.950 08:39:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:25.950 08:39:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:25.950 08:39:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:25.950 08:39:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:25.950 08:39:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:26.208 08:39:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:26.208 "name": "raid_bdev1", 00:26:26.208 "uuid": "3b1c94b4-3435-44e3-a79e-9644ee4cf86a", 00:26:26.208 "strip_size_kb": 0, 00:26:26.208 
"state": "online", 00:26:26.208 "raid_level": "raid1", 00:26:26.208 "superblock": true, 00:26:26.208 "num_base_bdevs": 2, 00:26:26.208 "num_base_bdevs_discovered": 2, 00:26:26.208 "num_base_bdevs_operational": 2, 00:26:26.208 "base_bdevs_list": [ 00:26:26.208 { 00:26:26.208 "name": "spare", 00:26:26.208 "uuid": "088bc0da-e60a-5638-8519-8b3c5f7669a9", 00:26:26.208 "is_configured": true, 00:26:26.208 "data_offset": 256, 00:26:26.208 "data_size": 7936 00:26:26.208 }, 00:26:26.208 { 00:26:26.208 "name": "BaseBdev2", 00:26:26.208 "uuid": "eb45995f-913f-5f3c-90cc-25a72d6f564f", 00:26:26.208 "is_configured": true, 00:26:26.208 "data_offset": 256, 00:26:26.208 "data_size": 7936 00:26:26.208 } 00:26:26.208 ] 00:26:26.208 }' 00:26:26.208 08:39:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:26.208 08:39:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:26.208 08:39:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:26.208 08:39:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:26.208 08:39:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:26.208 08:39:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:26:26.467 08:39:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:26:26.467 08:39:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:26.726 [2024-07-23 08:39:39.028249] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 
00:26:26.726 08:39:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:26.726 08:39:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:26.726 08:39:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:26.726 08:39:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:26.726 08:39:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:26.726 08:39:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:26.726 08:39:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:26.726 08:39:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:26.726 08:39:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:26.726 08:39:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:26.726 08:39:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:26.726 08:39:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:26.726 08:39:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:26.726 "name": "raid_bdev1", 00:26:26.726 "uuid": "3b1c94b4-3435-44e3-a79e-9644ee4cf86a", 00:26:26.726 "strip_size_kb": 0, 00:26:26.726 "state": "online", 00:26:26.726 "raid_level": "raid1", 00:26:26.726 "superblock": true, 00:26:26.726 "num_base_bdevs": 2, 00:26:26.726 "num_base_bdevs_discovered": 1, 
00:26:26.726 "num_base_bdevs_operational": 1, 00:26:26.726 "base_bdevs_list": [ 00:26:26.726 { 00:26:26.726 "name": null, 00:26:26.726 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:26.726 "is_configured": false, 00:26:26.726 "data_offset": 256, 00:26:26.726 "data_size": 7936 00:26:26.726 }, 00:26:26.726 { 00:26:26.726 "name": "BaseBdev2", 00:26:26.726 "uuid": "eb45995f-913f-5f3c-90cc-25a72d6f564f", 00:26:26.726 "is_configured": true, 00:26:26.726 "data_offset": 256, 00:26:26.726 "data_size": 7936 00:26:26.726 } 00:26:26.726 ] 00:26:26.726 }' 00:26:26.726 08:39:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:26.726 08:39:39 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:27.293 08:39:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:27.552 [2024-07-23 08:39:39.862464] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:27.552 [2024-07-23 08:39:39.862648] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:26:27.552 [2024-07-23 08:39:39.862667] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:26:27.552 [2024-07-23 08:39:39.862697] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:27.552 [2024-07-23 08:39:39.878878] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0001c7cd0 00:26:27.552 [2024-07-23 08:39:39.880480] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:27.552 08:39:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@755 -- # sleep 1 00:26:28.487 08:39:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:28.487 08:39:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:28.487 08:39:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:28.487 08:39:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:28.487 08:39:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:28.487 08:39:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:28.487 08:39:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:28.747 08:39:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:28.747 "name": "raid_bdev1", 00:26:28.747 "uuid": "3b1c94b4-3435-44e3-a79e-9644ee4cf86a", 00:26:28.747 "strip_size_kb": 0, 00:26:28.747 "state": "online", 00:26:28.747 "raid_level": "raid1", 00:26:28.747 "superblock": true, 00:26:28.747 "num_base_bdevs": 2, 00:26:28.747 "num_base_bdevs_discovered": 2, 00:26:28.747 "num_base_bdevs_operational": 2, 00:26:28.747 "process": { 00:26:28.747 "type": "rebuild", 00:26:28.747 
"target": "spare", 00:26:28.747 "progress": { 00:26:28.747 "blocks": 2816, 00:26:28.747 "percent": 35 00:26:28.747 } 00:26:28.747 }, 00:26:28.747 "base_bdevs_list": [ 00:26:28.747 { 00:26:28.747 "name": "spare", 00:26:28.747 "uuid": "088bc0da-e60a-5638-8519-8b3c5f7669a9", 00:26:28.747 "is_configured": true, 00:26:28.747 "data_offset": 256, 00:26:28.747 "data_size": 7936 00:26:28.747 }, 00:26:28.747 { 00:26:28.747 "name": "BaseBdev2", 00:26:28.747 "uuid": "eb45995f-913f-5f3c-90cc-25a72d6f564f", 00:26:28.747 "is_configured": true, 00:26:28.747 "data_offset": 256, 00:26:28.747 "data_size": 7936 00:26:28.747 } 00:26:28.747 ] 00:26:28.747 }' 00:26:28.747 08:39:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:28.747 08:39:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:28.747 08:39:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:28.747 08:39:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:28.747 08:39:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:29.006 [2024-07-23 08:39:41.302588] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:29.006 [2024-07-23 08:39:41.392365] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:29.006 [2024-07-23 08:39:41.392417] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:29.006 [2024-07-23 08:39:41.392431] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:29.006 [2024-07-23 08:39:41.392440] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 
00:26:29.006 08:39:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:29.006 08:39:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:29.006 08:39:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:29.006 08:39:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:29.006 08:39:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:29.006 08:39:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:29.006 08:39:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:29.006 08:39:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:29.006 08:39:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:29.006 08:39:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:29.006 08:39:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:29.006 08:39:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:29.265 08:39:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:29.265 "name": "raid_bdev1", 00:26:29.265 "uuid": "3b1c94b4-3435-44e3-a79e-9644ee4cf86a", 00:26:29.265 "strip_size_kb": 0, 00:26:29.265 "state": "online", 00:26:29.265 "raid_level": "raid1", 00:26:29.265 "superblock": true, 00:26:29.265 "num_base_bdevs": 2, 00:26:29.265 "num_base_bdevs_discovered": 1, 
00:26:29.265 "num_base_bdevs_operational": 1, 00:26:29.265 "base_bdevs_list": [ 00:26:29.265 { 00:26:29.265 "name": null, 00:26:29.265 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:29.265 "is_configured": false, 00:26:29.265 "data_offset": 256, 00:26:29.265 "data_size": 7936 00:26:29.265 }, 00:26:29.265 { 00:26:29.265 "name": "BaseBdev2", 00:26:29.265 "uuid": "eb45995f-913f-5f3c-90cc-25a72d6f564f", 00:26:29.265 "is_configured": true, 00:26:29.265 "data_offset": 256, 00:26:29.265 "data_size": 7936 00:26:29.265 } 00:26:29.265 ] 00:26:29.265 }' 00:26:29.265 08:39:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:29.265 08:39:41 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:29.833 08:39:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:29.833 [2024-07-23 08:39:42.242067] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:29.833 [2024-07-23 08:39:42.242130] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:29.833 [2024-07-23 08:39:42.242150] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000039080 00:26:29.833 [2024-07-23 08:39:42.242163] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:29.833 [2024-07-23 08:39:42.242428] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:29.833 [2024-07-23 08:39:42.242444] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:29.833 [2024-07-23 08:39:42.242502] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:29.833 [2024-07-23 08:39:42.242516] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) 
smaller than existing raid bdev raid_bdev1 (5) 00:26:29.833 [2024-07-23 08:39:42.242527] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:26:29.833 [2024-07-23 08:39:42.242554] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:29.833 [2024-07-23 08:39:42.259205] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0001c7da0 00:26:29.833 spare 00:26:29.833 [2024-07-23 08:39:42.260831] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:29.833 08:39:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@762 -- # sleep 1 00:26:30.770 08:39:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:30.770 08:39:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:30.770 08:39:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:30.770 08:39:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:30.770 08:39:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:30.770 08:39:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:30.770 08:39:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:31.028 08:39:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:31.028 "name": "raid_bdev1", 00:26:31.028 "uuid": "3b1c94b4-3435-44e3-a79e-9644ee4cf86a", 00:26:31.028 "strip_size_kb": 0, 00:26:31.028 "state": "online", 00:26:31.028 "raid_level": "raid1", 00:26:31.028 
"superblock": true, 00:26:31.028 "num_base_bdevs": 2, 00:26:31.028 "num_base_bdevs_discovered": 2, 00:26:31.028 "num_base_bdevs_operational": 2, 00:26:31.028 "process": { 00:26:31.028 "type": "rebuild", 00:26:31.028 "target": "spare", 00:26:31.028 "progress": { 00:26:31.028 "blocks": 2816, 00:26:31.028 "percent": 35 00:26:31.028 } 00:26:31.028 }, 00:26:31.028 "base_bdevs_list": [ 00:26:31.028 { 00:26:31.028 "name": "spare", 00:26:31.028 "uuid": "088bc0da-e60a-5638-8519-8b3c5f7669a9", 00:26:31.028 "is_configured": true, 00:26:31.028 "data_offset": 256, 00:26:31.028 "data_size": 7936 00:26:31.028 }, 00:26:31.028 { 00:26:31.028 "name": "BaseBdev2", 00:26:31.028 "uuid": "eb45995f-913f-5f3c-90cc-25a72d6f564f", 00:26:31.028 "is_configured": true, 00:26:31.028 "data_offset": 256, 00:26:31.028 "data_size": 7936 00:26:31.028 } 00:26:31.028 ] 00:26:31.028 }' 00:26:31.028 08:39:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:31.028 08:39:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:31.028 08:39:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:31.028 08:39:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:31.028 08:39:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:31.287 [2024-07-23 08:39:43.675156] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:31.287 [2024-07-23 08:39:43.772936] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:31.287 [2024-07-23 08:39:43.772987] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:31.287 [2024-07-23 08:39:43.773005] 
bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:31.287 [2024-07-23 08:39:43.773014] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:31.546 08:39:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:31.546 08:39:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:31.546 08:39:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:31.546 08:39:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:31.546 08:39:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:31.546 08:39:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:31.546 08:39:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:31.546 08:39:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:31.546 08:39:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:31.546 08:39:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:31.546 08:39:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:31.546 08:39:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:31.546 08:39:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:31.546 "name": "raid_bdev1", 00:26:31.546 "uuid": "3b1c94b4-3435-44e3-a79e-9644ee4cf86a", 
00:26:31.546 "strip_size_kb": 0, 00:26:31.546 "state": "online", 00:26:31.546 "raid_level": "raid1", 00:26:31.546 "superblock": true, 00:26:31.546 "num_base_bdevs": 2, 00:26:31.546 "num_base_bdevs_discovered": 1, 00:26:31.546 "num_base_bdevs_operational": 1, 00:26:31.546 "base_bdevs_list": [ 00:26:31.546 { 00:26:31.546 "name": null, 00:26:31.546 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:31.546 "is_configured": false, 00:26:31.546 "data_offset": 256, 00:26:31.546 "data_size": 7936 00:26:31.546 }, 00:26:31.546 { 00:26:31.546 "name": "BaseBdev2", 00:26:31.546 "uuid": "eb45995f-913f-5f3c-90cc-25a72d6f564f", 00:26:31.546 "is_configured": true, 00:26:31.546 "data_offset": 256, 00:26:31.546 "data_size": 7936 00:26:31.546 } 00:26:31.546 ] 00:26:31.546 }' 00:26:31.546 08:39:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:31.546 08:39:43 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:32.114 08:39:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:32.114 08:39:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:32.114 08:39:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:32.114 08:39:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:32.114 08:39:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:32.114 08:39:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:32.114 08:39:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:32.372 08:39:44 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:32.372 "name": "raid_bdev1", 00:26:32.372 "uuid": "3b1c94b4-3435-44e3-a79e-9644ee4cf86a", 00:26:32.372 "strip_size_kb": 0, 00:26:32.372 "state": "online", 00:26:32.372 "raid_level": "raid1", 00:26:32.372 "superblock": true, 00:26:32.372 "num_base_bdevs": 2, 00:26:32.372 "num_base_bdevs_discovered": 1, 00:26:32.372 "num_base_bdevs_operational": 1, 00:26:32.372 "base_bdevs_list": [ 00:26:32.372 { 00:26:32.372 "name": null, 00:26:32.372 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:32.372 "is_configured": false, 00:26:32.372 "data_offset": 256, 00:26:32.372 "data_size": 7936 00:26:32.372 }, 00:26:32.372 { 00:26:32.372 "name": "BaseBdev2", 00:26:32.372 "uuid": "eb45995f-913f-5f3c-90cc-25a72d6f564f", 00:26:32.372 "is_configured": true, 00:26:32.372 "data_offset": 256, 00:26:32.372 "data_size": 7936 00:26:32.372 } 00:26:32.372 ] 00:26:32.372 }' 00:26:32.373 08:39:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:32.373 08:39:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:32.373 08:39:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:32.373 08:39:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:32.373 08:39:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:26:32.631 08:39:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:32.631 [2024-07-23 08:39:45.037225] vbdev_passthru.c: 607:vbdev_passthru_register: 
*NOTICE*: Match on BaseBdev1_malloc 00:26:32.631 [2024-07-23 08:39:45.037276] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:32.631 [2024-07-23 08:39:45.037300] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000039680 00:26:32.631 [2024-07-23 08:39:45.037309] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:32.631 [2024-07-23 08:39:45.037563] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:32.631 [2024-07-23 08:39:45.037578] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:32.631 [2024-07-23 08:39:45.037638] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:26:32.631 [2024-07-23 08:39:45.037653] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:26:32.631 [2024-07-23 08:39:45.037663] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:32.631 BaseBdev1 00:26:32.631 08:39:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@773 -- # sleep 1 00:26:33.568 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:33.568 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:33.568 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:33.568 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:33.568 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:33.568 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 
00:26:33.568 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:33.568 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:33.568 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:33.568 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:33.568 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:33.568 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:33.827 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:33.827 "name": "raid_bdev1", 00:26:33.827 "uuid": "3b1c94b4-3435-44e3-a79e-9644ee4cf86a", 00:26:33.827 "strip_size_kb": 0, 00:26:33.827 "state": "online", 00:26:33.827 "raid_level": "raid1", 00:26:33.827 "superblock": true, 00:26:33.827 "num_base_bdevs": 2, 00:26:33.827 "num_base_bdevs_discovered": 1, 00:26:33.827 "num_base_bdevs_operational": 1, 00:26:33.827 "base_bdevs_list": [ 00:26:33.827 { 00:26:33.827 "name": null, 00:26:33.827 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:33.827 "is_configured": false, 00:26:33.827 "data_offset": 256, 00:26:33.827 "data_size": 7936 00:26:33.827 }, 00:26:33.827 { 00:26:33.827 "name": "BaseBdev2", 00:26:33.827 "uuid": "eb45995f-913f-5f3c-90cc-25a72d6f564f", 00:26:33.827 "is_configured": true, 00:26:33.827 "data_offset": 256, 00:26:33.827 "data_size": 7936 00:26:33.827 } 00:26:33.827 ] 00:26:33.827 }' 00:26:33.827 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:33.827 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- 
# set +x 00:26:34.393 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:34.393 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:34.393 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:34.393 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:34.393 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:34.393 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:34.393 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:34.393 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:34.393 "name": "raid_bdev1", 00:26:34.393 "uuid": "3b1c94b4-3435-44e3-a79e-9644ee4cf86a", 00:26:34.393 "strip_size_kb": 0, 00:26:34.393 "state": "online", 00:26:34.393 "raid_level": "raid1", 00:26:34.393 "superblock": true, 00:26:34.393 "num_base_bdevs": 2, 00:26:34.393 "num_base_bdevs_discovered": 1, 00:26:34.393 "num_base_bdevs_operational": 1, 00:26:34.393 "base_bdevs_list": [ 00:26:34.393 { 00:26:34.393 "name": null, 00:26:34.393 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:34.393 "is_configured": false, 00:26:34.393 "data_offset": 256, 00:26:34.393 "data_size": 7936 00:26:34.394 }, 00:26:34.394 { 00:26:34.394 "name": "BaseBdev2", 00:26:34.394 "uuid": "eb45995f-913f-5f3c-90cc-25a72d6f564f", 00:26:34.394 "is_configured": true, 00:26:34.394 "data_offset": 256, 00:26:34.394 "data_size": 7936 00:26:34.394 } 00:26:34.394 ] 00:26:34.394 }' 00:26:34.394 08:39:46 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:34.652 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:34.652 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:34.652 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:34.652 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:34.652 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:26:34.652 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:34.652 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:34.652 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:34.652 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:34.652 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:34.652 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:34.652 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 
00:26:34.652 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:34.652 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:26:34.652 08:39:46 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:34.652 [2024-07-23 08:39:47.138780] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:34.652 [2024-07-23 08:39:47.138936] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:26:34.652 [2024-07-23 08:39:47.138951] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:34.652 request: 00:26:34.652 { 00:26:34.652 "base_bdev": "BaseBdev1", 00:26:34.652 "raid_bdev": "raid_bdev1", 00:26:34.652 "method": "bdev_raid_add_base_bdev", 00:26:34.652 "req_id": 1 00:26:34.652 } 00:26:34.652 Got JSON-RPC error response 00:26:34.652 response: 00:26:34.652 { 00:26:34.652 "code": -22, 00:26:34.652 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:26:34.652 } 00:26:34.652 08:39:47 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # es=1 00:26:34.652 08:39:47 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:26:34.652 08:39:47 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:26:34.653 08:39:47 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:26:34.653 08:39:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@777 -- # 
sleep 1 00:26:36.029 08:39:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:36.029 08:39:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:36.029 08:39:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:36.029 08:39:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:36.029 08:39:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:36.029 08:39:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:36.029 08:39:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:36.029 08:39:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:36.029 08:39:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:36.029 08:39:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:36.029 08:39:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:36.029 08:39:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:36.029 08:39:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:36.029 "name": "raid_bdev1", 00:26:36.029 "uuid": "3b1c94b4-3435-44e3-a79e-9644ee4cf86a", 00:26:36.029 "strip_size_kb": 0, 00:26:36.029 "state": "online", 00:26:36.029 "raid_level": "raid1", 00:26:36.029 "superblock": true, 00:26:36.029 "num_base_bdevs": 2, 00:26:36.029 "num_base_bdevs_discovered": 1, 
00:26:36.029 "num_base_bdevs_operational": 1, 00:26:36.029 "base_bdevs_list": [ 00:26:36.029 { 00:26:36.029 "name": null, 00:26:36.029 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:36.029 "is_configured": false, 00:26:36.029 "data_offset": 256, 00:26:36.029 "data_size": 7936 00:26:36.029 }, 00:26:36.029 { 00:26:36.029 "name": "BaseBdev2", 00:26:36.029 "uuid": "eb45995f-913f-5f3c-90cc-25a72d6f564f", 00:26:36.029 "is_configured": true, 00:26:36.029 "data_offset": 256, 00:26:36.029 "data_size": 7936 00:26:36.029 } 00:26:36.029 ] 00:26:36.029 }' 00:26:36.029 08:39:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:36.029 08:39:48 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:36.596 08:39:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:36.596 08:39:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:36.596 08:39:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:36.596 08:39:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:36.596 08:39:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:36.596 08:39:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:36.596 08:39:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:36.596 08:39:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:36.596 "name": "raid_bdev1", 00:26:36.596 "uuid": "3b1c94b4-3435-44e3-a79e-9644ee4cf86a", 00:26:36.596 "strip_size_kb": 0, 00:26:36.596 
"state": "online", 00:26:36.596 "raid_level": "raid1", 00:26:36.596 "superblock": true, 00:26:36.596 "num_base_bdevs": 2, 00:26:36.596 "num_base_bdevs_discovered": 1, 00:26:36.596 "num_base_bdevs_operational": 1, 00:26:36.596 "base_bdevs_list": [ 00:26:36.596 { 00:26:36.596 "name": null, 00:26:36.596 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:36.596 "is_configured": false, 00:26:36.596 "data_offset": 256, 00:26:36.596 "data_size": 7936 00:26:36.596 }, 00:26:36.596 { 00:26:36.596 "name": "BaseBdev2", 00:26:36.596 "uuid": "eb45995f-913f-5f3c-90cc-25a72d6f564f", 00:26:36.596 "is_configured": true, 00:26:36.596 "data_offset": 256, 00:26:36.596 "data_size": 7936 00:26:36.596 } 00:26:36.596 ] 00:26:36.596 }' 00:26:36.596 08:39:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:36.596 08:39:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:36.596 08:39:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:36.596 08:39:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:36.596 08:39:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@782 -- # killprocess 1576572 00:26:36.596 08:39:49 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 1576572 ']' 00:26:36.596 08:39:49 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 1576572 00:26:36.596 08:39:49 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:26:36.596 08:39:49 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:36.596 08:39:49 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1576572 00:26:36.865 08:39:49 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:36.865 08:39:49 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:36.865 08:39:49 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1576572' 00:26:36.865 killing process with pid 1576572 00:26:36.865 08:39:49 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 1576572 00:26:36.865 Received shutdown signal, test time was about 60.000000 seconds 00:26:36.865 00:26:36.865 Latency(us) 00:26:36.865 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:36.865 =================================================================================================================== 00:26:36.865 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:26:36.865 [2024-07-23 08:39:49.131241] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:36.865 [2024-07-23 08:39:49.131359] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:36.865 08:39:49 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 1576572 00:26:36.865 [2024-07-23 08:39:49.131407] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:36.865 [2024-07-23 08:39:49.131418] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000038a80 name raid_bdev1, state offline 00:26:37.168 [2024-07-23 08:39:49.468580] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:38.544 08:39:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@784 -- # return 0 00:26:38.544 00:26:38.544 real 0m27.510s 00:26:38.544 user 0m41.111s 00:26:38.544 sys 0m3.353s 00:26:38.544 08:39:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1124 -- # 
xtrace_disable 00:26:38.544 08:39:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:38.544 ************************************ 00:26:38.544 END TEST raid_rebuild_test_sb_md_separate 00:26:38.544 ************************************ 00:26:38.544 08:39:50 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:38.544 08:39:50 bdev_raid -- bdev/bdev_raid.sh@911 -- # base_malloc_params='-m 32 -i' 00:26:38.544 08:39:50 bdev_raid -- bdev/bdev_raid.sh@912 -- # run_test raid_state_function_test_sb_md_interleaved raid_state_function_test raid1 2 true 00:26:38.544 08:39:50 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:26:38.544 08:39:50 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:38.544 08:39:50 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:38.544 ************************************ 00:26:38.544 START TEST raid_state_function_test_sb_md_interleaved 00:26:38.544 ************************************ 00:26:38.544 08:39:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:26:38.544 08:39:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:26:38.544 08:39:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:26:38.544 08:39:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:26:38.544 08:39:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:26:38.544 08:39:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:26:38.544 08:39:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:38.544 08:39:50 bdev_raid.raid_state_function_test_sb_md_interleaved 
-- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:26:38.544 08:39:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:26:38.544 08:39:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:38.544 08:39:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:26:38.544 08:39:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:26:38.544 08:39:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:38.544 08:39:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:26:38.544 08:39:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:26:38.544 08:39:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:26:38.544 08:39:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # local strip_size 00:26:38.545 08:39:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:26:38.545 08:39:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:26:38.545 08:39:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:26:38.545 08:39:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:26:38.545 08:39:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:26:38.545 08:39:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:26:38.545 08:39:50 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@244 -- # raid_pid=1582013 00:26:38.545 08:39:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1582013' 00:26:38.545 Process raid pid: 1582013 00:26:38.545 08:39:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:26:38.545 08:39:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@246 -- # waitforlisten 1582013 /var/tmp/spdk-raid.sock 00:26:38.545 08:39:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 1582013 ']' 00:26:38.545 08:39:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:38.545 08:39:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:38.545 08:39:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:38.545 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:38.545 08:39:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:38.545 08:39:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:38.545 [2024-07-23 08:39:50.863605] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:26:38.545 [2024-07-23 08:39:50.863705] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:26:38.545 [2024-07-23 08:39:50.994206] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:38.804 [2024-07-23 08:39:51.194900] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:39.063 [2024-07-23 08:39:51.452574] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:39.063 [2024-07-23 08:39:51.452603] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:39.322 08:39:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:39.322 08:39:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:26:39.322 08:39:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:26:39.322 [2024-07-23 08:39:51.788174] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:26:39.322 [2024-07-23 08:39:51.788215] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:26:39.322 [2024-07-23 08:39:51.788225] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:26:39.322 [2024-07-23 08:39:51.788236] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:26:39.322 08:39:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:26:39.322 08:39:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:39.322 08:39:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:39.322 08:39:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:39.322 08:39:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:39.322 08:39:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:39.322 08:39:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:39.322 08:39:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:39.322 08:39:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:39.322 08:39:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:39.322 08:39:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:39.322 08:39:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:39.580 08:39:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:39.580 "name": "Existed_Raid", 00:26:39.580 "uuid": "164675cc-03d9-4622-9aea-c7c88e075755", 00:26:39.581 "strip_size_kb": 0, 00:26:39.581 "state": "configuring", 00:26:39.581 "raid_level": "raid1", 00:26:39.581 "superblock": true, 00:26:39.581 "num_base_bdevs": 2, 00:26:39.581 "num_base_bdevs_discovered": 0, 00:26:39.581 "num_base_bdevs_operational": 2, 00:26:39.581 "base_bdevs_list": [ 00:26:39.581 { 
00:26:39.581 "name": "BaseBdev1", 00:26:39.581 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:39.581 "is_configured": false, 00:26:39.581 "data_offset": 0, 00:26:39.581 "data_size": 0 00:26:39.581 }, 00:26:39.581 { 00:26:39.581 "name": "BaseBdev2", 00:26:39.581 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:39.581 "is_configured": false, 00:26:39.581 "data_offset": 0, 00:26:39.581 "data_size": 0 00:26:39.581 } 00:26:39.581 ] 00:26:39.581 }' 00:26:39.581 08:39:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:39.581 08:39:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:40.147 08:39:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:26:40.147 [2024-07-23 08:39:52.606203] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:26:40.147 [2024-07-23 08:39:52.606237] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034580 name Existed_Raid, state configuring 00:26:40.148 08:39:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:26:40.406 [2024-07-23 08:39:52.774662] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:26:40.406 [2024-07-23 08:39:52.774701] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:26:40.406 [2024-07-23 08:39:52.774709] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:26:40.406 [2024-07-23 08:39:52.774719] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:26:40.406 
08:39:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1 00:26:40.665 [2024-07-23 08:39:52.979021] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:40.665 BaseBdev1 00:26:40.665 08:39:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:26:40.665 08:39:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:26:40.665 08:39:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:26:40.665 08:39:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:26:40.666 08:39:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:26:40.666 08:39:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:26:40.666 08:39:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:40.666 08:39:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:26:40.924 [ 00:26:40.924 { 00:26:40.924 "name": "BaseBdev1", 00:26:40.924 "aliases": [ 00:26:40.924 "3da03c1a-79f9-4849-b51e-6ad6c33b046d" 00:26:40.924 ], 00:26:40.924 "product_name": "Malloc disk", 00:26:40.924 "block_size": 4128, 00:26:40.924 "num_blocks": 8192, 00:26:40.924 "uuid": "3da03c1a-79f9-4849-b51e-6ad6c33b046d", 00:26:40.924 "md_size": 32, 00:26:40.924 
"md_interleave": true, 00:26:40.924 "dif_type": 0, 00:26:40.924 "assigned_rate_limits": { 00:26:40.924 "rw_ios_per_sec": 0, 00:26:40.924 "rw_mbytes_per_sec": 0, 00:26:40.924 "r_mbytes_per_sec": 0, 00:26:40.924 "w_mbytes_per_sec": 0 00:26:40.924 }, 00:26:40.924 "claimed": true, 00:26:40.924 "claim_type": "exclusive_write", 00:26:40.924 "zoned": false, 00:26:40.924 "supported_io_types": { 00:26:40.924 "read": true, 00:26:40.924 "write": true, 00:26:40.924 "unmap": true, 00:26:40.924 "flush": true, 00:26:40.924 "reset": true, 00:26:40.924 "nvme_admin": false, 00:26:40.924 "nvme_io": false, 00:26:40.924 "nvme_io_md": false, 00:26:40.924 "write_zeroes": true, 00:26:40.924 "zcopy": true, 00:26:40.924 "get_zone_info": false, 00:26:40.924 "zone_management": false, 00:26:40.924 "zone_append": false, 00:26:40.924 "compare": false, 00:26:40.924 "compare_and_write": false, 00:26:40.924 "abort": true, 00:26:40.924 "seek_hole": false, 00:26:40.924 "seek_data": false, 00:26:40.924 "copy": true, 00:26:40.924 "nvme_iov_md": false 00:26:40.924 }, 00:26:40.924 "memory_domains": [ 00:26:40.924 { 00:26:40.924 "dma_device_id": "system", 00:26:40.924 "dma_device_type": 1 00:26:40.924 }, 00:26:40.924 { 00:26:40.924 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:40.924 "dma_device_type": 2 00:26:40.924 } 00:26:40.924 ], 00:26:40.924 "driver_specific": {} 00:26:40.924 } 00:26:40.924 ] 00:26:40.924 08:39:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:26:40.924 08:39:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:26:40.924 08:39:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:40.924 08:39:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:40.924 08:39:53 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:40.924 08:39:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:40.924 08:39:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:40.924 08:39:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:40.924 08:39:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:40.924 08:39:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:40.924 08:39:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:40.925 08:39:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:40.925 08:39:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:41.182 08:39:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:41.182 "name": "Existed_Raid", 00:26:41.182 "uuid": "bf530cdb-3c64-451f-a3ee-c12db4523c6b", 00:26:41.182 "strip_size_kb": 0, 00:26:41.182 "state": "configuring", 00:26:41.182 "raid_level": "raid1", 00:26:41.182 "superblock": true, 00:26:41.182 "num_base_bdevs": 2, 00:26:41.182 "num_base_bdevs_discovered": 1, 00:26:41.182 "num_base_bdevs_operational": 2, 00:26:41.182 "base_bdevs_list": [ 00:26:41.182 { 00:26:41.182 "name": "BaseBdev1", 00:26:41.182 "uuid": "3da03c1a-79f9-4849-b51e-6ad6c33b046d", 00:26:41.182 "is_configured": true, 00:26:41.182 "data_offset": 256, 00:26:41.182 "data_size": 7936 00:26:41.182 }, 
00:26:41.182 { 00:26:41.182 "name": "BaseBdev2", 00:26:41.182 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:41.183 "is_configured": false, 00:26:41.183 "data_offset": 0, 00:26:41.183 "data_size": 0 00:26:41.183 } 00:26:41.183 ] 00:26:41.183 }' 00:26:41.183 08:39:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:41.183 08:39:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:41.750 08:39:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:26:41.750 [2024-07-23 08:39:54.142175] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:26:41.750 [2024-07-23 08:39:54.142223] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000034880 name Existed_Raid, state configuring 00:26:41.750 08:39:54 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:26:42.008 [2024-07-23 08:39:54.298623] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:42.009 [2024-07-23 08:39:54.300204] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:26:42.009 [2024-07-23 08:39:54.300240] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:26:42.009 08:39:54 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:26:42.009 08:39:54 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:26:42.009 08:39:54 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # 
verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:26:42.009 08:39:54 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:42.009 08:39:54 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:42.009 08:39:54 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:42.009 08:39:54 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:42.009 08:39:54 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:42.009 08:39:54 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:42.009 08:39:54 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:42.009 08:39:54 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:42.009 08:39:54 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:42.009 08:39:54 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:42.009 08:39:54 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:42.009 08:39:54 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:42.009 "name": "Existed_Raid", 00:26:42.009 "uuid": "a8ed82b5-f37a-4d5f-8c93-bec4b0a31625", 00:26:42.009 "strip_size_kb": 0, 00:26:42.009 "state": "configuring", 00:26:42.009 "raid_level": "raid1", 00:26:42.009 "superblock": true, 00:26:42.009 "num_base_bdevs": 2, 
00:26:42.009 "num_base_bdevs_discovered": 1, 00:26:42.009 "num_base_bdevs_operational": 2, 00:26:42.009 "base_bdevs_list": [ 00:26:42.009 { 00:26:42.009 "name": "BaseBdev1", 00:26:42.009 "uuid": "3da03c1a-79f9-4849-b51e-6ad6c33b046d", 00:26:42.009 "is_configured": true, 00:26:42.009 "data_offset": 256, 00:26:42.009 "data_size": 7936 00:26:42.009 }, 00:26:42.009 { 00:26:42.009 "name": "BaseBdev2", 00:26:42.009 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:42.009 "is_configured": false, 00:26:42.009 "data_offset": 0, 00:26:42.009 "data_size": 0 00:26:42.009 } 00:26:42.009 ] 00:26:42.009 }' 00:26:42.009 08:39:54 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:42.009 08:39:54 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:42.575 08:39:54 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2 00:26:42.833 [2024-07-23 08:39:55.152924] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:42.833 [2024-07-23 08:39:55.153107] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000035180 00:26:42.833 [2024-07-23 08:39:55.153121] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:26:42.833 [2024-07-23 08:39:55.153197] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bdf0 00:26:42.833 [2024-07-23 08:39:55.153318] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000035180 00:26:42.833 [2024-07-23 08:39:55.153329] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000035180 00:26:42.833 [2024-07-23 08:39:55.153403] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 
00:26:42.833 BaseBdev2 00:26:42.833 08:39:55 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:26:42.833 08:39:55 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:26:42.833 08:39:55 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:26:42.833 08:39:55 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:26:42.833 08:39:55 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:26:42.833 08:39:55 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:26:42.834 08:39:55 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:42.834 08:39:55 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:26:43.092 [ 00:26:43.092 { 00:26:43.092 "name": "BaseBdev2", 00:26:43.092 "aliases": [ 00:26:43.092 "99e68f85-e626-41aa-a4ee-d49ea22d8e96" 00:26:43.092 ], 00:26:43.092 "product_name": "Malloc disk", 00:26:43.092 "block_size": 4128, 00:26:43.092 "num_blocks": 8192, 00:26:43.092 "uuid": "99e68f85-e626-41aa-a4ee-d49ea22d8e96", 00:26:43.092 "md_size": 32, 00:26:43.092 "md_interleave": true, 00:26:43.092 "dif_type": 0, 00:26:43.092 "assigned_rate_limits": { 00:26:43.092 "rw_ios_per_sec": 0, 00:26:43.092 "rw_mbytes_per_sec": 0, 00:26:43.092 "r_mbytes_per_sec": 0, 00:26:43.092 "w_mbytes_per_sec": 0 00:26:43.092 }, 00:26:43.092 "claimed": true, 00:26:43.092 "claim_type": "exclusive_write", 00:26:43.092 "zoned": false, 00:26:43.092 
"supported_io_types": { 00:26:43.092 "read": true, 00:26:43.092 "write": true, 00:26:43.092 "unmap": true, 00:26:43.092 "flush": true, 00:26:43.092 "reset": true, 00:26:43.092 "nvme_admin": false, 00:26:43.092 "nvme_io": false, 00:26:43.092 "nvme_io_md": false, 00:26:43.092 "write_zeroes": true, 00:26:43.092 "zcopy": true, 00:26:43.092 "get_zone_info": false, 00:26:43.093 "zone_management": false, 00:26:43.093 "zone_append": false, 00:26:43.093 "compare": false, 00:26:43.093 "compare_and_write": false, 00:26:43.093 "abort": true, 00:26:43.093 "seek_hole": false, 00:26:43.093 "seek_data": false, 00:26:43.093 "copy": true, 00:26:43.093 "nvme_iov_md": false 00:26:43.093 }, 00:26:43.093 "memory_domains": [ 00:26:43.093 { 00:26:43.093 "dma_device_id": "system", 00:26:43.093 "dma_device_type": 1 00:26:43.093 }, 00:26:43.093 { 00:26:43.093 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:43.093 "dma_device_type": 2 00:26:43.093 } 00:26:43.093 ], 00:26:43.093 "driver_specific": {} 00:26:43.093 } 00:26:43.093 ] 00:26:43.093 08:39:55 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:26:43.093 08:39:55 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:26:43.093 08:39:55 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:26:43.093 08:39:55 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:26:43.093 08:39:55 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:43.093 08:39:55 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:43.093 08:39:55 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:43.093 08:39:55 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:43.093 08:39:55 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:43.093 08:39:55 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:43.093 08:39:55 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:43.093 08:39:55 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:43.093 08:39:55 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:43.093 08:39:55 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:43.093 08:39:55 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:43.351 08:39:55 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:43.351 "name": "Existed_Raid", 00:26:43.351 "uuid": "a8ed82b5-f37a-4d5f-8c93-bec4b0a31625", 00:26:43.351 "strip_size_kb": 0, 00:26:43.351 "state": "online", 00:26:43.351 "raid_level": "raid1", 00:26:43.351 "superblock": true, 00:26:43.351 "num_base_bdevs": 2, 00:26:43.351 "num_base_bdevs_discovered": 2, 00:26:43.351 "num_base_bdevs_operational": 2, 00:26:43.351 "base_bdevs_list": [ 00:26:43.351 { 00:26:43.351 "name": "BaseBdev1", 00:26:43.351 "uuid": "3da03c1a-79f9-4849-b51e-6ad6c33b046d", 00:26:43.351 "is_configured": true, 00:26:43.352 "data_offset": 256, 00:26:43.352 "data_size": 7936 00:26:43.352 }, 00:26:43.352 { 00:26:43.352 "name": "BaseBdev2", 00:26:43.352 "uuid": "99e68f85-e626-41aa-a4ee-d49ea22d8e96", 00:26:43.352 "is_configured": 
true, 00:26:43.352 "data_offset": 256, 00:26:43.352 "data_size": 7936 00:26:43.352 } 00:26:43.352 ] 00:26:43.352 }' 00:26:43.352 08:39:55 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:43.352 08:39:55 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:43.918 08:39:56 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:26:43.918 08:39:56 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:26:43.918 08:39:56 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:43.918 08:39:56 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:43.918 08:39:56 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:43.918 08:39:56 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:26:43.918 08:39:56 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:26:43.918 08:39:56 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:43.918 [2024-07-23 08:39:56.320345] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:43.918 08:39:56 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:43.918 "name": "Existed_Raid", 00:26:43.918 "aliases": [ 00:26:43.918 "a8ed82b5-f37a-4d5f-8c93-bec4b0a31625" 00:26:43.918 ], 00:26:43.918 "product_name": "Raid Volume", 00:26:43.918 "block_size": 4128, 00:26:43.918 "num_blocks": 7936, 00:26:43.918 "uuid": 
"a8ed82b5-f37a-4d5f-8c93-bec4b0a31625", 00:26:43.918 "md_size": 32, 00:26:43.918 "md_interleave": true, 00:26:43.918 "dif_type": 0, 00:26:43.918 "assigned_rate_limits": { 00:26:43.918 "rw_ios_per_sec": 0, 00:26:43.918 "rw_mbytes_per_sec": 0, 00:26:43.918 "r_mbytes_per_sec": 0, 00:26:43.918 "w_mbytes_per_sec": 0 00:26:43.918 }, 00:26:43.918 "claimed": false, 00:26:43.918 "zoned": false, 00:26:43.918 "supported_io_types": { 00:26:43.918 "read": true, 00:26:43.918 "write": true, 00:26:43.918 "unmap": false, 00:26:43.918 "flush": false, 00:26:43.918 "reset": true, 00:26:43.918 "nvme_admin": false, 00:26:43.918 "nvme_io": false, 00:26:43.918 "nvme_io_md": false, 00:26:43.918 "write_zeroes": true, 00:26:43.919 "zcopy": false, 00:26:43.919 "get_zone_info": false, 00:26:43.919 "zone_management": false, 00:26:43.919 "zone_append": false, 00:26:43.919 "compare": false, 00:26:43.919 "compare_and_write": false, 00:26:43.919 "abort": false, 00:26:43.919 "seek_hole": false, 00:26:43.919 "seek_data": false, 00:26:43.919 "copy": false, 00:26:43.919 "nvme_iov_md": false 00:26:43.919 }, 00:26:43.919 "memory_domains": [ 00:26:43.919 { 00:26:43.919 "dma_device_id": "system", 00:26:43.919 "dma_device_type": 1 00:26:43.919 }, 00:26:43.919 { 00:26:43.919 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:43.919 "dma_device_type": 2 00:26:43.919 }, 00:26:43.919 { 00:26:43.919 "dma_device_id": "system", 00:26:43.919 "dma_device_type": 1 00:26:43.919 }, 00:26:43.919 { 00:26:43.919 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:43.919 "dma_device_type": 2 00:26:43.919 } 00:26:43.919 ], 00:26:43.919 "driver_specific": { 00:26:43.919 "raid": { 00:26:43.919 "uuid": "a8ed82b5-f37a-4d5f-8c93-bec4b0a31625", 00:26:43.919 "strip_size_kb": 0, 00:26:43.919 "state": "online", 00:26:43.919 "raid_level": "raid1", 00:26:43.919 "superblock": true, 00:26:43.919 "num_base_bdevs": 2, 00:26:43.919 "num_base_bdevs_discovered": 2, 00:26:43.919 "num_base_bdevs_operational": 2, 00:26:43.919 "base_bdevs_list": [ 
00:26:43.919 { 00:26:43.919 "name": "BaseBdev1", 00:26:43.919 "uuid": "3da03c1a-79f9-4849-b51e-6ad6c33b046d", 00:26:43.919 "is_configured": true, 00:26:43.919 "data_offset": 256, 00:26:43.919 "data_size": 7936 00:26:43.919 }, 00:26:43.919 { 00:26:43.919 "name": "BaseBdev2", 00:26:43.919 "uuid": "99e68f85-e626-41aa-a4ee-d49ea22d8e96", 00:26:43.919 "is_configured": true, 00:26:43.919 "data_offset": 256, 00:26:43.919 "data_size": 7936 00:26:43.919 } 00:26:43.919 ] 00:26:43.919 } 00:26:43.919 } 00:26:43.919 }' 00:26:43.919 08:39:56 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:43.919 08:39:56 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:26:43.919 BaseBdev2' 00:26:43.919 08:39:56 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:43.919 08:39:56 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:26:43.919 08:39:56 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:44.177 08:39:56 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:44.177 "name": "BaseBdev1", 00:26:44.177 "aliases": [ 00:26:44.177 "3da03c1a-79f9-4849-b51e-6ad6c33b046d" 00:26:44.177 ], 00:26:44.177 "product_name": "Malloc disk", 00:26:44.177 "block_size": 4128, 00:26:44.177 "num_blocks": 8192, 00:26:44.177 "uuid": "3da03c1a-79f9-4849-b51e-6ad6c33b046d", 00:26:44.177 "md_size": 32, 00:26:44.177 "md_interleave": true, 00:26:44.177 "dif_type": 0, 00:26:44.177 "assigned_rate_limits": { 00:26:44.177 "rw_ios_per_sec": 0, 00:26:44.177 "rw_mbytes_per_sec": 0, 00:26:44.177 "r_mbytes_per_sec": 0, 
00:26:44.177 "w_mbytes_per_sec": 0 00:26:44.177 }, 00:26:44.177 "claimed": true, 00:26:44.177 "claim_type": "exclusive_write", 00:26:44.177 "zoned": false, 00:26:44.177 "supported_io_types": { 00:26:44.177 "read": true, 00:26:44.177 "write": true, 00:26:44.177 "unmap": true, 00:26:44.177 "flush": true, 00:26:44.177 "reset": true, 00:26:44.177 "nvme_admin": false, 00:26:44.177 "nvme_io": false, 00:26:44.178 "nvme_io_md": false, 00:26:44.178 "write_zeroes": true, 00:26:44.178 "zcopy": true, 00:26:44.178 "get_zone_info": false, 00:26:44.178 "zone_management": false, 00:26:44.178 "zone_append": false, 00:26:44.178 "compare": false, 00:26:44.178 "compare_and_write": false, 00:26:44.178 "abort": true, 00:26:44.178 "seek_hole": false, 00:26:44.178 "seek_data": false, 00:26:44.178 "copy": true, 00:26:44.178 "nvme_iov_md": false 00:26:44.178 }, 00:26:44.178 "memory_domains": [ 00:26:44.178 { 00:26:44.178 "dma_device_id": "system", 00:26:44.178 "dma_device_type": 1 00:26:44.178 }, 00:26:44.178 { 00:26:44.178 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:44.178 "dma_device_type": 2 00:26:44.178 } 00:26:44.178 ], 00:26:44.178 "driver_specific": {} 00:26:44.178 }' 00:26:44.178 08:39:56 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:44.178 08:39:56 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:44.178 08:39:56 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:26:44.178 08:39:56 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:44.178 08:39:56 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:44.436 08:39:56 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:26:44.436 08:39:56 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:26:44.436 08:39:56 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:44.436 08:39:56 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:26:44.436 08:39:56 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:44.436 08:39:56 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:44.436 08:39:56 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:26:44.436 08:39:56 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:44.436 08:39:56 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:26:44.436 08:39:56 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:44.694 08:39:57 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:44.694 "name": "BaseBdev2", 00:26:44.694 "aliases": [ 00:26:44.695 "99e68f85-e626-41aa-a4ee-d49ea22d8e96" 00:26:44.695 ], 00:26:44.695 "product_name": "Malloc disk", 00:26:44.695 "block_size": 4128, 00:26:44.695 "num_blocks": 8192, 00:26:44.695 "uuid": "99e68f85-e626-41aa-a4ee-d49ea22d8e96", 00:26:44.695 "md_size": 32, 00:26:44.695 "md_interleave": true, 00:26:44.695 "dif_type": 0, 00:26:44.695 "assigned_rate_limits": { 00:26:44.695 "rw_ios_per_sec": 0, 00:26:44.695 "rw_mbytes_per_sec": 0, 00:26:44.695 "r_mbytes_per_sec": 0, 00:26:44.695 "w_mbytes_per_sec": 0 00:26:44.695 }, 00:26:44.695 "claimed": true, 00:26:44.695 "claim_type": "exclusive_write", 00:26:44.695 "zoned": false, 00:26:44.695 "supported_io_types": { 00:26:44.695 "read": true, 00:26:44.695 "write": true, 
00:26:44.695 "unmap": true, 00:26:44.695 "flush": true, 00:26:44.695 "reset": true, 00:26:44.695 "nvme_admin": false, 00:26:44.695 "nvme_io": false, 00:26:44.695 "nvme_io_md": false, 00:26:44.695 "write_zeroes": true, 00:26:44.695 "zcopy": true, 00:26:44.695 "get_zone_info": false, 00:26:44.695 "zone_management": false, 00:26:44.695 "zone_append": false, 00:26:44.695 "compare": false, 00:26:44.695 "compare_and_write": false, 00:26:44.695 "abort": true, 00:26:44.695 "seek_hole": false, 00:26:44.695 "seek_data": false, 00:26:44.695 "copy": true, 00:26:44.695 "nvme_iov_md": false 00:26:44.695 }, 00:26:44.695 "memory_domains": [ 00:26:44.695 { 00:26:44.695 "dma_device_id": "system", 00:26:44.695 "dma_device_type": 1 00:26:44.695 }, 00:26:44.695 { 00:26:44.695 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:44.695 "dma_device_type": 2 00:26:44.695 } 00:26:44.695 ], 00:26:44.695 "driver_specific": {} 00:26:44.695 }' 00:26:44.695 08:39:57 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:44.695 08:39:57 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:44.695 08:39:57 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:26:44.695 08:39:57 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:44.695 08:39:57 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:44.695 08:39:57 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:26:44.695 08:39:57 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:44.953 08:39:57 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:44.953 08:39:57 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ 
true == true ]] 00:26:44.953 08:39:57 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:44.953 08:39:57 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:44.953 08:39:57 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:26:44.953 08:39:57 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:26:45.212 [2024-07-23 08:39:57.523311] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:45.212 08:39:57 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@275 -- # local expected_state 00:26:45.212 08:39:57 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:26:45.212 08:39:57 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:26:45.212 08:39:57 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:26:45.212 08:39:57 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:26:45.212 08:39:57 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:26:45.212 08:39:57 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:45.212 08:39:57 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:45.212 08:39:57 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:45.212 08:39:57 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:26:45.212 08:39:57 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:45.212 08:39:57 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:45.212 08:39:57 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:45.212 08:39:57 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:45.212 08:39:57 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:45.212 08:39:57 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:45.212 08:39:57 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:45.471 08:39:57 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:45.471 "name": "Existed_Raid", 00:26:45.471 "uuid": "a8ed82b5-f37a-4d5f-8c93-bec4b0a31625", 00:26:45.471 "strip_size_kb": 0, 00:26:45.471 "state": "online", 00:26:45.471 "raid_level": "raid1", 00:26:45.471 "superblock": true, 00:26:45.471 "num_base_bdevs": 2, 00:26:45.471 "num_base_bdevs_discovered": 1, 00:26:45.471 "num_base_bdevs_operational": 1, 00:26:45.471 "base_bdevs_list": [ 00:26:45.471 { 00:26:45.471 "name": null, 00:26:45.471 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:45.471 "is_configured": false, 00:26:45.471 "data_offset": 256, 00:26:45.471 "data_size": 7936 00:26:45.471 }, 00:26:45.471 { 00:26:45.471 "name": "BaseBdev2", 00:26:45.471 "uuid": "99e68f85-e626-41aa-a4ee-d49ea22d8e96", 00:26:45.471 "is_configured": true, 00:26:45.471 "data_offset": 256, 00:26:45.471 "data_size": 7936 00:26:45.471 } 00:26:45.471 ] 
00:26:45.471 }' 00:26:45.471 08:39:57 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:45.471 08:39:57 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:45.729 08:39:58 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:26:45.729 08:39:58 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:26:45.729 08:39:58 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:26:45.729 08:39:58 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:45.988 08:39:58 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:26:45.988 08:39:58 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:26:45.988 08:39:58 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:26:46.247 [2024-07-23 08:39:58.543220] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:26:46.247 [2024-07-23 08:39:58.543327] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:46.247 [2024-07-23 08:39:58.641821] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:46.247 [2024-07-23 08:39:58.641865] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:46.247 [2024-07-23 08:39:58.641878] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000035180 name Existed_Raid, 
state offline 00:26:46.247 08:39:58 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:26:46.247 08:39:58 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:26:46.247 08:39:58 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:46.247 08:39:58 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:26:46.507 08:39:58 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:26:46.507 08:39:58 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:26:46.507 08:39:58 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:26:46.507 08:39:58 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@341 -- # killprocess 1582013 00:26:46.507 08:39:58 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 1582013 ']' 00:26:46.507 08:39:58 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 1582013 00:26:46.507 08:39:58 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:26:46.507 08:39:58 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:46.507 08:39:58 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1582013 00:26:46.507 08:39:58 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:46.507 08:39:58 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:46.507 08:39:58 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1582013' 00:26:46.507 killing process with pid 1582013 00:26:46.507 08:39:58 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 1582013 00:26:46.507 [2024-07-23 08:39:58.872576] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:46.507 08:39:58 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@972 -- # wait 1582013 00:26:46.507 [2024-07-23 08:39:58.889210] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:47.884 08:40:00 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@343 -- # return 0 00:26:47.884 00:26:47.884 real 0m9.406s 00:26:47.884 user 0m15.692s 00:26:47.884 sys 0m1.451s 00:26:47.884 08:40:00 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:47.884 08:40:00 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:47.884 ************************************ 00:26:47.884 END TEST raid_state_function_test_sb_md_interleaved 00:26:47.884 ************************************ 00:26:47.884 08:40:00 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:47.884 08:40:00 bdev_raid -- bdev/bdev_raid.sh@913 -- # run_test raid_superblock_test_md_interleaved raid_superblock_test raid1 2 00:26:47.884 08:40:00 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:26:47.884 08:40:00 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:47.884 08:40:00 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:47.884 ************************************ 00:26:47.884 START TEST raid_superblock_test_md_interleaved 00:26:47.884 ************************************ 00:26:47.884 08:40:00 
bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:26:47.884 08:40:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:26:47.884 08:40:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:26:47.884 08:40:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:26:47.884 08:40:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:26:47.884 08:40:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:26:47.884 08:40:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:26:47.884 08:40:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:26:47.884 08:40:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:26:47.884 08:40:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:26:47.884 08:40:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@398 -- # local strip_size 00:26:47.884 08:40:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:26:47.884 08:40:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:26:47.884 08:40:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:26:47.884 08:40:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:26:47.884 08:40:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:26:47.884 08:40:00 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@411 -- # raid_pid=1583899 00:26:47.884 08:40:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@412 -- # waitforlisten 1583899 /var/tmp/spdk-raid.sock 00:26:47.884 08:40:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:26:47.884 08:40:00 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 1583899 ']' 00:26:47.884 08:40:00 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:47.884 08:40:00 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:47.884 08:40:00 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:47.884 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:47.884 08:40:00 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:47.884 08:40:00 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:47.884 [2024-07-23 08:40:00.331408] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:26:47.884 [2024-07-23 08:40:00.331498] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1583899 ] 00:26:48.143 [2024-07-23 08:40:00.461872] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:48.403 [2024-07-23 08:40:00.684252] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:48.662 [2024-07-23 08:40:00.939846] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:48.662 [2024-07-23 08:40:00.939878] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:48.662 08:40:01 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:48.662 08:40:01 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:26:48.662 08:40:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:26:48.662 08:40:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:26:48.662 08:40:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:26:48.662 08:40:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:26:48.662 08:40:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:26:48.662 08:40:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:26:48.662 08:40:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:26:48.662 08:40:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 
00:26:48.662 08:40:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc1 00:26:48.921 malloc1 00:26:48.921 08:40:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:26:49.180 [2024-07-23 08:40:01.476898] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:26:49.180 [2024-07-23 08:40:01.476954] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:49.180 [2024-07-23 08:40:01.476996] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034880 00:26:49.180 [2024-07-23 08:40:01.477006] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:49.180 [2024-07-23 08:40:01.478758] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:49.180 [2024-07-23 08:40:01.478785] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:26:49.180 pt1 00:26:49.180 08:40:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:26:49.180 08:40:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:26:49.180 08:40:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:26:49.180 08:40:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:26:49.180 08:40:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:26:49.180 08:40:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # 
base_bdevs_malloc+=($bdev_malloc) 00:26:49.180 08:40:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:26:49.180 08:40:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:26:49.180 08:40:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc2 00:26:49.180 malloc2 00:26:49.439 08:40:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:26:49.439 [2024-07-23 08:40:01.868324] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:26:49.439 [2024-07-23 08:40:01.868380] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:49.439 [2024-07-23 08:40:01.868400] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035480 00:26:49.439 [2024-07-23 08:40:01.868409] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:49.439 [2024-07-23 08:40:01.870204] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:49.439 [2024-07-23 08:40:01.870231] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:26:49.439 pt2 00:26:49.439 08:40:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:26:49.439 08:40:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:26:49.439 08:40:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r 
raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:26:49.699 [2024-07-23 08:40:02.028766] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:26:49.699 [2024-07-23 08:40:02.030414] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:49.699 [2024-07-23 08:40:02.030654] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000035a80 00:26:49.699 [2024-07-23 08:40:02.030672] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:26:49.699 [2024-07-23 08:40:02.030762] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bdf0 00:26:49.699 [2024-07-23 08:40:02.030867] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000035a80 00:26:49.699 [2024-07-23 08:40:02.030878] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000035a80 00:26:49.699 [2024-07-23 08:40:02.030967] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:49.699 08:40:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:49.699 08:40:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:49.699 08:40:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:49.699 08:40:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:49.699 08:40:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:49.699 08:40:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:49.699 08:40:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:49.699 08:40:02 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:49.699 08:40:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:49.699 08:40:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:49.699 08:40:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:49.699 08:40:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:49.699 08:40:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:49.699 "name": "raid_bdev1", 00:26:49.699 "uuid": "5edf3b90-0e79-4dda-acaf-b3a1a236aff4", 00:26:49.699 "strip_size_kb": 0, 00:26:49.699 "state": "online", 00:26:49.699 "raid_level": "raid1", 00:26:49.699 "superblock": true, 00:26:49.699 "num_base_bdevs": 2, 00:26:49.699 "num_base_bdevs_discovered": 2, 00:26:49.699 "num_base_bdevs_operational": 2, 00:26:49.699 "base_bdevs_list": [ 00:26:49.699 { 00:26:49.699 "name": "pt1", 00:26:49.699 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:49.699 "is_configured": true, 00:26:49.699 "data_offset": 256, 00:26:49.699 "data_size": 7936 00:26:49.699 }, 00:26:49.699 { 00:26:49.699 "name": "pt2", 00:26:49.699 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:49.699 "is_configured": true, 00:26:49.699 "data_offset": 256, 00:26:49.699 "data_size": 7936 00:26:49.699 } 00:26:49.699 ] 00:26:49.699 }' 00:26:49.699 08:40:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:49.699 08:40:02 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:50.266 08:40:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # 
verify_raid_bdev_properties raid_bdev1 00:26:50.266 08:40:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:26:50.266 08:40:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:50.266 08:40:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:50.266 08:40:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:50.266 08:40:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:26:50.266 08:40:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:50.266 08:40:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:50.524 [2024-07-23 08:40:02.843119] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:50.524 08:40:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:50.524 "name": "raid_bdev1", 00:26:50.524 "aliases": [ 00:26:50.524 "5edf3b90-0e79-4dda-acaf-b3a1a236aff4" 00:26:50.524 ], 00:26:50.524 "product_name": "Raid Volume", 00:26:50.524 "block_size": 4128, 00:26:50.524 "num_blocks": 7936, 00:26:50.525 "uuid": "5edf3b90-0e79-4dda-acaf-b3a1a236aff4", 00:26:50.525 "md_size": 32, 00:26:50.525 "md_interleave": true, 00:26:50.525 "dif_type": 0, 00:26:50.525 "assigned_rate_limits": { 00:26:50.525 "rw_ios_per_sec": 0, 00:26:50.525 "rw_mbytes_per_sec": 0, 00:26:50.525 "r_mbytes_per_sec": 0, 00:26:50.525 "w_mbytes_per_sec": 0 00:26:50.525 }, 00:26:50.525 "claimed": false, 00:26:50.525 "zoned": false, 00:26:50.525 "supported_io_types": { 00:26:50.525 "read": true, 00:26:50.525 "write": true, 00:26:50.525 "unmap": false, 00:26:50.525 "flush": false, 
00:26:50.525 "reset": true, 00:26:50.525 "nvme_admin": false, 00:26:50.525 "nvme_io": false, 00:26:50.525 "nvme_io_md": false, 00:26:50.525 "write_zeroes": true, 00:26:50.525 "zcopy": false, 00:26:50.525 "get_zone_info": false, 00:26:50.525 "zone_management": false, 00:26:50.525 "zone_append": false, 00:26:50.525 "compare": false, 00:26:50.525 "compare_and_write": false, 00:26:50.525 "abort": false, 00:26:50.525 "seek_hole": false, 00:26:50.525 "seek_data": false, 00:26:50.525 "copy": false, 00:26:50.525 "nvme_iov_md": false 00:26:50.525 }, 00:26:50.525 "memory_domains": [ 00:26:50.525 { 00:26:50.525 "dma_device_id": "system", 00:26:50.525 "dma_device_type": 1 00:26:50.525 }, 00:26:50.525 { 00:26:50.525 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:50.525 "dma_device_type": 2 00:26:50.525 }, 00:26:50.525 { 00:26:50.525 "dma_device_id": "system", 00:26:50.525 "dma_device_type": 1 00:26:50.525 }, 00:26:50.525 { 00:26:50.525 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:50.525 "dma_device_type": 2 00:26:50.525 } 00:26:50.525 ], 00:26:50.525 "driver_specific": { 00:26:50.525 "raid": { 00:26:50.525 "uuid": "5edf3b90-0e79-4dda-acaf-b3a1a236aff4", 00:26:50.525 "strip_size_kb": 0, 00:26:50.525 "state": "online", 00:26:50.525 "raid_level": "raid1", 00:26:50.525 "superblock": true, 00:26:50.525 "num_base_bdevs": 2, 00:26:50.525 "num_base_bdevs_discovered": 2, 00:26:50.525 "num_base_bdevs_operational": 2, 00:26:50.525 "base_bdevs_list": [ 00:26:50.525 { 00:26:50.525 "name": "pt1", 00:26:50.525 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:50.525 "is_configured": true, 00:26:50.525 "data_offset": 256, 00:26:50.525 "data_size": 7936 00:26:50.525 }, 00:26:50.525 { 00:26:50.525 "name": "pt2", 00:26:50.525 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:50.525 "is_configured": true, 00:26:50.525 "data_offset": 256, 00:26:50.525 "data_size": 7936 00:26:50.525 } 00:26:50.525 ] 00:26:50.525 } 00:26:50.525 } 00:26:50.525 }' 00:26:50.525 08:40:02 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:50.525 08:40:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:26:50.525 pt2' 00:26:50.525 08:40:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:50.525 08:40:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:26:50.525 08:40:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:50.783 08:40:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:50.783 "name": "pt1", 00:26:50.783 "aliases": [ 00:26:50.783 "00000000-0000-0000-0000-000000000001" 00:26:50.783 ], 00:26:50.783 "product_name": "passthru", 00:26:50.783 "block_size": 4128, 00:26:50.783 "num_blocks": 8192, 00:26:50.783 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:50.783 "md_size": 32, 00:26:50.783 "md_interleave": true, 00:26:50.783 "dif_type": 0, 00:26:50.783 "assigned_rate_limits": { 00:26:50.784 "rw_ios_per_sec": 0, 00:26:50.784 "rw_mbytes_per_sec": 0, 00:26:50.784 "r_mbytes_per_sec": 0, 00:26:50.784 "w_mbytes_per_sec": 0 00:26:50.784 }, 00:26:50.784 "claimed": true, 00:26:50.784 "claim_type": "exclusive_write", 00:26:50.784 "zoned": false, 00:26:50.784 "supported_io_types": { 00:26:50.784 "read": true, 00:26:50.784 "write": true, 00:26:50.784 "unmap": true, 00:26:50.784 "flush": true, 00:26:50.784 "reset": true, 00:26:50.784 "nvme_admin": false, 00:26:50.784 "nvme_io": false, 00:26:50.784 "nvme_io_md": false, 00:26:50.784 "write_zeroes": true, 00:26:50.784 "zcopy": true, 00:26:50.784 "get_zone_info": false, 00:26:50.784 "zone_management": false, 00:26:50.784 "zone_append": false, 
00:26:50.784 "compare": false, 00:26:50.784 "compare_and_write": false, 00:26:50.784 "abort": true, 00:26:50.784 "seek_hole": false, 00:26:50.784 "seek_data": false, 00:26:50.784 "copy": true, 00:26:50.784 "nvme_iov_md": false 00:26:50.784 }, 00:26:50.784 "memory_domains": [ 00:26:50.784 { 00:26:50.784 "dma_device_id": "system", 00:26:50.784 "dma_device_type": 1 00:26:50.784 }, 00:26:50.784 { 00:26:50.784 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:50.784 "dma_device_type": 2 00:26:50.784 } 00:26:50.784 ], 00:26:50.784 "driver_specific": { 00:26:50.784 "passthru": { 00:26:50.784 "name": "pt1", 00:26:50.784 "base_bdev_name": "malloc1" 00:26:50.784 } 00:26:50.784 } 00:26:50.784 }' 00:26:50.784 08:40:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:50.784 08:40:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:50.784 08:40:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:26:50.784 08:40:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:50.784 08:40:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:50.784 08:40:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:26:50.784 08:40:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:50.784 08:40:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:50.784 08:40:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:26:50.784 08:40:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:51.043 08:40:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:51.043 08:40:03 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:26:51.043 08:40:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:51.043 08:40:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:51.043 08:40:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:26:51.043 08:40:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:51.043 "name": "pt2", 00:26:51.043 "aliases": [ 00:26:51.043 "00000000-0000-0000-0000-000000000002" 00:26:51.043 ], 00:26:51.043 "product_name": "passthru", 00:26:51.043 "block_size": 4128, 00:26:51.043 "num_blocks": 8192, 00:26:51.043 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:51.043 "md_size": 32, 00:26:51.043 "md_interleave": true, 00:26:51.043 "dif_type": 0, 00:26:51.043 "assigned_rate_limits": { 00:26:51.043 "rw_ios_per_sec": 0, 00:26:51.043 "rw_mbytes_per_sec": 0, 00:26:51.043 "r_mbytes_per_sec": 0, 00:26:51.043 "w_mbytes_per_sec": 0 00:26:51.043 }, 00:26:51.043 "claimed": true, 00:26:51.043 "claim_type": "exclusive_write", 00:26:51.043 "zoned": false, 00:26:51.043 "supported_io_types": { 00:26:51.043 "read": true, 00:26:51.043 "write": true, 00:26:51.043 "unmap": true, 00:26:51.043 "flush": true, 00:26:51.043 "reset": true, 00:26:51.043 "nvme_admin": false, 00:26:51.043 "nvme_io": false, 00:26:51.043 "nvme_io_md": false, 00:26:51.043 "write_zeroes": true, 00:26:51.043 "zcopy": true, 00:26:51.043 "get_zone_info": false, 00:26:51.043 "zone_management": false, 00:26:51.043 "zone_append": false, 00:26:51.043 "compare": false, 00:26:51.043 "compare_and_write": false, 00:26:51.043 "abort": true, 00:26:51.043 "seek_hole": false, 00:26:51.043 "seek_data": false, 00:26:51.043 "copy": true, 00:26:51.043 
"nvme_iov_md": false 00:26:51.043 }, 00:26:51.043 "memory_domains": [ 00:26:51.043 { 00:26:51.043 "dma_device_id": "system", 00:26:51.043 "dma_device_type": 1 00:26:51.043 }, 00:26:51.043 { 00:26:51.043 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:51.043 "dma_device_type": 2 00:26:51.043 } 00:26:51.043 ], 00:26:51.043 "driver_specific": { 00:26:51.043 "passthru": { 00:26:51.043 "name": "pt2", 00:26:51.043 "base_bdev_name": "malloc2" 00:26:51.043 } 00:26:51.043 } 00:26:51.043 }' 00:26:51.043 08:40:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:51.302 08:40:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:51.302 08:40:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:26:51.302 08:40:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:51.302 08:40:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:51.302 08:40:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:26:51.302 08:40:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:51.302 08:40:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:51.302 08:40:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:26:51.302 08:40:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:51.302 08:40:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:51.302 08:40:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:26:51.302 08:40:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:51.302 08:40:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:26:51.564 [2024-07-23 08:40:03.966095] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:51.564 08:40:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=5edf3b90-0e79-4dda-acaf-b3a1a236aff4 00:26:51.564 08:40:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@435 -- # '[' -z 5edf3b90-0e79-4dda-acaf-b3a1a236aff4 ']' 00:26:51.564 08:40:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:51.832 [2024-07-23 08:40:04.138286] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:51.832 [2024-07-23 08:40:04.138313] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:51.832 [2024-07-23 08:40:04.138394] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:51.832 [2024-07-23 08:40:04.138454] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:51.832 [2024-07-23 08:40:04.138470] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000035a80 name raid_bdev1, state offline 00:26:51.832 08:40:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:26:51.832 08:40:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:51.832 08:40:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:26:51.832 08:40:04 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:26:51.832 08:40:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:26:51.832 08:40:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:26:52.090 08:40:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:26:52.090 08:40:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:26:52.349 08:40:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:26:52.349 08:40:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:26:52.349 08:40:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:26:52.349 08:40:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:26:52.349 08:40:04 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:26:52.349 08:40:04 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:26:52.349 08:40:04 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@636 -- 
# local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:52.349 08:40:04 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:52.349 08:40:04 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:52.349 08:40:04 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:52.349 08:40:04 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:52.349 08:40:04 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:52.349 08:40:04 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:52.349 08:40:04 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:26:52.349 08:40:04 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:26:52.608 [2024-07-23 08:40:04.980503] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:26:52.608 [2024-07-23 08:40:04.982136] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:26:52.608 [2024-07-23 08:40:04.982201] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:26:52.608 [2024-07-23 08:40:04.982262] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid 
bdev found on bdev malloc2 00:26:52.608 [2024-07-23 08:40:04.982278] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:52.608 [2024-07-23 08:40:04.982294] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036080 name raid_bdev1, state configuring 00:26:52.608 request: 00:26:52.608 { 00:26:52.608 "name": "raid_bdev1", 00:26:52.608 "raid_level": "raid1", 00:26:52.608 "base_bdevs": [ 00:26:52.608 "malloc1", 00:26:52.608 "malloc2" 00:26:52.608 ], 00:26:52.608 "superblock": false, 00:26:52.608 "method": "bdev_raid_create", 00:26:52.608 "req_id": 1 00:26:52.608 } 00:26:52.608 Got JSON-RPC error response 00:26:52.608 response: 00:26:52.608 { 00:26:52.608 "code": -17, 00:26:52.608 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:26:52.608 } 00:26:52.608 08:40:04 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:26:52.608 08:40:04 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:26:52.608 08:40:04 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:26:52.608 08:40:04 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:26:52.608 08:40:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:26:52.608 08:40:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:52.867 08:40:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:26:52.867 08:40:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:26:52.867 08:40:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:26:52.867 [2024-07-23 08:40:05.325351] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:26:52.867 [2024-07-23 08:40:05.325413] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:52.867 [2024-07-23 08:40:05.325431] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036680 00:26:52.867 [2024-07-23 08:40:05.325442] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:52.868 [2024-07-23 08:40:05.327172] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:52.868 [2024-07-23 08:40:05.327202] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:26:52.868 [2024-07-23 08:40:05.327254] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:26:52.868 [2024-07-23 08:40:05.327309] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:26:52.868 pt1 00:26:52.868 08:40:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:26:52.868 08:40:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:52.868 08:40:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:52.868 08:40:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:52.868 08:40:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:52.868 08:40:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:52.868 08:40:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:26:52.868 08:40:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:52.868 08:40:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:52.868 08:40:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:52.868 08:40:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:52.868 08:40:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:53.126 08:40:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:53.126 "name": "raid_bdev1", 00:26:53.126 "uuid": "5edf3b90-0e79-4dda-acaf-b3a1a236aff4", 00:26:53.126 "strip_size_kb": 0, 00:26:53.126 "state": "configuring", 00:26:53.126 "raid_level": "raid1", 00:26:53.126 "superblock": true, 00:26:53.126 "num_base_bdevs": 2, 00:26:53.126 "num_base_bdevs_discovered": 1, 00:26:53.126 "num_base_bdevs_operational": 2, 00:26:53.126 "base_bdevs_list": [ 00:26:53.126 { 00:26:53.126 "name": "pt1", 00:26:53.126 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:53.126 "is_configured": true, 00:26:53.127 "data_offset": 256, 00:26:53.127 "data_size": 7936 00:26:53.127 }, 00:26:53.127 { 00:26:53.127 "name": null, 00:26:53.127 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:53.127 "is_configured": false, 00:26:53.127 "data_offset": 256, 00:26:53.127 "data_size": 7936 00:26:53.127 } 00:26:53.127 ] 00:26:53.127 }' 00:26:53.127 08:40:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:53.127 08:40:05 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:53.693 08:40:06 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:26:53.693 08:40:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:26:53.693 08:40:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:26:53.693 08:40:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:26:53.693 [2024-07-23 08:40:06.187651] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:26:53.693 [2024-07-23 08:40:06.187717] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:53.693 [2024-07-23 08:40:06.187735] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036f80 00:26:53.693 [2024-07-23 08:40:06.187745] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:53.693 [2024-07-23 08:40:06.187939] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:53.693 [2024-07-23 08:40:06.187954] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:26:53.693 [2024-07-23 08:40:06.187999] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:26:53.693 [2024-07-23 08:40:06.188023] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:53.693 [2024-07-23 08:40:06.188128] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000036c80 00:26:53.693 [2024-07-23 08:40:06.188142] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:26:53.693 [2024-07-23 08:40:06.188202] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bec0 00:26:53.693 [2024-07-23 08:40:06.188287] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 
0x616000036c80 00:26:53.693 [2024-07-23 08:40:06.188295] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000036c80 00:26:53.693 [2024-07-23 08:40:06.188361] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:53.693 pt2 00:26:53.693 08:40:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:26:53.693 08:40:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:26:53.693 08:40:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:53.693 08:40:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:53.693 08:40:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:53.693 08:40:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:53.951 08:40:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:53.951 08:40:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:53.951 08:40:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:53.951 08:40:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:53.951 08:40:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:53.951 08:40:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:53.951 08:40:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:26:53.951 08:40:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:53.951 08:40:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:53.951 "name": "raid_bdev1", 00:26:53.951 "uuid": "5edf3b90-0e79-4dda-acaf-b3a1a236aff4", 00:26:53.951 "strip_size_kb": 0, 00:26:53.951 "state": "online", 00:26:53.951 "raid_level": "raid1", 00:26:53.951 "superblock": true, 00:26:53.952 "num_base_bdevs": 2, 00:26:53.952 "num_base_bdevs_discovered": 2, 00:26:53.952 "num_base_bdevs_operational": 2, 00:26:53.952 "base_bdevs_list": [ 00:26:53.952 { 00:26:53.952 "name": "pt1", 00:26:53.952 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:53.952 "is_configured": true, 00:26:53.952 "data_offset": 256, 00:26:53.952 "data_size": 7936 00:26:53.952 }, 00:26:53.952 { 00:26:53.952 "name": "pt2", 00:26:53.952 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:53.952 "is_configured": true, 00:26:53.952 "data_offset": 256, 00:26:53.952 "data_size": 7936 00:26:53.952 } 00:26:53.952 ] 00:26:53.952 }' 00:26:53.952 08:40:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:53.952 08:40:06 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:54.519 08:40:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:26:54.519 08:40:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:26:54.519 08:40:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:54.519 08:40:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:54.519 08:40:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:54.519 08:40:06 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:26:54.519 08:40:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:54.519 08:40:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:54.519 [2024-07-23 08:40:07.018079] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:54.779 08:40:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:54.779 "name": "raid_bdev1", 00:26:54.779 "aliases": [ 00:26:54.779 "5edf3b90-0e79-4dda-acaf-b3a1a236aff4" 00:26:54.779 ], 00:26:54.779 "product_name": "Raid Volume", 00:26:54.779 "block_size": 4128, 00:26:54.779 "num_blocks": 7936, 00:26:54.779 "uuid": "5edf3b90-0e79-4dda-acaf-b3a1a236aff4", 00:26:54.779 "md_size": 32, 00:26:54.779 "md_interleave": true, 00:26:54.779 "dif_type": 0, 00:26:54.779 "assigned_rate_limits": { 00:26:54.779 "rw_ios_per_sec": 0, 00:26:54.779 "rw_mbytes_per_sec": 0, 00:26:54.779 "r_mbytes_per_sec": 0, 00:26:54.779 "w_mbytes_per_sec": 0 00:26:54.779 }, 00:26:54.779 "claimed": false, 00:26:54.779 "zoned": false, 00:26:54.779 "supported_io_types": { 00:26:54.779 "read": true, 00:26:54.779 "write": true, 00:26:54.779 "unmap": false, 00:26:54.779 "flush": false, 00:26:54.779 "reset": true, 00:26:54.779 "nvme_admin": false, 00:26:54.779 "nvme_io": false, 00:26:54.779 "nvme_io_md": false, 00:26:54.779 "write_zeroes": true, 00:26:54.779 "zcopy": false, 00:26:54.779 "get_zone_info": false, 00:26:54.779 "zone_management": false, 00:26:54.779 "zone_append": false, 00:26:54.779 "compare": false, 00:26:54.779 "compare_and_write": false, 00:26:54.779 "abort": false, 00:26:54.779 "seek_hole": false, 00:26:54.779 "seek_data": false, 00:26:54.779 "copy": false, 00:26:54.779 "nvme_iov_md": false 00:26:54.779 }, 
00:26:54.779 "memory_domains": [ 00:26:54.779 { 00:26:54.779 "dma_device_id": "system", 00:26:54.779 "dma_device_type": 1 00:26:54.779 }, 00:26:54.779 { 00:26:54.779 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:54.779 "dma_device_type": 2 00:26:54.779 }, 00:26:54.779 { 00:26:54.779 "dma_device_id": "system", 00:26:54.779 "dma_device_type": 1 00:26:54.779 }, 00:26:54.779 { 00:26:54.779 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:54.779 "dma_device_type": 2 00:26:54.779 } 00:26:54.779 ], 00:26:54.779 "driver_specific": { 00:26:54.779 "raid": { 00:26:54.779 "uuid": "5edf3b90-0e79-4dda-acaf-b3a1a236aff4", 00:26:54.779 "strip_size_kb": 0, 00:26:54.779 "state": "online", 00:26:54.779 "raid_level": "raid1", 00:26:54.779 "superblock": true, 00:26:54.779 "num_base_bdevs": 2, 00:26:54.779 "num_base_bdevs_discovered": 2, 00:26:54.779 "num_base_bdevs_operational": 2, 00:26:54.779 "base_bdevs_list": [ 00:26:54.779 { 00:26:54.779 "name": "pt1", 00:26:54.779 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:54.779 "is_configured": true, 00:26:54.779 "data_offset": 256, 00:26:54.779 "data_size": 7936 00:26:54.779 }, 00:26:54.779 { 00:26:54.779 "name": "pt2", 00:26:54.779 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:54.779 "is_configured": true, 00:26:54.779 "data_offset": 256, 00:26:54.779 "data_size": 7936 00:26:54.779 } 00:26:54.779 ] 00:26:54.779 } 00:26:54.779 } 00:26:54.779 }' 00:26:54.779 08:40:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:54.779 08:40:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:26:54.779 pt2' 00:26:54.779 08:40:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:54.779 08:40:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:26:54.779 08:40:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:54.779 08:40:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:54.779 "name": "pt1", 00:26:54.779 "aliases": [ 00:26:54.779 "00000000-0000-0000-0000-000000000001" 00:26:54.779 ], 00:26:54.779 "product_name": "passthru", 00:26:54.779 "block_size": 4128, 00:26:54.779 "num_blocks": 8192, 00:26:54.779 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:54.779 "md_size": 32, 00:26:54.779 "md_interleave": true, 00:26:54.779 "dif_type": 0, 00:26:54.779 "assigned_rate_limits": { 00:26:54.779 "rw_ios_per_sec": 0, 00:26:54.779 "rw_mbytes_per_sec": 0, 00:26:54.779 "r_mbytes_per_sec": 0, 00:26:54.779 "w_mbytes_per_sec": 0 00:26:54.779 }, 00:26:54.779 "claimed": true, 00:26:54.779 "claim_type": "exclusive_write", 00:26:54.779 "zoned": false, 00:26:54.779 "supported_io_types": { 00:26:54.779 "read": true, 00:26:54.779 "write": true, 00:26:54.779 "unmap": true, 00:26:54.779 "flush": true, 00:26:54.779 "reset": true, 00:26:54.779 "nvme_admin": false, 00:26:54.779 "nvme_io": false, 00:26:54.779 "nvme_io_md": false, 00:26:54.779 "write_zeroes": true, 00:26:54.779 "zcopy": true, 00:26:54.779 "get_zone_info": false, 00:26:54.779 "zone_management": false, 00:26:54.779 "zone_append": false, 00:26:54.779 "compare": false, 00:26:54.779 "compare_and_write": false, 00:26:54.779 "abort": true, 00:26:54.779 "seek_hole": false, 00:26:54.779 "seek_data": false, 00:26:54.779 "copy": true, 00:26:54.779 "nvme_iov_md": false 00:26:54.779 }, 00:26:54.779 "memory_domains": [ 00:26:54.779 { 00:26:54.779 "dma_device_id": "system", 00:26:54.779 "dma_device_type": 1 00:26:54.779 }, 00:26:54.779 { 00:26:54.779 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:54.779 "dma_device_type": 2 00:26:54.779 } 00:26:54.779 ], 00:26:54.779 
"driver_specific": { 00:26:54.779 "passthru": { 00:26:54.779 "name": "pt1", 00:26:54.779 "base_bdev_name": "malloc1" 00:26:54.779 } 00:26:54.779 } 00:26:54.779 }' 00:26:54.779 08:40:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:54.779 08:40:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:55.038 08:40:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:26:55.038 08:40:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:55.038 08:40:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:55.038 08:40:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:26:55.038 08:40:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:55.038 08:40:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:55.038 08:40:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:26:55.038 08:40:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:55.038 08:40:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:55.038 08:40:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:26:55.038 08:40:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:55.038 08:40:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:55.038 08:40:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:26:55.297 08:40:07 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:55.297 "name": "pt2", 00:26:55.297 "aliases": [ 00:26:55.297 "00000000-0000-0000-0000-000000000002" 00:26:55.297 ], 00:26:55.297 "product_name": "passthru", 00:26:55.297 "block_size": 4128, 00:26:55.297 "num_blocks": 8192, 00:26:55.297 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:55.297 "md_size": 32, 00:26:55.297 "md_interleave": true, 00:26:55.297 "dif_type": 0, 00:26:55.297 "assigned_rate_limits": { 00:26:55.297 "rw_ios_per_sec": 0, 00:26:55.297 "rw_mbytes_per_sec": 0, 00:26:55.297 "r_mbytes_per_sec": 0, 00:26:55.297 "w_mbytes_per_sec": 0 00:26:55.297 }, 00:26:55.297 "claimed": true, 00:26:55.297 "claim_type": "exclusive_write", 00:26:55.297 "zoned": false, 00:26:55.297 "supported_io_types": { 00:26:55.297 "read": true, 00:26:55.297 "write": true, 00:26:55.297 "unmap": true, 00:26:55.297 "flush": true, 00:26:55.297 "reset": true, 00:26:55.297 "nvme_admin": false, 00:26:55.297 "nvme_io": false, 00:26:55.297 "nvme_io_md": false, 00:26:55.297 "write_zeroes": true, 00:26:55.297 "zcopy": true, 00:26:55.297 "get_zone_info": false, 00:26:55.297 "zone_management": false, 00:26:55.297 "zone_append": false, 00:26:55.297 "compare": false, 00:26:55.297 "compare_and_write": false, 00:26:55.297 "abort": true, 00:26:55.297 "seek_hole": false, 00:26:55.297 "seek_data": false, 00:26:55.297 "copy": true, 00:26:55.297 "nvme_iov_md": false 00:26:55.297 }, 00:26:55.298 "memory_domains": [ 00:26:55.298 { 00:26:55.298 "dma_device_id": "system", 00:26:55.298 "dma_device_type": 1 00:26:55.298 }, 00:26:55.298 { 00:26:55.298 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:55.298 "dma_device_type": 2 00:26:55.298 } 00:26:55.298 ], 00:26:55.298 "driver_specific": { 00:26:55.298 "passthru": { 00:26:55.298 "name": "pt2", 00:26:55.298 "base_bdev_name": "malloc2" 00:26:55.298 } 00:26:55.298 } 00:26:55.298 }' 00:26:55.298 08:40:07 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:55.298 08:40:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:55.298 08:40:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:26:55.298 08:40:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:55.298 08:40:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:55.557 08:40:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:26:55.557 08:40:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:55.557 08:40:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:55.557 08:40:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:26:55.557 08:40:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:55.557 08:40:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:55.557 08:40:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:26:55.557 08:40:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:55.557 08:40:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:26:55.817 [2024-07-23 08:40:08.104957] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:55.817 08:40:08 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # '[' 5edf3b90-0e79-4dda-acaf-b3a1a236aff4 '!=' 5edf3b90-0e79-4dda-acaf-b3a1a236aff4 ']' 00:26:55.817 08:40:08 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:26:55.817 08:40:08 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:26:55.817 08:40:08 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:26:55.817 08:40:08 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:26:55.817 [2024-07-23 08:40:08.277170] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:26:55.817 08:40:08 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:55.817 08:40:08 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:55.817 08:40:08 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:55.817 08:40:08 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:55.817 08:40:08 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:55.817 08:40:08 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:55.817 08:40:08 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:55.817 08:40:08 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:55.817 08:40:08 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:55.817 08:40:08 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:55.817 08:40:08 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:55.817 08:40:08 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:56.076 08:40:08 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:56.076 "name": "raid_bdev1", 00:26:56.076 "uuid": "5edf3b90-0e79-4dda-acaf-b3a1a236aff4", 00:26:56.076 "strip_size_kb": 0, 00:26:56.076 "state": "online", 00:26:56.076 "raid_level": "raid1", 00:26:56.076 "superblock": true, 00:26:56.076 "num_base_bdevs": 2, 00:26:56.076 "num_base_bdevs_discovered": 1, 00:26:56.076 "num_base_bdevs_operational": 1, 00:26:56.076 "base_bdevs_list": [ 00:26:56.076 { 00:26:56.076 "name": null, 00:26:56.076 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:56.076 "is_configured": false, 00:26:56.076 "data_offset": 256, 00:26:56.076 "data_size": 7936 00:26:56.076 }, 00:26:56.076 { 00:26:56.076 "name": "pt2", 00:26:56.076 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:56.076 "is_configured": true, 00:26:56.076 "data_offset": 256, 00:26:56.076 "data_size": 7936 00:26:56.076 } 00:26:56.076 ] 00:26:56.076 }' 00:26:56.076 08:40:08 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:56.076 08:40:08 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:56.645 08:40:08 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:56.645 [2024-07-23 08:40:09.083239] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:56.645 [2024-07-23 08:40:09.083266] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:56.645 [2024-07-23 08:40:09.083333] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:56.645 [2024-07-23 
08:40:09.083378] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:56.645 [2024-07-23 08:40:09.083390] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036c80 name raid_bdev1, state offline 00:26:56.645 08:40:09 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:26:56.645 08:40:09 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:56.904 08:40:09 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:26:56.904 08:40:09 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:26:56.904 08:40:09 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:26:56.904 08:40:09 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:26:56.904 08:40:09 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:26:57.164 08:40:09 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:26:57.164 08:40:09 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:26:57.164 08:40:09 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:26:57.164 08:40:09 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:26:57.164 08:40:09 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@518 -- # i=1 00:26:57.164 08:40:09 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@519 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:26:57.164 [2024-07-23 08:40:09.572518] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:26:57.164 [2024-07-23 08:40:09.572585] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:57.164 [2024-07-23 08:40:09.572602] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000037280 00:26:57.164 [2024-07-23 08:40:09.572621] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:57.164 [2024-07-23 08:40:09.574317] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:57.164 [2024-07-23 08:40:09.574347] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:26:57.164 [2024-07-23 08:40:09.574398] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:26:57.164 [2024-07-23 08:40:09.574450] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:57.164 [2024-07-23 08:40:09.574539] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000037880 00:26:57.164 [2024-07-23 08:40:09.574551] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:26:57.164 [2024-07-23 08:40:09.574626] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bf90 00:26:57.164 [2024-07-23 08:40:09.574733] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000037880 00:26:57.164 [2024-07-23 08:40:09.574741] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000037880 00:26:57.164 [2024-07-23 08:40:09.574814] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:57.164 pt2 00:26:57.164 08:40:09 bdev_raid.raid_superblock_test_md_interleaved 
-- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:57.164 08:40:09 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:57.164 08:40:09 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:57.164 08:40:09 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:57.164 08:40:09 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:57.164 08:40:09 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:57.164 08:40:09 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:57.164 08:40:09 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:57.164 08:40:09 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:57.164 08:40:09 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:57.164 08:40:09 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:57.164 08:40:09 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:57.423 08:40:09 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:57.423 "name": "raid_bdev1", 00:26:57.423 "uuid": "5edf3b90-0e79-4dda-acaf-b3a1a236aff4", 00:26:57.423 "strip_size_kb": 0, 00:26:57.423 "state": "online", 00:26:57.423 "raid_level": "raid1", 00:26:57.423 "superblock": true, 00:26:57.423 "num_base_bdevs": 2, 00:26:57.423 "num_base_bdevs_discovered": 1, 00:26:57.423 
"num_base_bdevs_operational": 1, 00:26:57.423 "base_bdevs_list": [ 00:26:57.423 { 00:26:57.423 "name": null, 00:26:57.423 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:57.423 "is_configured": false, 00:26:57.423 "data_offset": 256, 00:26:57.423 "data_size": 7936 00:26:57.423 }, 00:26:57.423 { 00:26:57.423 "name": "pt2", 00:26:57.423 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:57.423 "is_configured": true, 00:26:57.423 "data_offset": 256, 00:26:57.423 "data_size": 7936 00:26:57.424 } 00:26:57.424 ] 00:26:57.424 }' 00:26:57.424 08:40:09 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:57.424 08:40:09 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:57.991 08:40:10 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:57.991 [2024-07-23 08:40:10.394718] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:57.991 [2024-07-23 08:40:10.394753] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:57.991 [2024-07-23 08:40:10.394822] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:57.992 [2024-07-23 08:40:10.394873] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:57.992 [2024-07-23 08:40:10.394883] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000037880 name raid_bdev1, state offline 00:26:57.992 08:40:10 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:57.992 08:40:10 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:26:58.251 08:40:10 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:26:58.251 08:40:10 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:26:58.251 08:40:10 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:26:58.251 08:40:10 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:26:58.251 [2024-07-23 08:40:10.735581] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:26:58.251 [2024-07-23 08:40:10.735646] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:58.251 [2024-07-23 08:40:10.735668] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000037b80 00:26:58.251 [2024-07-23 08:40:10.735678] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:58.251 [2024-07-23 08:40:10.737426] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:58.251 [2024-07-23 08:40:10.737453] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:26:58.251 [2024-07-23 08:40:10.737505] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:26:58.251 [2024-07-23 08:40:10.737545] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:26:58.251 [2024-07-23 08:40:10.737673] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:26:58.251 [2024-07-23 08:40:10.737689] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:58.251 [2024-07-23 08:40:10.737708] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000038180 name raid_bdev1, state configuring 
00:26:58.251 [2024-07-23 08:40:10.737779] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:58.251 [2024-07-23 08:40:10.737846] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000038480 00:26:58.251 [2024-07-23 08:40:10.737855] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:26:58.251 [2024-07-23 08:40:10.737923] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c060 00:26:58.251 [2024-07-23 08:40:10.738014] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000038480 00:26:58.251 [2024-07-23 08:40:10.738024] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000038480 00:26:58.251 [2024-07-23 08:40:10.738092] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:58.251 pt1 00:26:58.251 08:40:10 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:26:58.251 08:40:10 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:58.251 08:40:10 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:58.251 08:40:10 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:58.251 08:40:10 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:58.251 08:40:10 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:58.251 08:40:10 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:58.251 08:40:10 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:58.251 08:40:10 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:58.251 08:40:10 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:58.251 08:40:10 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:58.251 08:40:10 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:58.251 08:40:10 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:58.510 08:40:10 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:58.510 "name": "raid_bdev1", 00:26:58.510 "uuid": "5edf3b90-0e79-4dda-acaf-b3a1a236aff4", 00:26:58.510 "strip_size_kb": 0, 00:26:58.510 "state": "online", 00:26:58.510 "raid_level": "raid1", 00:26:58.510 "superblock": true, 00:26:58.510 "num_base_bdevs": 2, 00:26:58.510 "num_base_bdevs_discovered": 1, 00:26:58.510 "num_base_bdevs_operational": 1, 00:26:58.510 "base_bdevs_list": [ 00:26:58.510 { 00:26:58.510 "name": null, 00:26:58.510 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:58.510 "is_configured": false, 00:26:58.510 "data_offset": 256, 00:26:58.510 "data_size": 7936 00:26:58.510 }, 00:26:58.510 { 00:26:58.510 "name": "pt2", 00:26:58.510 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:58.510 "is_configured": true, 00:26:58.510 "data_offset": 256, 00:26:58.510 "data_size": 7936 00:26:58.510 } 00:26:58.510 ] 00:26:58.510 }' 00:26:58.510 08:40:10 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:58.510 08:40:10 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:59.078 08:40:11 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:26:59.078 08:40:11 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:26:59.338 08:40:11 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:26:59.338 08:40:11 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:59.338 08:40:11 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:26:59.338 [2024-07-23 08:40:11.762520] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:59.338 08:40:11 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # '[' 5edf3b90-0e79-4dda-acaf-b3a1a236aff4 '!=' 5edf3b90-0e79-4dda-acaf-b3a1a236aff4 ']' 00:26:59.338 08:40:11 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@562 -- # killprocess 1583899 00:26:59.338 08:40:11 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 1583899 ']' 00:26:59.338 08:40:11 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 1583899 00:26:59.338 08:40:11 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:26:59.338 08:40:11 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:59.338 08:40:11 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1583899 00:26:59.338 08:40:11 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:59.338 08:40:11 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@958 
-- # '[' reactor_0 = sudo ']' 00:26:59.338 08:40:11 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1583899' 00:26:59.338 killing process with pid 1583899 00:26:59.338 08:40:11 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@967 -- # kill 1583899 00:26:59.338 [2024-07-23 08:40:11.820560] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:59.338 08:40:11 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@972 -- # wait 1583899 00:26:59.338 [2024-07-23 08:40:11.820656] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:59.338 [2024-07-23 08:40:11.820708] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:59.338 [2024-07-23 08:40:11.820720] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000038480 name raid_bdev1, state offline 00:26:59.597 [2024-07-23 08:40:11.965454] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:00.975 08:40:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@564 -- # return 0 00:27:00.975 00:27:00.975 real 0m12.998s 00:27:00.975 user 0m22.561s 00:27:00.975 sys 0m2.006s 00:27:00.975 08:40:13 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:00.975 08:40:13 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:00.975 ************************************ 00:27:00.975 END TEST raid_superblock_test_md_interleaved 00:27:00.975 ************************************ 00:27:00.975 08:40:13 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:00.975 08:40:13 bdev_raid -- bdev/bdev_raid.sh@914 -- # run_test raid_rebuild_test_sb_md_interleaved raid_rebuild_test raid1 2 true false false 00:27:00.975 08:40:13 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 
']' 00:27:00.975 08:40:13 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:00.975 08:40:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:00.975 ************************************ 00:27:00.975 START TEST raid_rebuild_test_sb_md_interleaved 00:27:00.975 ************************************ 00:27:00.975 08:40:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false false 00:27:00.975 08:40:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:27:00.975 08:40:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:27:00.975 08:40:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:27:00.975 08:40:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:27:00.975 08:40:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@572 -- # local verify=false 00:27:00.975 08:40:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:27:00.975 08:40:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:00.975 08:40:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:27:00.975 08:40:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:00.975 08:40:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:00.975 08:40:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:27:00.975 08:40:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:00.975 08:40:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs 
)) 00:27:00.975 08:40:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:27:00.975 08:40:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:27:00.975 08:40:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:27:00.976 08:40:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # local strip_size 00:27:00.976 08:40:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@576 -- # local create_arg 00:27:00.976 08:40:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:27:00.976 08:40:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@578 -- # local data_offset 00:27:00.976 08:40:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:27:00.976 08:40:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:27:00.976 08:40:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:27:00.976 08:40:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:27:00.976 08:40:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@596 -- # raid_pid=1586707 00:27:00.976 08:40:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@597 -- # waitforlisten 1586707 /var/tmp/spdk-raid.sock 00:27:00.976 08:40:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 1586707 ']' 00:27:00.976 08:40:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:00.976 08:40:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:00.976 08:40:13 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:00.976 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:00.976 08:40:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:27:00.976 08:40:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:00.976 08:40:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:00.976 [2024-07-23 08:40:13.391342] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:27:00.976 [2024-07-23 08:40:13.391433] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1586707 ] 00:27:00.976 I/O size of 3145728 is greater than zero copy threshold (65536). 00:27:00.976 Zero copy mechanism will not be used. 
00:27:01.235 [2024-07-23 08:40:13.518383] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:01.235 [2024-07-23 08:40:13.731281] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:01.494 [2024-07-23 08:40:13.982903] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:01.494 [2024-07-23 08:40:13.982934] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:01.752 08:40:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:01.752 08:40:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:27:01.752 08:40:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:01.752 08:40:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1_malloc 00:27:02.011 BaseBdev1_malloc 00:27:02.011 08:40:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:02.011 [2024-07-23 08:40:14.502807] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:02.011 [2024-07-23 08:40:14.502862] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:02.011 [2024-07-23 08:40:14.502888] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034880 00:27:02.011 [2024-07-23 08:40:14.502900] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:02.011 [2024-07-23 08:40:14.504623] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:02.011 [2024-07-23 08:40:14.504651] vbdev_passthru.c: 
710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:02.011 BaseBdev1 00:27:02.011 08:40:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:02.011 08:40:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2_malloc 00:27:02.271 BaseBdev2_malloc 00:27:02.271 08:40:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:27:02.530 [2024-07-23 08:40:14.854532] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:27:02.530 [2024-07-23 08:40:14.854585] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:02.530 [2024-07-23 08:40:14.854606] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000035480 00:27:02.530 [2024-07-23 08:40:14.854624] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:02.530 [2024-07-23 08:40:14.856308] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:02.530 [2024-07-23 08:40:14.856336] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:27:02.530 BaseBdev2 00:27:02.530 08:40:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b spare_malloc 00:27:02.789 spare_malloc 00:27:02.789 08:40:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d 
spare_delay -r 0 -t 0 -w 100000 -n 100000 00:27:02.789 spare_delay 00:27:02.789 08:40:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:03.048 [2024-07-23 08:40:15.399863] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:03.048 [2024-07-23 08:40:15.399915] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:03.048 [2024-07-23 08:40:15.399953] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000036680 00:27:03.048 [2024-07-23 08:40:15.399964] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:03.048 [2024-07-23 08:40:15.401662] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:03.048 [2024-07-23 08:40:15.401690] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:03.048 spare 00:27:03.048 08:40:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:27:03.308 [2024-07-23 08:40:15.568341] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:03.308 [2024-07-23 08:40:15.570009] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:03.308 [2024-07-23 08:40:15.570210] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000036c80 00:27:03.308 [2024-07-23 08:40:15.570229] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:27:03.308 [2024-07-23 08:40:15.570317] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bec0 00:27:03.308 [2024-07-23 08:40:15.570433] 
bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000036c80 00:27:03.308 [2024-07-23 08:40:15.570442] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000036c80 00:27:03.308 [2024-07-23 08:40:15.570523] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:03.308 08:40:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:03.308 08:40:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:03.308 08:40:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:03.308 08:40:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:03.308 08:40:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:03.308 08:40:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:03.308 08:40:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:03.308 08:40:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:03.308 08:40:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:03.308 08:40:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:03.308 08:40:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:03.308 08:40:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:03.308 08:40:15 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:03.308 "name": "raid_bdev1", 00:27:03.308 "uuid": "6484db21-d012-417b-abe3-ee648a01e6dc", 00:27:03.308 "strip_size_kb": 0, 00:27:03.308 "state": "online", 00:27:03.308 "raid_level": "raid1", 00:27:03.308 "superblock": true, 00:27:03.308 "num_base_bdevs": 2, 00:27:03.308 "num_base_bdevs_discovered": 2, 00:27:03.308 "num_base_bdevs_operational": 2, 00:27:03.308 "base_bdevs_list": [ 00:27:03.308 { 00:27:03.308 "name": "BaseBdev1", 00:27:03.308 "uuid": "97bed7de-f0b8-5b93-96ed-9dbee6f37ec9", 00:27:03.308 "is_configured": true, 00:27:03.308 "data_offset": 256, 00:27:03.308 "data_size": 7936 00:27:03.308 }, 00:27:03.308 { 00:27:03.308 "name": "BaseBdev2", 00:27:03.308 "uuid": "6bdeb5e1-93de-56f1-8a2a-c42d96828178", 00:27:03.308 "is_configured": true, 00:27:03.308 "data_offset": 256, 00:27:03.308 "data_size": 7936 00:27:03.308 } 00:27:03.308 ] 00:27:03.308 }' 00:27:03.308 08:40:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:03.308 08:40:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:03.875 08:40:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:27:03.875 08:40:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:04.134 [2024-07-23 08:40:16.422816] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:04.134 08:40:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:27:04.134 08:40:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:04.134 
08:40:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:27:04.134 08:40:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:27:04.134 08:40:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:27:04.134 08:40:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@623 -- # '[' false = true ']' 00:27:04.134 08:40:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:27:04.392 [2024-07-23 08:40:16.775489] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:04.392 08:40:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:04.392 08:40:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:04.392 08:40:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:04.392 08:40:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:04.392 08:40:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:04.392 08:40:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:04.392 08:40:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:04.392 08:40:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:04.392 08:40:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:04.392 08:40:16 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:04.392 08:40:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:04.392 08:40:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:04.650 08:40:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:04.650 "name": "raid_bdev1", 00:27:04.650 "uuid": "6484db21-d012-417b-abe3-ee648a01e6dc", 00:27:04.650 "strip_size_kb": 0, 00:27:04.650 "state": "online", 00:27:04.650 "raid_level": "raid1", 00:27:04.650 "superblock": true, 00:27:04.650 "num_base_bdevs": 2, 00:27:04.650 "num_base_bdevs_discovered": 1, 00:27:04.650 "num_base_bdevs_operational": 1, 00:27:04.650 "base_bdevs_list": [ 00:27:04.650 { 00:27:04.650 "name": null, 00:27:04.650 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:04.650 "is_configured": false, 00:27:04.650 "data_offset": 256, 00:27:04.650 "data_size": 7936 00:27:04.650 }, 00:27:04.650 { 00:27:04.650 "name": "BaseBdev2", 00:27:04.650 "uuid": "6bdeb5e1-93de-56f1-8a2a-c42d96828178", 00:27:04.650 "is_configured": true, 00:27:04.650 "data_offset": 256, 00:27:04.650 "data_size": 7936 00:27:04.650 } 00:27:04.650 ] 00:27:04.650 }' 00:27:04.650 08:40:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:04.650 08:40:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:05.217 08:40:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:05.217 [2024-07-23 08:40:17.609716] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 
00:27:05.217 [2024-07-23 08:40:17.629234] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000bf90 00:27:05.217 [2024-07-23 08:40:17.630827] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:05.217 08:40:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@646 -- # sleep 1 00:27:06.153 08:40:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:06.153 08:40:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:06.153 08:40:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:06.153 08:40:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:06.153 08:40:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:06.153 08:40:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:06.153 08:40:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:06.444 08:40:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:06.445 "name": "raid_bdev1", 00:27:06.445 "uuid": "6484db21-d012-417b-abe3-ee648a01e6dc", 00:27:06.445 "strip_size_kb": 0, 00:27:06.445 "state": "online", 00:27:06.445 "raid_level": "raid1", 00:27:06.445 "superblock": true, 00:27:06.445 "num_base_bdevs": 2, 00:27:06.445 "num_base_bdevs_discovered": 2, 00:27:06.445 "num_base_bdevs_operational": 2, 00:27:06.445 "process": { 00:27:06.445 "type": "rebuild", 00:27:06.445 "target": "spare", 00:27:06.445 "progress": { 00:27:06.445 "blocks": 2816, 00:27:06.445 
"percent": 35 00:27:06.445 } 00:27:06.445 }, 00:27:06.445 "base_bdevs_list": [ 00:27:06.445 { 00:27:06.445 "name": "spare", 00:27:06.445 "uuid": "36a9c73a-47cc-5bff-bee7-52808103b7d5", 00:27:06.445 "is_configured": true, 00:27:06.445 "data_offset": 256, 00:27:06.445 "data_size": 7936 00:27:06.445 }, 00:27:06.445 { 00:27:06.445 "name": "BaseBdev2", 00:27:06.445 "uuid": "6bdeb5e1-93de-56f1-8a2a-c42d96828178", 00:27:06.445 "is_configured": true, 00:27:06.445 "data_offset": 256, 00:27:06.445 "data_size": 7936 00:27:06.445 } 00:27:06.445 ] 00:27:06.445 }' 00:27:06.445 08:40:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:06.445 08:40:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:06.445 08:40:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:06.445 08:40:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:06.445 08:40:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:06.701 [2024-07-23 08:40:19.052267] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:06.701 [2024-07-23 08:40:19.142925] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:06.701 [2024-07-23 08:40:19.142975] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:06.701 [2024-07-23 08:40:19.142990] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:06.702 [2024-07-23 08:40:19.142999] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:06.702 08:40:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:06.702 08:40:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:06.702 08:40:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:06.702 08:40:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:06.702 08:40:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:06.702 08:40:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:06.702 08:40:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:06.702 08:40:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:06.702 08:40:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:06.702 08:40:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:06.702 08:40:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:06.702 08:40:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:06.959 08:40:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:06.959 "name": "raid_bdev1", 00:27:06.959 "uuid": "6484db21-d012-417b-abe3-ee648a01e6dc", 00:27:06.959 "strip_size_kb": 0, 00:27:06.959 "state": "online", 00:27:06.959 "raid_level": "raid1", 00:27:06.959 "superblock": true, 00:27:06.959 "num_base_bdevs": 2, 00:27:06.959 "num_base_bdevs_discovered": 1, 00:27:06.959 
"num_base_bdevs_operational": 1, 00:27:06.959 "base_bdevs_list": [ 00:27:06.959 { 00:27:06.959 "name": null, 00:27:06.959 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:06.959 "is_configured": false, 00:27:06.959 "data_offset": 256, 00:27:06.959 "data_size": 7936 00:27:06.959 }, 00:27:06.959 { 00:27:06.959 "name": "BaseBdev2", 00:27:06.959 "uuid": "6bdeb5e1-93de-56f1-8a2a-c42d96828178", 00:27:06.959 "is_configured": true, 00:27:06.959 "data_offset": 256, 00:27:06.959 "data_size": 7936 00:27:06.959 } 00:27:06.959 ] 00:27:06.959 }' 00:27:06.959 08:40:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:06.959 08:40:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:07.525 08:40:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:07.525 08:40:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:07.525 08:40:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:07.525 08:40:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:07.525 08:40:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:07.525 08:40:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:07.525 08:40:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:07.525 08:40:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:07.525 "name": "raid_bdev1", 00:27:07.525 "uuid": "6484db21-d012-417b-abe3-ee648a01e6dc", 00:27:07.525 "strip_size_kb": 0, 
00:27:07.525 "state": "online", 00:27:07.525 "raid_level": "raid1", 00:27:07.525 "superblock": true, 00:27:07.525 "num_base_bdevs": 2, 00:27:07.525 "num_base_bdevs_discovered": 1, 00:27:07.525 "num_base_bdevs_operational": 1, 00:27:07.525 "base_bdevs_list": [ 00:27:07.525 { 00:27:07.525 "name": null, 00:27:07.525 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:07.525 "is_configured": false, 00:27:07.525 "data_offset": 256, 00:27:07.525 "data_size": 7936 00:27:07.525 }, 00:27:07.525 { 00:27:07.525 "name": "BaseBdev2", 00:27:07.525 "uuid": "6bdeb5e1-93de-56f1-8a2a-c42d96828178", 00:27:07.525 "is_configured": true, 00:27:07.525 "data_offset": 256, 00:27:07.525 "data_size": 7936 00:27:07.525 } 00:27:07.525 ] 00:27:07.525 }' 00:27:07.525 08:40:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:07.783 08:40:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:07.783 08:40:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:07.783 08:40:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:07.783 08:40:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:07.783 [2024-07-23 08:40:20.282199] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:07.783 [2024-07-23 08:40:20.300716] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c060 00:27:08.041 [2024-07-23 08:40:20.302400] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:08.041 08:40:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@662 -- # sleep 1 00:27:08.975 08:40:21 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:08.975 08:40:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:08.975 08:40:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:08.975 08:40:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:08.975 08:40:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:08.975 08:40:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:08.975 08:40:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:09.233 08:40:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:09.233 "name": "raid_bdev1", 00:27:09.233 "uuid": "6484db21-d012-417b-abe3-ee648a01e6dc", 00:27:09.233 "strip_size_kb": 0, 00:27:09.233 "state": "online", 00:27:09.233 "raid_level": "raid1", 00:27:09.233 "superblock": true, 00:27:09.233 "num_base_bdevs": 2, 00:27:09.233 "num_base_bdevs_discovered": 2, 00:27:09.233 "num_base_bdevs_operational": 2, 00:27:09.233 "process": { 00:27:09.233 "type": "rebuild", 00:27:09.233 "target": "spare", 00:27:09.233 "progress": { 00:27:09.233 "blocks": 2816, 00:27:09.233 "percent": 35 00:27:09.233 } 00:27:09.233 }, 00:27:09.233 "base_bdevs_list": [ 00:27:09.233 { 00:27:09.233 "name": "spare", 00:27:09.233 "uuid": "36a9c73a-47cc-5bff-bee7-52808103b7d5", 00:27:09.233 "is_configured": true, 00:27:09.233 "data_offset": 256, 00:27:09.233 "data_size": 7936 00:27:09.233 }, 00:27:09.233 { 00:27:09.233 "name": "BaseBdev2", 00:27:09.233 "uuid": 
"6bdeb5e1-93de-56f1-8a2a-c42d96828178", 00:27:09.233 "is_configured": true, 00:27:09.233 "data_offset": 256, 00:27:09.233 "data_size": 7936 00:27:09.233 } 00:27:09.233 ] 00:27:09.233 }' 00:27:09.233 08:40:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:09.233 08:40:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:09.233 08:40:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:09.233 08:40:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:09.233 08:40:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:27:09.233 08:40:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:27:09.233 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:27:09.233 08:40:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:27:09.233 08:40:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:27:09.233 08:40:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:27:09.233 08:40:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@705 -- # local timeout=974 00:27:09.233 08:40:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:09.233 08:40:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:09.233 08:40:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:09.233 08:40:21 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:09.233 08:40:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:09.233 08:40:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:09.233 08:40:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:09.233 08:40:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:09.233 08:40:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:09.233 "name": "raid_bdev1", 00:27:09.233 "uuid": "6484db21-d012-417b-abe3-ee648a01e6dc", 00:27:09.233 "strip_size_kb": 0, 00:27:09.233 "state": "online", 00:27:09.233 "raid_level": "raid1", 00:27:09.233 "superblock": true, 00:27:09.233 "num_base_bdevs": 2, 00:27:09.233 "num_base_bdevs_discovered": 2, 00:27:09.233 "num_base_bdevs_operational": 2, 00:27:09.233 "process": { 00:27:09.233 "type": "rebuild", 00:27:09.233 "target": "spare", 00:27:09.233 "progress": { 00:27:09.233 "blocks": 3584, 00:27:09.233 "percent": 45 00:27:09.233 } 00:27:09.233 }, 00:27:09.233 "base_bdevs_list": [ 00:27:09.233 { 00:27:09.233 "name": "spare", 00:27:09.233 "uuid": "36a9c73a-47cc-5bff-bee7-52808103b7d5", 00:27:09.233 "is_configured": true, 00:27:09.233 "data_offset": 256, 00:27:09.233 "data_size": 7936 00:27:09.233 }, 00:27:09.233 { 00:27:09.233 "name": "BaseBdev2", 00:27:09.233 "uuid": "6bdeb5e1-93de-56f1-8a2a-c42d96828178", 00:27:09.233 "is_configured": true, 00:27:09.233 "data_offset": 256, 00:27:09.233 "data_size": 7936 00:27:09.233 } 00:27:09.233 ] 00:27:09.233 }' 00:27:09.233 08:40:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type 
// "none"' 00:27:09.491 08:40:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:09.491 08:40:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:09.491 08:40:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:09.491 08:40:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:27:10.427 08:40:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:10.427 08:40:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:10.427 08:40:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:10.427 08:40:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:10.427 08:40:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:10.427 08:40:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:10.427 08:40:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:10.427 08:40:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:10.685 08:40:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:10.685 "name": "raid_bdev1", 00:27:10.685 "uuid": "6484db21-d012-417b-abe3-ee648a01e6dc", 00:27:10.685 "strip_size_kb": 0, 00:27:10.685 "state": "online", 00:27:10.685 "raid_level": "raid1", 00:27:10.685 "superblock": true, 00:27:10.685 "num_base_bdevs": 2, 
00:27:10.685 "num_base_bdevs_discovered": 2, 00:27:10.685 "num_base_bdevs_operational": 2, 00:27:10.685 "process": { 00:27:10.685 "type": "rebuild", 00:27:10.685 "target": "spare", 00:27:10.685 "progress": { 00:27:10.685 "blocks": 6656, 00:27:10.685 "percent": 83 00:27:10.685 } 00:27:10.685 }, 00:27:10.685 "base_bdevs_list": [ 00:27:10.685 { 00:27:10.685 "name": "spare", 00:27:10.685 "uuid": "36a9c73a-47cc-5bff-bee7-52808103b7d5", 00:27:10.685 "is_configured": true, 00:27:10.685 "data_offset": 256, 00:27:10.685 "data_size": 7936 00:27:10.685 }, 00:27:10.685 { 00:27:10.685 "name": "BaseBdev2", 00:27:10.685 "uuid": "6bdeb5e1-93de-56f1-8a2a-c42d96828178", 00:27:10.685 "is_configured": true, 00:27:10.685 "data_offset": 256, 00:27:10.685 "data_size": 7936 00:27:10.685 } 00:27:10.685 ] 00:27:10.685 }' 00:27:10.685 08:40:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:10.685 08:40:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:10.685 08:40:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:10.685 08:40:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:10.685 08:40:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:27:10.945 [2024-07-23 08:40:23.426822] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:27:10.945 [2024-07-23 08:40:23.426879] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:27:10.945 [2024-07-23 08:40:23.426979] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:11.880 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:11.880 08:40:24 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:11.880 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:11.880 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:11.880 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:11.880 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:11.880 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:11.880 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:11.880 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:11.880 "name": "raid_bdev1", 00:27:11.880 "uuid": "6484db21-d012-417b-abe3-ee648a01e6dc", 00:27:11.880 "strip_size_kb": 0, 00:27:11.880 "state": "online", 00:27:11.880 "raid_level": "raid1", 00:27:11.880 "superblock": true, 00:27:11.880 "num_base_bdevs": 2, 00:27:11.880 "num_base_bdevs_discovered": 2, 00:27:11.880 "num_base_bdevs_operational": 2, 00:27:11.880 "base_bdevs_list": [ 00:27:11.880 { 00:27:11.880 "name": "spare", 00:27:11.880 "uuid": "36a9c73a-47cc-5bff-bee7-52808103b7d5", 00:27:11.880 "is_configured": true, 00:27:11.880 "data_offset": 256, 00:27:11.880 "data_size": 7936 00:27:11.880 }, 00:27:11.880 { 00:27:11.880 "name": "BaseBdev2", 00:27:11.880 "uuid": "6bdeb5e1-93de-56f1-8a2a-c42d96828178", 00:27:11.880 "is_configured": true, 00:27:11.880 "data_offset": 256, 00:27:11.880 "data_size": 7936 00:27:11.880 } 00:27:11.880 ] 00:27:11.880 }' 00:27:11.880 08:40:24 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:11.880 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:27:11.880 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:11.880 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:27:11.880 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@708 -- # break 00:27:11.880 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:11.880 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:11.880 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:11.880 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:11.880 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:11.880 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:11.880 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:12.138 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:12.138 "name": "raid_bdev1", 00:27:12.138 "uuid": "6484db21-d012-417b-abe3-ee648a01e6dc", 00:27:12.138 "strip_size_kb": 0, 00:27:12.138 "state": "online", 00:27:12.138 "raid_level": "raid1", 00:27:12.138 "superblock": true, 00:27:12.138 "num_base_bdevs": 2, 00:27:12.138 
"num_base_bdevs_discovered": 2, 00:27:12.138 "num_base_bdevs_operational": 2, 00:27:12.138 "base_bdevs_list": [ 00:27:12.138 { 00:27:12.138 "name": "spare", 00:27:12.138 "uuid": "36a9c73a-47cc-5bff-bee7-52808103b7d5", 00:27:12.138 "is_configured": true, 00:27:12.138 "data_offset": 256, 00:27:12.138 "data_size": 7936 00:27:12.138 }, 00:27:12.138 { 00:27:12.138 "name": "BaseBdev2", 00:27:12.138 "uuid": "6bdeb5e1-93de-56f1-8a2a-c42d96828178", 00:27:12.138 "is_configured": true, 00:27:12.138 "data_offset": 256, 00:27:12.139 "data_size": 7936 00:27:12.139 } 00:27:12.139 ] 00:27:12.139 }' 00:27:12.139 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:12.139 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:12.139 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:12.139 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:12.139 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:12.139 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:12.139 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:12.139 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:12.139 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:12.139 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:12.139 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:27:12.139 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:12.139 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:12.139 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:12.139 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:12.139 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:12.397 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:12.397 "name": "raid_bdev1", 00:27:12.397 "uuid": "6484db21-d012-417b-abe3-ee648a01e6dc", 00:27:12.397 "strip_size_kb": 0, 00:27:12.397 "state": "online", 00:27:12.397 "raid_level": "raid1", 00:27:12.397 "superblock": true, 00:27:12.397 "num_base_bdevs": 2, 00:27:12.397 "num_base_bdevs_discovered": 2, 00:27:12.397 "num_base_bdevs_operational": 2, 00:27:12.397 "base_bdevs_list": [ 00:27:12.397 { 00:27:12.397 "name": "spare", 00:27:12.397 "uuid": "36a9c73a-47cc-5bff-bee7-52808103b7d5", 00:27:12.397 "is_configured": true, 00:27:12.397 "data_offset": 256, 00:27:12.397 "data_size": 7936 00:27:12.397 }, 00:27:12.397 { 00:27:12.397 "name": "BaseBdev2", 00:27:12.397 "uuid": "6bdeb5e1-93de-56f1-8a2a-c42d96828178", 00:27:12.397 "is_configured": true, 00:27:12.397 "data_offset": 256, 00:27:12.397 "data_size": 7936 00:27:12.397 } 00:27:12.397 ] 00:27:12.397 }' 00:27:12.397 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:12.397 08:40:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:12.963 08:40:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:12.963 [2024-07-23 08:40:25.423649] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:12.963 [2024-07-23 08:40:25.423679] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:12.963 [2024-07-23 08:40:25.423759] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:12.964 [2024-07-23 08:40:25.423823] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:12.964 [2024-07-23 08:40:25.423834] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000036c80 name raid_bdev1, state offline 00:27:12.964 08:40:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:12.964 08:40:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # jq length 00:27:13.222 08:40:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:27:13.222 08:40:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@721 -- # '[' false = true ']' 00:27:13.222 08:40:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:27:13.222 08:40:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:13.480 08:40:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:13.480 [2024-07-23 08:40:25.961064] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:13.480 [2024-07-23 08:40:25.961120] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:13.480 [2024-07-23 08:40:25.961158] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000037880 00:27:13.480 [2024-07-23 08:40:25.961168] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:13.480 [2024-07-23 08:40:25.962936] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:13.480 [2024-07-23 08:40:25.962963] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:13.480 [2024-07-23 08:40:25.963025] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:13.480 [2024-07-23 08:40:25.963087] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:13.481 [2024-07-23 08:40:25.963201] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:13.481 spare 00:27:13.481 08:40:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:13.481 08:40:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:13.481 08:40:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:13.481 08:40:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:13.481 08:40:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:13.481 08:40:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:13.481 08:40:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:13.481 08:40:25 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:13.481 08:40:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:13.481 08:40:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:13.481 08:40:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:13.481 08:40:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:13.739 [2024-07-23 08:40:26.063525] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000037e80 00:27:13.739 [2024-07-23 08:40:26.063562] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:27:13.739 [2024-07-23 08:40:26.063679] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c130 00:27:13.739 [2024-07-23 08:40:26.063842] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000037e80 00:27:13.739 [2024-07-23 08:40:26.063851] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000037e80 00:27:13.739 [2024-07-23 08:40:26.063942] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:13.739 08:40:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:13.739 "name": "raid_bdev1", 00:27:13.739 "uuid": "6484db21-d012-417b-abe3-ee648a01e6dc", 00:27:13.739 "strip_size_kb": 0, 00:27:13.739 "state": "online", 00:27:13.739 "raid_level": "raid1", 00:27:13.739 "superblock": true, 00:27:13.739 "num_base_bdevs": 2, 00:27:13.739 "num_base_bdevs_discovered": 2, 00:27:13.739 "num_base_bdevs_operational": 2, 00:27:13.739 "base_bdevs_list": [ 00:27:13.739 { 00:27:13.739 
"name": "spare", 00:27:13.739 "uuid": "36a9c73a-47cc-5bff-bee7-52808103b7d5", 00:27:13.739 "is_configured": true, 00:27:13.739 "data_offset": 256, 00:27:13.739 "data_size": 7936 00:27:13.739 }, 00:27:13.739 { 00:27:13.739 "name": "BaseBdev2", 00:27:13.739 "uuid": "6bdeb5e1-93de-56f1-8a2a-c42d96828178", 00:27:13.739 "is_configured": true, 00:27:13.739 "data_offset": 256, 00:27:13.739 "data_size": 7936 00:27:13.739 } 00:27:13.739 ] 00:27:13.739 }' 00:27:13.739 08:40:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:13.739 08:40:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:14.305 08:40:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:14.305 08:40:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:14.305 08:40:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:14.305 08:40:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:14.305 08:40:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:14.305 08:40:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:14.305 08:40:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:14.305 08:40:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:14.305 "name": "raid_bdev1", 00:27:14.305 "uuid": "6484db21-d012-417b-abe3-ee648a01e6dc", 00:27:14.305 "strip_size_kb": 0, 00:27:14.305 "state": "online", 00:27:14.305 "raid_level": "raid1", 00:27:14.305 "superblock": 
true, 00:27:14.305 "num_base_bdevs": 2, 00:27:14.305 "num_base_bdevs_discovered": 2, 00:27:14.305 "num_base_bdevs_operational": 2, 00:27:14.305 "base_bdevs_list": [ 00:27:14.305 { 00:27:14.305 "name": "spare", 00:27:14.305 "uuid": "36a9c73a-47cc-5bff-bee7-52808103b7d5", 00:27:14.305 "is_configured": true, 00:27:14.305 "data_offset": 256, 00:27:14.305 "data_size": 7936 00:27:14.305 }, 00:27:14.305 { 00:27:14.305 "name": "BaseBdev2", 00:27:14.305 "uuid": "6bdeb5e1-93de-56f1-8a2a-c42d96828178", 00:27:14.305 "is_configured": true, 00:27:14.305 "data_offset": 256, 00:27:14.305 "data_size": 7936 00:27:14.305 } 00:27:14.305 ] 00:27:14.305 }' 00:27:14.305 08:40:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:14.564 08:40:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:14.564 08:40:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:14.564 08:40:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:14.564 08:40:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:14.564 08:40:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:27:14.564 08:40:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:27:14.564 08:40:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:14.822 [2024-07-23 08:40:27.216475] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:14.822 08:40:27 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:14.822 08:40:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:14.822 08:40:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:14.822 08:40:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:14.822 08:40:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:14.822 08:40:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:14.822 08:40:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:14.822 08:40:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:14.822 08:40:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:14.822 08:40:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:14.822 08:40:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:14.822 08:40:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:15.080 08:40:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:15.080 "name": "raid_bdev1", 00:27:15.080 "uuid": "6484db21-d012-417b-abe3-ee648a01e6dc", 00:27:15.080 "strip_size_kb": 0, 00:27:15.080 "state": "online", 00:27:15.080 "raid_level": "raid1", 00:27:15.080 "superblock": true, 00:27:15.080 "num_base_bdevs": 2, 00:27:15.080 
"num_base_bdevs_discovered": 1, 00:27:15.080 "num_base_bdevs_operational": 1, 00:27:15.080 "base_bdevs_list": [ 00:27:15.080 { 00:27:15.080 "name": null, 00:27:15.080 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:15.080 "is_configured": false, 00:27:15.080 "data_offset": 256, 00:27:15.080 "data_size": 7936 00:27:15.080 }, 00:27:15.080 { 00:27:15.080 "name": "BaseBdev2", 00:27:15.080 "uuid": "6bdeb5e1-93de-56f1-8a2a-c42d96828178", 00:27:15.080 "is_configured": true, 00:27:15.080 "data_offset": 256, 00:27:15.080 "data_size": 7936 00:27:15.080 } 00:27:15.080 ] 00:27:15.080 }' 00:27:15.080 08:40:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:15.080 08:40:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:15.646 08:40:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:15.646 [2024-07-23 08:40:28.058724] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:15.646 [2024-07-23 08:40:28.058903] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:27:15.646 [2024-07-23 08:40:28.058920] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:27:15.646 [2024-07-23 08:40:28.058957] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:15.646 [2024-07-23 08:40:28.077473] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c200 00:27:15.646 [2024-07-23 08:40:28.079108] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:15.646 08:40:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@755 -- # sleep 1 00:27:16.581 08:40:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:16.581 08:40:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:16.581 08:40:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:16.581 08:40:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:16.581 08:40:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:16.840 08:40:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:16.840 08:40:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:16.840 08:40:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:16.840 "name": "raid_bdev1", 00:27:16.840 "uuid": "6484db21-d012-417b-abe3-ee648a01e6dc", 00:27:16.840 "strip_size_kb": 0, 00:27:16.840 "state": "online", 00:27:16.840 "raid_level": "raid1", 00:27:16.840 "superblock": true, 00:27:16.840 "num_base_bdevs": 2, 00:27:16.840 "num_base_bdevs_discovered": 2, 00:27:16.840 "num_base_bdevs_operational": 2, 00:27:16.840 "process": { 00:27:16.840 
"type": "rebuild", 00:27:16.840 "target": "spare", 00:27:16.840 "progress": { 00:27:16.840 "blocks": 2816, 00:27:16.840 "percent": 35 00:27:16.840 } 00:27:16.840 }, 00:27:16.840 "base_bdevs_list": [ 00:27:16.840 { 00:27:16.840 "name": "spare", 00:27:16.840 "uuid": "36a9c73a-47cc-5bff-bee7-52808103b7d5", 00:27:16.840 "is_configured": true, 00:27:16.840 "data_offset": 256, 00:27:16.840 "data_size": 7936 00:27:16.840 }, 00:27:16.840 { 00:27:16.840 "name": "BaseBdev2", 00:27:16.840 "uuid": "6bdeb5e1-93de-56f1-8a2a-c42d96828178", 00:27:16.840 "is_configured": true, 00:27:16.840 "data_offset": 256, 00:27:16.840 "data_size": 7936 00:27:16.840 } 00:27:16.840 ] 00:27:16.840 }' 00:27:16.840 08:40:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:16.840 08:40:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:16.840 08:40:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:17.099 08:40:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:17.099 08:40:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:17.099 [2024-07-23 08:40:29.516813] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:17.099 [2024-07-23 08:40:29.591389] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:17.099 [2024-07-23 08:40:29.591444] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:17.099 [2024-07-23 08:40:29.591460] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:17.099 [2024-07-23 08:40:29.591472] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed 
to remove target bdev: No such device 00:27:17.358 08:40:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:17.358 08:40:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:17.358 08:40:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:17.358 08:40:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:17.358 08:40:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:17.358 08:40:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:17.358 08:40:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:17.358 08:40:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:17.358 08:40:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:17.358 08:40:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:17.358 08:40:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:17.358 08:40:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:17.358 08:40:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:17.358 "name": "raid_bdev1", 00:27:17.358 "uuid": "6484db21-d012-417b-abe3-ee648a01e6dc", 00:27:17.358 "strip_size_kb": 0, 00:27:17.358 "state": "online", 00:27:17.358 "raid_level": "raid1", 00:27:17.358 "superblock": true, 
00:27:17.358 "num_base_bdevs": 2, 00:27:17.358 "num_base_bdevs_discovered": 1, 00:27:17.358 "num_base_bdevs_operational": 1, 00:27:17.358 "base_bdevs_list": [ 00:27:17.358 { 00:27:17.358 "name": null, 00:27:17.358 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:17.358 "is_configured": false, 00:27:17.358 "data_offset": 256, 00:27:17.358 "data_size": 7936 00:27:17.358 }, 00:27:17.358 { 00:27:17.358 "name": "BaseBdev2", 00:27:17.358 "uuid": "6bdeb5e1-93de-56f1-8a2a-c42d96828178", 00:27:17.358 "is_configured": true, 00:27:17.358 "data_offset": 256, 00:27:17.358 "data_size": 7936 00:27:17.358 } 00:27:17.358 ] 00:27:17.358 }' 00:27:17.358 08:40:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:17.358 08:40:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:17.926 08:40:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:18.185 [2024-07-23 08:40:30.458467] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:18.185 [2024-07-23 08:40:30.458525] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:18.185 [2024-07-23 08:40:30.458560] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000038480 00:27:18.185 [2024-07-23 08:40:30.458571] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:18.185 [2024-07-23 08:40:30.458802] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:18.185 [2024-07-23 08:40:30.458818] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:18.185 [2024-07-23 08:40:30.458886] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:18.185 [2024-07-23 08:40:30.458899] 
bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:27:18.185 [2024-07-23 08:40:30.458909] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:27:18.185 [2024-07-23 08:40:30.458930] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:18.185 [2024-07-23 08:40:30.477287] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00000c2d0 00:27:18.185 spare 00:27:18.185 [2024-07-23 08:40:30.478921] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:18.185 08:40:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@762 -- # sleep 1 00:27:19.122 08:40:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:19.122 08:40:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:19.122 08:40:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:19.122 08:40:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:19.122 08:40:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:19.122 08:40:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:19.122 08:40:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:19.381 08:40:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:19.381 "name": "raid_bdev1", 00:27:19.381 "uuid": "6484db21-d012-417b-abe3-ee648a01e6dc", 
00:27:19.381 "strip_size_kb": 0, 00:27:19.381 "state": "online", 00:27:19.381 "raid_level": "raid1", 00:27:19.381 "superblock": true, 00:27:19.381 "num_base_bdevs": 2, 00:27:19.381 "num_base_bdevs_discovered": 2, 00:27:19.381 "num_base_bdevs_operational": 2, 00:27:19.381 "process": { 00:27:19.381 "type": "rebuild", 00:27:19.381 "target": "spare", 00:27:19.381 "progress": { 00:27:19.381 "blocks": 2816, 00:27:19.381 "percent": 35 00:27:19.381 } 00:27:19.381 }, 00:27:19.381 "base_bdevs_list": [ 00:27:19.381 { 00:27:19.381 "name": "spare", 00:27:19.381 "uuid": "36a9c73a-47cc-5bff-bee7-52808103b7d5", 00:27:19.381 "is_configured": true, 00:27:19.381 "data_offset": 256, 00:27:19.381 "data_size": 7936 00:27:19.381 }, 00:27:19.381 { 00:27:19.381 "name": "BaseBdev2", 00:27:19.381 "uuid": "6bdeb5e1-93de-56f1-8a2a-c42d96828178", 00:27:19.381 "is_configured": true, 00:27:19.381 "data_offset": 256, 00:27:19.381 "data_size": 7936 00:27:19.381 } 00:27:19.381 ] 00:27:19.381 }' 00:27:19.381 08:40:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:19.381 08:40:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:19.381 08:40:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:19.381 08:40:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:19.381 08:40:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:19.640 [2024-07-23 08:40:31.924329] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:19.640 [2024-07-23 08:40:31.990962] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:19.640 [2024-07-23 
08:40:31.991009] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:19.640 [2024-07-23 08:40:31.991045] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:19.640 [2024-07-23 08:40:31.991054] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:19.640 08:40:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:19.640 08:40:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:19.640 08:40:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:19.640 08:40:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:19.640 08:40:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:19.640 08:40:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:19.640 08:40:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:19.640 08:40:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:19.640 08:40:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:19.640 08:40:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:19.640 08:40:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:19.640 08:40:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:19.899 08:40:32 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:19.899 "name": "raid_bdev1", 00:27:19.899 "uuid": "6484db21-d012-417b-abe3-ee648a01e6dc", 00:27:19.899 "strip_size_kb": 0, 00:27:19.899 "state": "online", 00:27:19.899 "raid_level": "raid1", 00:27:19.899 "superblock": true, 00:27:19.899 "num_base_bdevs": 2, 00:27:19.899 "num_base_bdevs_discovered": 1, 00:27:19.899 "num_base_bdevs_operational": 1, 00:27:19.899 "base_bdevs_list": [ 00:27:19.899 { 00:27:19.899 "name": null, 00:27:19.899 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:19.899 "is_configured": false, 00:27:19.899 "data_offset": 256, 00:27:19.899 "data_size": 7936 00:27:19.899 }, 00:27:19.899 { 00:27:19.899 "name": "BaseBdev2", 00:27:19.899 "uuid": "6bdeb5e1-93de-56f1-8a2a-c42d96828178", 00:27:19.899 "is_configured": true, 00:27:19.899 "data_offset": 256, 00:27:19.899 "data_size": 7936 00:27:19.899 } 00:27:19.899 ] 00:27:19.899 }' 00:27:19.899 08:40:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:19.899 08:40:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:20.465 08:40:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:20.465 08:40:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:20.465 08:40:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:20.465 08:40:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:20.465 08:40:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:20.465 08:40:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:20.465 08:40:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:20.465 08:40:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:20.465 "name": "raid_bdev1", 00:27:20.465 "uuid": "6484db21-d012-417b-abe3-ee648a01e6dc", 00:27:20.465 "strip_size_kb": 0, 00:27:20.465 "state": "online", 00:27:20.465 "raid_level": "raid1", 00:27:20.465 "superblock": true, 00:27:20.465 "num_base_bdevs": 2, 00:27:20.465 "num_base_bdevs_discovered": 1, 00:27:20.465 "num_base_bdevs_operational": 1, 00:27:20.465 "base_bdevs_list": [ 00:27:20.465 { 00:27:20.465 "name": null, 00:27:20.465 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:20.465 "is_configured": false, 00:27:20.465 "data_offset": 256, 00:27:20.465 "data_size": 7936 00:27:20.465 }, 00:27:20.465 { 00:27:20.465 "name": "BaseBdev2", 00:27:20.465 "uuid": "6bdeb5e1-93de-56f1-8a2a-c42d96828178", 00:27:20.465 "is_configured": true, 00:27:20.465 "data_offset": 256, 00:27:20.465 "data_size": 7936 00:27:20.465 } 00:27:20.465 ] 00:27:20.465 }' 00:27:20.465 08:40:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:20.465 08:40:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:20.465 08:40:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:20.465 08:40:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:20.465 08:40:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:27:20.724 08:40:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@772 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:20.982 [2024-07-23 08:40:33.243082] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:20.982 [2024-07-23 08:40:33.243134] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:20.982 [2024-07-23 08:40:33.243158] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000038a80 00:27:20.982 [2024-07-23 08:40:33.243168] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:20.982 [2024-07-23 08:40:33.243401] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:20.982 [2024-07-23 08:40:33.243415] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:20.982 [2024-07-23 08:40:33.243466] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:27:20.982 [2024-07-23 08:40:33.243480] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:27:20.982 [2024-07-23 08:40:33.243490] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:20.982 BaseBdev1 00:27:20.982 08:40:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@773 -- # sleep 1 00:27:21.942 08:40:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:21.942 08:40:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:21.942 08:40:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:21.942 08:40:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
00:27:21.942 08:40:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:21.942 08:40:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:21.942 08:40:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:21.942 08:40:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:21.942 08:40:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:21.942 08:40:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:21.942 08:40:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:21.942 08:40:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:21.942 08:40:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:21.942 "name": "raid_bdev1", 00:27:21.942 "uuid": "6484db21-d012-417b-abe3-ee648a01e6dc", 00:27:21.942 "strip_size_kb": 0, 00:27:21.942 "state": "online", 00:27:21.942 "raid_level": "raid1", 00:27:21.942 "superblock": true, 00:27:21.942 "num_base_bdevs": 2, 00:27:21.942 "num_base_bdevs_discovered": 1, 00:27:21.942 "num_base_bdevs_operational": 1, 00:27:21.942 "base_bdevs_list": [ 00:27:21.942 { 00:27:21.942 "name": null, 00:27:21.942 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:21.942 "is_configured": false, 00:27:21.942 "data_offset": 256, 00:27:21.942 "data_size": 7936 00:27:21.942 }, 00:27:21.942 { 00:27:21.942 "name": "BaseBdev2", 00:27:21.942 "uuid": "6bdeb5e1-93de-56f1-8a2a-c42d96828178", 00:27:21.942 "is_configured": true, 00:27:21.942 "data_offset": 256, 00:27:21.942 
"data_size": 7936 00:27:21.942 } 00:27:21.942 ] 00:27:21.942 }' 00:27:21.942 08:40:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:21.942 08:40:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:22.509 08:40:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:22.509 08:40:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:22.509 08:40:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:22.509 08:40:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:22.509 08:40:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:22.509 08:40:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:22.509 08:40:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:22.768 08:40:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:22.768 "name": "raid_bdev1", 00:27:22.768 "uuid": "6484db21-d012-417b-abe3-ee648a01e6dc", 00:27:22.768 "strip_size_kb": 0, 00:27:22.768 "state": "online", 00:27:22.768 "raid_level": "raid1", 00:27:22.768 "superblock": true, 00:27:22.768 "num_base_bdevs": 2, 00:27:22.768 "num_base_bdevs_discovered": 1, 00:27:22.768 "num_base_bdevs_operational": 1, 00:27:22.768 "base_bdevs_list": [ 00:27:22.768 { 00:27:22.768 "name": null, 00:27:22.768 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:22.768 "is_configured": false, 00:27:22.768 "data_offset": 256, 00:27:22.768 "data_size": 7936 00:27:22.768 }, 
00:27:22.768 { 00:27:22.768 "name": "BaseBdev2", 00:27:22.768 "uuid": "6bdeb5e1-93de-56f1-8a2a-c42d96828178", 00:27:22.768 "is_configured": true, 00:27:22.768 "data_offset": 256, 00:27:22.768 "data_size": 7936 00:27:22.768 } 00:27:22.768 ] 00:27:22.768 }' 00:27:22.768 08:40:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:22.768 08:40:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:22.768 08:40:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:22.768 08:40:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:22.768 08:40:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:22.768 08:40:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:27:22.768 08:40:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:22.768 08:40:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:22.768 08:40:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:22.768 08:40:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:22.768 08:40:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 
00:27:22.768 08:40:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:22.768 08:40:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:22.768 08:40:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:22.768 08:40:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:27:22.768 08:40:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:23.027 [2024-07-23 08:40:35.336774] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:23.027 [2024-07-23 08:40:35.336929] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:27:23.027 [2024-07-23 08:40:35.336943] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:23.027 request: 00:27:23.027 { 00:27:23.027 "base_bdev": "BaseBdev1", 00:27:23.027 "raid_bdev": "raid_bdev1", 00:27:23.027 "method": "bdev_raid_add_base_bdev", 00:27:23.027 "req_id": 1 00:27:23.027 } 00:27:23.027 Got JSON-RPC error response 00:27:23.027 response: 00:27:23.027 { 00:27:23.027 "code": -22, 00:27:23.027 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:27:23.027 } 00:27:23.027 08:40:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:27:23.027 08:40:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 
00:27:23.027 08:40:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:27:23.027 08:40:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:27:23.027 08:40:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@777 -- # sleep 1 00:27:23.963 08:40:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:23.963 08:40:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:23.963 08:40:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:23.963 08:40:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:23.963 08:40:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:23.963 08:40:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:23.963 08:40:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:23.963 08:40:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:23.963 08:40:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:23.963 08:40:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:23.963 08:40:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:23.963 08:40:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:24.222 08:40:36 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:24.222 "name": "raid_bdev1", 00:27:24.222 "uuid": "6484db21-d012-417b-abe3-ee648a01e6dc", 00:27:24.222 "strip_size_kb": 0, 00:27:24.222 "state": "online", 00:27:24.222 "raid_level": "raid1", 00:27:24.222 "superblock": true, 00:27:24.222 "num_base_bdevs": 2, 00:27:24.222 "num_base_bdevs_discovered": 1, 00:27:24.222 "num_base_bdevs_operational": 1, 00:27:24.222 "base_bdevs_list": [ 00:27:24.222 { 00:27:24.222 "name": null, 00:27:24.222 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:24.222 "is_configured": false, 00:27:24.222 "data_offset": 256, 00:27:24.222 "data_size": 7936 00:27:24.222 }, 00:27:24.222 { 00:27:24.222 "name": "BaseBdev2", 00:27:24.222 "uuid": "6bdeb5e1-93de-56f1-8a2a-c42d96828178", 00:27:24.222 "is_configured": true, 00:27:24.222 "data_offset": 256, 00:27:24.222 "data_size": 7936 00:27:24.222 } 00:27:24.222 ] 00:27:24.222 }' 00:27:24.222 08:40:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:24.222 08:40:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:24.788 08:40:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:24.788 08:40:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:24.788 08:40:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:24.788 08:40:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:24.788 08:40:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:24.788 08:40:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:24.788 08:40:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:24.788 08:40:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:24.788 "name": "raid_bdev1", 00:27:24.788 "uuid": "6484db21-d012-417b-abe3-ee648a01e6dc", 00:27:24.788 "strip_size_kb": 0, 00:27:24.788 "state": "online", 00:27:24.788 "raid_level": "raid1", 00:27:24.788 "superblock": true, 00:27:24.788 "num_base_bdevs": 2, 00:27:24.788 "num_base_bdevs_discovered": 1, 00:27:24.788 "num_base_bdevs_operational": 1, 00:27:24.788 "base_bdevs_list": [ 00:27:24.788 { 00:27:24.788 "name": null, 00:27:24.788 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:24.788 "is_configured": false, 00:27:24.788 "data_offset": 256, 00:27:24.788 "data_size": 7936 00:27:24.788 }, 00:27:24.788 { 00:27:24.788 "name": "BaseBdev2", 00:27:24.788 "uuid": "6bdeb5e1-93de-56f1-8a2a-c42d96828178", 00:27:24.788 "is_configured": true, 00:27:24.788 "data_offset": 256, 00:27:24.788 "data_size": 7936 00:27:24.788 } 00:27:24.788 ] 00:27:24.788 }' 00:27:24.788 08:40:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:24.788 08:40:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:24.788 08:40:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:24.788 08:40:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:24.788 08:40:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@782 -- # killprocess 1586707 00:27:24.788 08:40:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 1586707 ']' 00:27:24.788 08:40:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
common/autotest_common.sh@952 -- # kill -0 1586707 00:27:24.788 08:40:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:27:24.788 08:40:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:24.788 08:40:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1586707 00:27:25.047 08:40:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:25.047 08:40:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:25.047 08:40:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1586707' 00:27:25.047 killing process with pid 1586707 00:27:25.047 08:40:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 1586707 00:27:25.047 Received shutdown signal, test time was about 60.000000 seconds 00:27:25.047 00:27:25.047 Latency(us) 00:27:25.047 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:25.047 =================================================================================================================== 00:27:25.047 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:27:25.047 [2024-07-23 08:40:37.339177] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:25.047 [2024-07-23 08:40:37.339294] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:25.047 08:40:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@972 -- # wait 1586707 00:27:25.047 [2024-07-23 08:40:37.339345] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:25.047 [2024-07-23 08:40:37.339357] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: 
raid_bdev_cleanup, 0x616000037e80 name raid_bdev1, state offline 00:27:25.307 [2024-07-23 08:40:37.583860] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:26.686 08:40:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@784 -- # return 0 00:27:26.686 00:27:26.686 real 0m25.569s 00:27:26.686 user 0m38.830s 00:27:26.686 sys 0m2.675s 00:27:26.686 08:40:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:26.686 08:40:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:26.686 ************************************ 00:27:26.686 END TEST raid_rebuild_test_sb_md_interleaved 00:27:26.686 ************************************ 00:27:26.686 08:40:38 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:26.686 08:40:38 bdev_raid -- bdev/bdev_raid.sh@916 -- # trap - EXIT 00:27:26.686 08:40:38 bdev_raid -- bdev/bdev_raid.sh@917 -- # cleanup 00:27:26.686 08:40:38 bdev_raid -- bdev/bdev_raid.sh@58 -- # '[' -n 1586707 ']' 00:27:26.686 08:40:38 bdev_raid -- bdev/bdev_raid.sh@58 -- # ps -p 1586707 00:27:26.686 08:40:38 bdev_raid -- bdev/bdev_raid.sh@62 -- # rm -rf /raidtest 00:27:26.686 00:27:26.686 real 16m1.639s 00:27:26.686 user 25m47.221s 00:27:26.686 sys 2m18.822s 00:27:26.686 08:40:38 bdev_raid -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:26.686 08:40:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:26.686 ************************************ 00:27:26.686 END TEST bdev_raid 00:27:26.686 ************************************ 00:27:26.686 08:40:38 -- common/autotest_common.sh@1142 -- # return 0 00:27:26.686 08:40:38 -- spdk/autotest.sh@191 -- # run_test bdevperf_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:27:26.686 08:40:38 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:27:26.686 08:40:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:26.686 
08:40:38 -- common/autotest_common.sh@10 -- # set +x 00:27:26.686 ************************************ 00:27:26.686 START TEST bdevperf_config 00:27:26.686 ************************************ 00:27:26.686 08:40:39 bdevperf_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:27:26.686 * Looking for test storage... 00:27:26.686 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/test_config.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/common.sh 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/common.sh@5 -- # bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/test_config.sh@12 -- # jsonconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/test_config.sh@13 -- # testconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/test_config.sh@15 -- # trap 'cleanup; exit 1' SIGINT SIGTERM EXIT 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/test_config.sh@17 -- # create_job global read Malloc0 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=read 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:27:26.686 00:27:26.686 08:40:39 bdevperf_config -- 
bdevperf/common.sh@20 -- # cat 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/test_config.sh@18 -- # create_job job0 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:27:26.686 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/test_config.sh@19 -- # create_job job1 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:27:26.686 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/test_config.sh@20 -- # create_job job2 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:27:26.686 00:27:26.686 
08:40:39 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/test_config.sh@21 -- # create_job job3 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:27:26.686 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:27:26.686 08:40:39 bdevperf_config -- bdevperf/test_config.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:27:31.962 08:40:43 bdevperf_config -- bdevperf/test_config.sh@22 -- # bdevperf_output='[2024-07-23 08:40:39.198894] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:27:31.962 [2024-07-23 08:40:39.199006] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1591731 ] 00:27:31.962 Using job config with 4 jobs 00:27:31.962 [2024-07-23 08:40:39.371648] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:31.962 [2024-07-23 08:40:39.630078] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:31.962 cpumask for '\''job0'\'' is too big 00:27:31.962 cpumask for '\''job1'\'' is too big 00:27:31.962 cpumask for '\''job2'\'' is too big 00:27:31.962 cpumask for '\''job3'\'' is too big 00:27:31.962 Running I/O for 2 seconds... 00:27:31.962 00:27:31.962 Latency(us) 00:27:31.962 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:31.962 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:27:31.962 Malloc0 : 2.01 32054.71 31.30 0.00 0.00 7978.47 1497.97 12857.54 00:27:31.962 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:27:31.962 Malloc0 : 2.01 32035.29 31.28 0.00 0.00 7970.16 1458.96 11359.57 00:27:31.962 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:27:31.963 Malloc0 : 2.01 32016.10 31.27 0.00 0.00 7960.34 1536.98 9736.78 00:27:31.963 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:27:31.963 Malloc0 : 2.02 32093.47 31.34 0.00 0.00 7927.82 635.86 8426.06 00:27:31.963 =================================================================================================================== 00:27:31.963 Total : 128199.58 125.19 0.00 0.00 7959.17 635.86 12857.54' 00:27:31.963 08:40:43 bdevperf_config -- bdevperf/test_config.sh@23 -- # get_num_jobs '[2024-07-23 08:40:39.198894] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:27:31.963 [2024-07-23 08:40:39.199006] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1591731 ] 00:27:31.963 Using job config with 4 jobs 00:27:31.963 [2024-07-23 08:40:39.371648] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:31.963 [2024-07-23 08:40:39.630078] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:31.963 cpumask for '\''job0'\'' is too big 00:27:31.963 cpumask for '\''job1'\'' is too big 00:27:31.963 cpumask for '\''job2'\'' is too big 00:27:31.963 cpumask for '\''job3'\'' is too big 00:27:31.963 Running I/O for 2 seconds... 00:27:31.963 00:27:31.963 Latency(us) 00:27:31.963 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:31.963 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:27:31.963 Malloc0 : 2.01 32054.71 31.30 0.00 0.00 7978.47 1497.97 12857.54 00:27:31.963 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:27:31.963 Malloc0 : 2.01 32035.29 31.28 0.00 0.00 7970.16 1458.96 11359.57 00:27:31.963 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:27:31.963 Malloc0 : 2.01 32016.10 31.27 0.00 0.00 7960.34 1536.98 9736.78 00:27:31.963 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:27:31.963 Malloc0 : 2.02 32093.47 31.34 0.00 0.00 7927.82 635.86 8426.06 00:27:31.963 =================================================================================================================== 00:27:31.963 Total : 128199.58 125.19 0.00 0.00 7959.17 635.86 12857.54' 00:27:31.963 08:40:43 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-23 08:40:39.198894] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:27:31.963 [2024-07-23 08:40:39.199006] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1591731 ] 00:27:31.963 Using job config with 4 jobs 00:27:31.963 [2024-07-23 08:40:39.371648] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:31.963 [2024-07-23 08:40:39.630078] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:31.963 cpumask for '\''job0'\'' is too big 00:27:31.963 cpumask for '\''job1'\'' is too big 00:27:31.963 cpumask for '\''job2'\'' is too big 00:27:31.963 cpumask for '\''job3'\'' is too big 00:27:31.963 Running I/O for 2 seconds... 00:27:31.963 00:27:31.963 Latency(us) 00:27:31.963 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:31.963 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:27:31.963 Malloc0 : 2.01 32054.71 31.30 0.00 0.00 7978.47 1497.97 12857.54 00:27:31.963 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:27:31.963 Malloc0 : 2.01 32035.29 31.28 0.00 0.00 7970.16 1458.96 11359.57 00:27:31.963 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:27:31.963 Malloc0 : 2.01 32016.10 31.27 0.00 0.00 7960.34 1536.98 9736.78 00:27:31.963 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:27:31.963 Malloc0 : 2.02 32093.47 31.34 0.00 0.00 7927.82 635.86 8426.06 00:27:31.963 =================================================================================================================== 00:27:31.963 Total : 128199.58 125.19 0.00 0.00 7959.17 635.86 12857.54' 00:27:31.963 08:40:43 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:27:31.963 08:40:43 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:27:31.963 08:40:43 
bdevperf_config -- bdevperf/test_config.sh@23 -- # [[ 4 == \4 ]] 00:27:31.963 08:40:43 bdevperf_config -- bdevperf/test_config.sh@25 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -C -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:27:31.963 [2024-07-23 08:40:43.742009] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:27:31.963 [2024-07-23 08:40:43.742093] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1592499 ] 00:27:31.963 [2024-07-23 08:40:43.881167] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:31.963 [2024-07-23 08:40:44.135108] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:32.222 cpumask for 'job0' is too big 00:27:32.222 cpumask for 'job1' is too big 00:27:32.222 cpumask for 'job2' is too big 00:27:32.222 cpumask for 'job3' is too big 00:27:36.414 08:40:48 bdevperf_config -- bdevperf/test_config.sh@25 -- # bdevperf_output='Using job config with 4 jobs 00:27:36.414 Running I/O for 2 seconds... 
00:27:36.414 00:27:36.414 Latency(us) 00:27:36.414 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:36.414 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:27:36.414 Malloc0 : 2.02 31471.66 30.73 0.00 0.00 8128.27 1443.35 12545.46 00:27:36.414 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:27:36.414 Malloc0 : 2.02 31446.38 30.71 0.00 0.00 8120.09 1412.14 11047.50 00:27:36.414 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:27:36.414 Malloc0 : 2.02 31423.18 30.69 0.00 0.00 8112.70 1419.95 9924.02 00:27:36.414 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:27:36.414 Malloc0 : 2.02 31399.97 30.66 0.00 0.00 8105.11 1427.75 9986.44 00:27:36.414 =================================================================================================================== 00:27:36.414 Total : 125741.19 122.79 0.00 0.00 8116.54 1412.14 12545.46' 00:27:36.414 08:40:48 bdevperf_config -- bdevperf/test_config.sh@27 -- # cleanup 00:27:36.414 08:40:48 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:27:36.414 08:40:48 bdevperf_config -- bdevperf/test_config.sh@29 -- # create_job job0 write Malloc0 00:27:36.414 08:40:48 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:27:36.414 08:40:48 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:27:36.414 08:40:48 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:27:36.414 08:40:48 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:27:36.414 08:40:48 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:27:36.414 08:40:48 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:27:36.414 00:27:36.414 08:40:48 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:27:36.414 08:40:48 bdevperf_config -- bdevperf/test_config.sh@30 -- # create_job job1 
write Malloc0 00:27:36.414 08:40:48 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:27:36.414 08:40:48 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:27:36.414 08:40:48 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:27:36.414 08:40:48 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:27:36.414 08:40:48 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:27:36.414 08:40:48 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:27:36.414 00:27:36.414 08:40:48 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:27:36.414 08:40:48 bdevperf_config -- bdevperf/test_config.sh@31 -- # create_job job2 write Malloc0 00:27:36.414 08:40:48 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:27:36.414 08:40:48 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:27:36.414 08:40:48 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:27:36.414 08:40:48 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:27:36.414 08:40:48 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:27:36.414 08:40:48 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:27:36.414 00:27:36.414 08:40:48 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:27:36.414 08:40:48 bdevperf_config -- bdevperf/test_config.sh@32 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/test_config.sh@32 -- # bdevperf_output='[2024-07-23 08:40:48.246134] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:27:40.607 [2024-07-23 08:40:48.246231] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1593470 ] 00:27:40.607 Using job config with 3 jobs 00:27:40.607 [2024-07-23 08:40:48.387810] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:40.607 [2024-07-23 08:40:48.629066] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:40.607 cpumask for '\''job0'\'' is too big 00:27:40.607 cpumask for '\''job1'\'' is too big 00:27:40.607 cpumask for '\''job2'\'' is too big 00:27:40.607 Running I/O for 2 seconds... 00:27:40.607 00:27:40.607 Latency(us) 00:27:40.607 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:40.607 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:27:40.607 Malloc0 : 2.01 42853.85 41.85 0.00 0.00 5967.17 1419.95 8987.79 00:27:40.607 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:27:40.607 Malloc0 : 2.01 42876.00 41.87 0.00 0.00 5953.97 1396.54 7552.24 00:27:40.607 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:27:40.607 Malloc0 : 2.01 42850.02 41.85 0.00 0.00 5947.47 1412.14 7333.79 00:27:40.607 =================================================================================================================== 00:27:40.607 Total : 128579.87 125.57 0.00 0.00 5956.19 1396.54 8987.79' 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/test_config.sh@33 -- # get_num_jobs '[2024-07-23 08:40:48.246134] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:27:40.607 [2024-07-23 08:40:48.246231] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1593470 ] 00:27:40.607 Using job config with 3 jobs 00:27:40.607 [2024-07-23 08:40:48.387810] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:40.607 [2024-07-23 08:40:48.629066] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:40.607 cpumask for '\''job0'\'' is too big 00:27:40.607 cpumask for '\''job1'\'' is too big 00:27:40.607 cpumask for '\''job2'\'' is too big 00:27:40.607 Running I/O for 2 seconds... 00:27:40.607 00:27:40.607 Latency(us) 00:27:40.607 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:40.607 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:27:40.607 Malloc0 : 2.01 42853.85 41.85 0.00 0.00 5967.17 1419.95 8987.79 00:27:40.607 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:27:40.607 Malloc0 : 2.01 42876.00 41.87 0.00 0.00 5953.97 1396.54 7552.24 00:27:40.607 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:27:40.607 Malloc0 : 2.01 42850.02 41.85 0.00 0.00 5947.47 1412.14 7333.79 00:27:40.607 =================================================================================================================== 00:27:40.607 Total : 128579.87 125.57 0.00 0.00 5956.19 1396.54 8987.79' 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-23 08:40:48.246134] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:27:40.607 [2024-07-23 08:40:48.246231] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1593470 ] 00:27:40.607 Using job config with 3 jobs 00:27:40.607 [2024-07-23 08:40:48.387810] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:40.607 [2024-07-23 08:40:48.629066] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:40.607 cpumask for '\''job0'\'' is too big 00:27:40.607 cpumask for '\''job1'\'' is too big 00:27:40.607 cpumask for '\''job2'\'' is too big 00:27:40.607 Running I/O for 2 seconds... 00:27:40.607 00:27:40.607 Latency(us) 00:27:40.607 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:40.607 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:27:40.607 Malloc0 : 2.01 42853.85 41.85 0.00 0.00 5967.17 1419.95 8987.79 00:27:40.607 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:27:40.607 Malloc0 : 2.01 42876.00 41.87 0.00 0.00 5953.97 1396.54 7552.24 00:27:40.607 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:27:40.607 Malloc0 : 2.01 42850.02 41.85 0.00 0.00 5947.47 1412.14 7333.79 00:27:40.607 =================================================================================================================== 00:27:40.607 Total : 128579.87 125.57 0.00 0.00 5956.19 1396.54 8987.79' 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/test_config.sh@33 -- # [[ 3 == \3 ]] 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/test_config.sh@35 -- # cleanup 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/test_config.sh@37 -- # create_job global rw Malloc0:Malloc1 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=rw 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0:Malloc1 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:27:40.607 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/test_config.sh@38 -- # create_job job0 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:27:40.607 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/test_config.sh@39 -- # create_job job1 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:27:40.607 08:40:52 
bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:27:40.607 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/test_config.sh@40 -- # create_job job2 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:27:40.607 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/test_config.sh@41 -- # create_job job3 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:27:40.607 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:27:40.607 08:40:52 bdevperf_config -- bdevperf/test_config.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:27:44.798 08:40:57 bdevperf_config -- bdevperf/test_config.sh@42 -- # bdevperf_output='[2024-07-23 08:40:52.770224] 
Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:27:44.798 [2024-07-23 08:40:52.770324] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1594250 ] 00:27:44.798 Using job config with 4 jobs 00:27:44.798 [2024-07-23 08:40:52.906918] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:44.798 [2024-07-23 08:40:53.137324] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:44.798 cpumask for '\''job0'\'' is too big 00:27:44.798 cpumask for '\''job1'\'' is too big 00:27:44.798 cpumask for '\''job2'\'' is too big 00:27:44.798 cpumask for '\''job3'\'' is too big 00:27:44.798 Running I/O for 2 seconds... 00:27:44.798 00:27:44.799 Latency(us) 00:27:44.799 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:44.799 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:27:44.799 Malloc0 : 2.02 16332.85 15.95 0.00 0.00 15663.83 3276.80 25964.74 00:27:44.799 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:27:44.799 Malloc1 : 2.02 16322.40 15.94 0.00 0.00 15661.06 3744.91 25964.74 00:27:44.799 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:27:44.799 Malloc0 : 2.02 16312.60 15.93 0.00 0.00 15624.51 3042.74 22843.98 00:27:44.799 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:27:44.799 Malloc1 : 2.03 16302.26 15.92 0.00 0.00 15623.13 3651.29 22719.15 00:27:44.799 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:27:44.799 Malloc0 : 2.03 16356.94 15.97 0.00 0.00 15527.45 3058.35 20222.54 00:27:44.799 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:27:44.799 
Malloc1 : 2.04 16346.65 15.96 0.00 0.00 15525.93 3666.90 20097.71 00:27:44.799 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:27:44.799 Malloc0 : 2.04 16336.97 15.95 0.00 0.00 15491.47 3058.35 20097.71 00:27:44.799 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:27:44.799 Malloc1 : 2.04 16326.69 15.94 0.00 0.00 15489.00 3620.08 19972.88 00:27:44.799 =================================================================================================================== 00:27:44.799 Total : 130637.36 127.58 0.00 0.00 15575.54 3042.74 25964.74' 00:27:44.799 08:40:57 bdevperf_config -- bdevperf/test_config.sh@43 -- # get_num_jobs '[2024-07-23 08:40:52.770224] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:27:44.799 [2024-07-23 08:40:52.770324] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1594250 ] 00:27:44.799 Using job config with 4 jobs 00:27:44.799 [2024-07-23 08:40:52.906918] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:44.799 [2024-07-23 08:40:53.137324] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:44.799 cpumask for '\''job0'\'' is too big 00:27:44.799 cpumask for '\''job1'\'' is too big 00:27:44.799 cpumask for '\''job2'\'' is too big 00:27:44.799 cpumask for '\''job3'\'' is too big 00:27:44.799 Running I/O for 2 seconds... 
00:27:44.799 00:27:44.799 Latency(us) 00:27:44.799 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:44.799 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:27:44.799 Malloc0 : 2.02 16332.85 15.95 0.00 0.00 15663.83 3276.80 25964.74 00:27:44.799 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:27:44.799 Malloc1 : 2.02 16322.40 15.94 0.00 0.00 15661.06 3744.91 25964.74 00:27:44.799 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:27:44.799 Malloc0 : 2.02 16312.60 15.93 0.00 0.00 15624.51 3042.74 22843.98 00:27:44.799 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:27:44.799 Malloc1 : 2.03 16302.26 15.92 0.00 0.00 15623.13 3651.29 22719.15 00:27:44.799 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:27:44.799 Malloc0 : 2.03 16356.94 15.97 0.00 0.00 15527.45 3058.35 20222.54 00:27:44.799 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:27:44.799 Malloc1 : 2.04 16346.65 15.96 0.00 0.00 15525.93 3666.90 20097.71 00:27:44.799 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:27:44.799 Malloc0 : 2.04 16336.97 15.95 0.00 0.00 15491.47 3058.35 20097.71 00:27:44.799 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:27:44.799 Malloc1 : 2.04 16326.69 15.94 0.00 0.00 15489.00 3620.08 19972.88 00:27:44.799 =================================================================================================================== 00:27:44.799 Total : 130637.36 127.58 0.00 0.00 15575.54 3042.74 25964.74' 00:27:44.799 08:40:57 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-23 08:40:52.770224] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:27:44.799 [2024-07-23 08:40:52.770324] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1594250 ] 00:27:44.799 Using job config with 4 jobs 00:27:44.799 [2024-07-23 08:40:52.906918] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:44.799 [2024-07-23 08:40:53.137324] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:44.799 cpumask for '\''job0'\'' is too big 00:27:44.799 cpumask for '\''job1'\'' is too big 00:27:44.799 cpumask for '\''job2'\'' is too big 00:27:44.799 cpumask for '\''job3'\'' is too big 00:27:44.799 Running I/O for 2 seconds... 00:27:44.799 00:27:44.799 Latency(us) 00:27:44.799 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:44.799 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:27:44.799 Malloc0 : 2.02 16332.85 15.95 0.00 0.00 15663.83 3276.80 25964.74 00:27:44.799 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:27:44.799 Malloc1 : 2.02 16322.40 15.94 0.00 0.00 15661.06 3744.91 25964.74 00:27:44.799 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:27:44.799 Malloc0 : 2.02 16312.60 15.93 0.00 0.00 15624.51 3042.74 22843.98 00:27:44.799 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:27:44.799 Malloc1 : 2.03 16302.26 15.92 0.00 0.00 15623.13 3651.29 22719.15 00:27:44.799 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:27:44.799 Malloc0 : 2.03 16356.94 15.97 0.00 0.00 15527.45 3058.35 20222.54 00:27:44.799 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:27:44.799 Malloc1 : 2.04 16346.65 15.96 0.00 0.00 15525.93 3666.90 20097.71 00:27:44.799 
Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:27:44.799 Malloc0 : 2.04 16336.97 15.95 0.00 0.00 15491.47 3058.35 20097.71 00:27:44.799 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:27:44.799 Malloc1 : 2.04 16326.69 15.94 0.00 0.00 15489.00 3620.08 19972.88 00:27:44.799 =================================================================================================================== 00:27:44.799 Total : 130637.36 127.58 0.00 0.00 15575.54 3042.74 25964.74' 00:27:44.799 08:40:57 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:27:44.799 08:40:57 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:27:44.799 08:40:57 bdevperf_config -- bdevperf/test_config.sh@43 -- # [[ 4 == \4 ]] 00:27:44.799 08:40:57 bdevperf_config -- bdevperf/test_config.sh@44 -- # cleanup 00:27:44.799 08:40:57 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:27:44.799 08:40:57 bdevperf_config -- bdevperf/test_config.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:27:44.799 00:27:44.799 real 0m18.244s 00:27:44.799 user 0m16.645s 00:27:44.799 sys 0m1.270s 00:27:44.799 08:40:57 bdevperf_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:44.799 08:40:57 bdevperf_config -- common/autotest_common.sh@10 -- # set +x 00:27:44.799 ************************************ 00:27:44.799 END TEST bdevperf_config 00:27:44.799 ************************************ 00:27:44.799 08:40:57 -- common/autotest_common.sh@1142 -- # return 0 00:27:44.799 08:40:57 -- spdk/autotest.sh@192 -- # uname -s 00:27:44.799 08:40:57 -- spdk/autotest.sh@192 -- # [[ Linux == Linux ]] 00:27:44.799 08:40:57 -- spdk/autotest.sh@193 -- # run_test reactor_set_interrupt /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:27:44.799 08:40:57 -- 
common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:27:44.799 08:40:57 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:44.799 08:40:57 -- common/autotest_common.sh@10 -- # set +x 00:27:45.062 ************************************ 00:27:45.062 START TEST reactor_set_interrupt 00:27:45.062 ************************************ 00:27:45.062 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:27:45.062 * Looking for test storage... 00:27:45.062 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:27:45.062 08:40:57 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:27:45.063 08:40:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:27:45.063 08:40:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:27:45.063 08:40:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:27:45.063 08:40:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
00:27:45.063 08:40:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:27:45.063 08:40:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:27:45.063 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:27:45.063 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@34 -- # set -e 00:27:45.063 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:27:45.063 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@36 -- # shopt -s extglob 00:27:45.063 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:27:45.063 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:27:45.063 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:27:45.063 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@2 -- # CONFIG_ASAN=y 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:27:45.063 08:40:57 reactor_set_interrupt -- 
common/build_config.sh@8 -- # CONFIG_RBD=n 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@22 -- # CONFIG_CET=n 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:27:45.063 08:40:57 
reactor_set_interrupt -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:27:45.063 08:40:57 reactor_set_interrupt -- 
common/build_config.sh@48 -- # CONFIG_RDMA=y 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:27:45.063 08:40:57 
reactor_set_interrupt -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@70 -- # CONFIG_FC=n 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:27:45.063 08:40:57 reactor_set_interrupt -- common/build_config.sh@83 -- # CONFIG_URING=n 00:27:45.063 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:27:45.063 08:40:57 reactor_set_interrupt -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:27:45.063 08:40:57 reactor_set_interrupt -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 
00:27:45.063 08:40:57 reactor_set_interrupt -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:27:45.063 08:40:57 reactor_set_interrupt -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:27:45.063 08:40:57 reactor_set_interrupt -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:27:45.063 08:40:57 reactor_set_interrupt -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:27:45.063 08:40:57 reactor_set_interrupt -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:27:45.063 08:40:57 reactor_set_interrupt -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:27:45.063 08:40:57 reactor_set_interrupt -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:27:45.063 08:40:57 reactor_set_interrupt -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:27:45.063 08:40:57 reactor_set_interrupt -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:27:45.063 08:40:57 reactor_set_interrupt -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:27:45.063 08:40:57 reactor_set_interrupt -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:27:45.063 08:40:57 reactor_set_interrupt -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:27:45.064 08:40:57 reactor_set_interrupt -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:27:45.064 #define SPDK_CONFIG_H 00:27:45.064 #define SPDK_CONFIG_APPS 1 00:27:45.064 #define SPDK_CONFIG_ARCH native 00:27:45.064 #define SPDK_CONFIG_ASAN 1 00:27:45.064 #undef SPDK_CONFIG_AVAHI 00:27:45.064 #undef SPDK_CONFIG_CET 00:27:45.064 #define SPDK_CONFIG_COVERAGE 1 00:27:45.064 #define SPDK_CONFIG_CROSS_PREFIX 
00:27:45.064 #define SPDK_CONFIG_CRYPTO 1 00:27:45.064 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:27:45.064 #undef SPDK_CONFIG_CUSTOMOCF 00:27:45.064 #undef SPDK_CONFIG_DAOS 00:27:45.064 #define SPDK_CONFIG_DAOS_DIR 00:27:45.064 #define SPDK_CONFIG_DEBUG 1 00:27:45.064 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:27:45.064 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:27:45.064 #define SPDK_CONFIG_DPDK_INC_DIR 00:27:45.064 #define SPDK_CONFIG_DPDK_LIB_DIR 00:27:45.064 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:27:45.064 #undef SPDK_CONFIG_DPDK_UADK 00:27:45.064 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:27:45.064 #define SPDK_CONFIG_EXAMPLES 1 00:27:45.064 #undef SPDK_CONFIG_FC 00:27:45.064 #define SPDK_CONFIG_FC_PATH 00:27:45.064 #define SPDK_CONFIG_FIO_PLUGIN 1 00:27:45.064 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:27:45.064 #undef SPDK_CONFIG_FUSE 00:27:45.064 #undef SPDK_CONFIG_FUZZER 00:27:45.064 #define SPDK_CONFIG_FUZZER_LIB 00:27:45.064 #undef SPDK_CONFIG_GOLANG 00:27:45.064 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:27:45.064 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:27:45.064 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:27:45.064 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:27:45.064 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:27:45.064 #undef SPDK_CONFIG_HAVE_LIBBSD 00:27:45.064 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:27:45.064 #define SPDK_CONFIG_IDXD 1 00:27:45.064 #define SPDK_CONFIG_IDXD_KERNEL 1 00:27:45.064 #define SPDK_CONFIG_IPSEC_MB 1 00:27:45.064 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:27:45.064 #define SPDK_CONFIG_ISAL 1 00:27:45.064 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:27:45.064 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:27:45.064 #define SPDK_CONFIG_LIBDIR 00:27:45.064 #undef SPDK_CONFIG_LTO 00:27:45.064 #define SPDK_CONFIG_MAX_LCORES 128 00:27:45.064 #define SPDK_CONFIG_NVME_CUSE 1 00:27:45.064 #undef 
SPDK_CONFIG_OCF 00:27:45.064 #define SPDK_CONFIG_OCF_PATH 00:27:45.064 #define SPDK_CONFIG_OPENSSL_PATH 00:27:45.064 #undef SPDK_CONFIG_PGO_CAPTURE 00:27:45.064 #define SPDK_CONFIG_PGO_DIR 00:27:45.064 #undef SPDK_CONFIG_PGO_USE 00:27:45.064 #define SPDK_CONFIG_PREFIX /usr/local 00:27:45.064 #undef SPDK_CONFIG_RAID5F 00:27:45.064 #undef SPDK_CONFIG_RBD 00:27:45.064 #define SPDK_CONFIG_RDMA 1 00:27:45.064 #define SPDK_CONFIG_RDMA_PROV verbs 00:27:45.064 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:27:45.064 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:27:45.064 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:27:45.064 #define SPDK_CONFIG_SHARED 1 00:27:45.064 #undef SPDK_CONFIG_SMA 00:27:45.064 #define SPDK_CONFIG_TESTS 1 00:27:45.064 #undef SPDK_CONFIG_TSAN 00:27:45.064 #define SPDK_CONFIG_UBLK 1 00:27:45.064 #define SPDK_CONFIG_UBSAN 1 00:27:45.064 #undef SPDK_CONFIG_UNIT_TESTS 00:27:45.064 #undef SPDK_CONFIG_URING 00:27:45.064 #define SPDK_CONFIG_URING_PATH 00:27:45.064 #undef SPDK_CONFIG_URING_ZNS 00:27:45.064 #undef SPDK_CONFIG_USDT 00:27:45.064 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:27:45.064 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:27:45.064 #undef SPDK_CONFIG_VFIO_USER 00:27:45.064 #define SPDK_CONFIG_VFIO_USER_DIR 00:27:45.064 #define SPDK_CONFIG_VHOST 1 00:27:45.064 #define SPDK_CONFIG_VIRTIO 1 00:27:45.064 #undef SPDK_CONFIG_VTUNE 00:27:45.064 #define SPDK_CONFIG_VTUNE_DIR 00:27:45.064 #define SPDK_CONFIG_WERROR 1 00:27:45.064 #define SPDK_CONFIG_WPDK_DIR 00:27:45.064 #undef SPDK_CONFIG_XNVME 00:27:45.064 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:27:45.064 08:40:57 reactor_set_interrupt -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:27:45.064 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:27:45.064 08:40:57 reactor_set_interrupt -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 
00:27:45.064 08:40:57 reactor_set_interrupt -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:27:45.064 08:40:57 reactor_set_interrupt -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:27:45.064 08:40:57 reactor_set_interrupt -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:27:45.064 08:40:57 reactor_set_interrupt -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:27:45.064 08:40:57 reactor_set_interrupt -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:27:45.064 08:40:57 reactor_set_interrupt -- paths/export.sh@5 -- # export PATH 00:27:45.064 08:40:57 reactor_set_interrupt -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:27:45.064 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:27:45.064 08:40:57 reactor_set_interrupt -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:27:45.064 08:40:57 reactor_set_interrupt -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:27:45.064 08:40:57 reactor_set_interrupt -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:27:45.064 08:40:57 reactor_set_interrupt -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:27:45.064 08:40:57 reactor_set_interrupt -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:27:45.064 08:40:57 reactor_set_interrupt -- pm/common@64 -- # TEST_TAG=N/A 00:27:45.064 08:40:57 reactor_set_interrupt -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:27:45.064 08:40:57 reactor_set_interrupt -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:27:45.064 08:40:57 reactor_set_interrupt -- pm/common@68 -- # uname -s 00:27:45.064 08:40:57 reactor_set_interrupt -- pm/common@68 -- # PM_OS=Linux 00:27:45.064 08:40:57 reactor_set_interrupt -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:27:45.064 08:40:57 reactor_set_interrupt -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:27:45.064 08:40:57 
reactor_set_interrupt -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:27:45.064 08:40:57 reactor_set_interrupt -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:27:45.064 08:40:57 reactor_set_interrupt -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:27:45.064 08:40:57 reactor_set_interrupt -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:27:45.064 08:40:57 reactor_set_interrupt -- pm/common@76 -- # SUDO[0]= 00:27:45.064 08:40:57 reactor_set_interrupt -- pm/common@76 -- # SUDO[1]='sudo -E' 00:27:45.064 08:40:57 reactor_set_interrupt -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:27:45.064 08:40:57 reactor_set_interrupt -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:27:45.064 08:40:57 reactor_set_interrupt -- pm/common@81 -- # [[ Linux == Linux ]] 00:27:45.064 08:40:57 reactor_set_interrupt -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:27:45.064 08:40:57 reactor_set_interrupt -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:27:45.064 08:40:57 reactor_set_interrupt -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:27:45.064 08:40:57 reactor_set_interrupt -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:27:45.064 08:40:57 reactor_set_interrupt -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:27:45.064 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@58 -- # : 1 00:27:45.064 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:27:45.064 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@62 -- # : 0 00:27:45.064 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:27:45.064 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@64 -- # : 0 00:27:45.064 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:27:45.064 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@66 -- # : 1 00:27:45.064 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:27:45.064 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@68 -- # : 0 00:27:45.064 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:27:45.064 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@70 -- # : 00:27:45.064 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:27:45.064 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@72 -- # : 0 00:27:45.064 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:27:45.064 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@74 -- # : 1 00:27:45.064 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:27:45.064 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@76 -- # : 0 00:27:45.064 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:27:45.064 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@78 -- # : 0 00:27:45.064 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 
00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@80 -- # : 0 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@82 -- # : 0 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@84 -- # : 0 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@86 -- # : 0 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@88 -- # : 0 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@90 -- # : 0 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@92 -- # : 0 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@94 -- # : 0 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@96 -- # : 0 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@98 -- # : 0 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@100 -- # : 
0 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@102 -- # : rdma 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@104 -- # : 0 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@106 -- # : 0 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@108 -- # : 1 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@110 -- # : 0 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@112 -- # : 0 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@114 -- # : 0 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@116 -- # : 0 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@118 -- # : 1 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@120 -- # : 1 00:27:45.065 08:40:57 reactor_set_interrupt -- 
common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@122 -- # : 1 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@124 -- # : 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@126 -- # : 0 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@128 -- # : 1 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@130 -- # : 0 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@132 -- # : 0 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@134 -- # : 0 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@136 -- # : 0 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@138 -- # : 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@140 -- # : true 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:27:45.065 08:40:57 
reactor_set_interrupt -- common/autotest_common.sh@142 -- # : 0 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@144 -- # : 0 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@146 -- # : 0 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@148 -- # : 0 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@150 -- # : 0 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@152 -- # : 0 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@154 -- # : 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@156 -- # : 0 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@158 -- # : 0 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@160 -- # : 0 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@162 -- # : 0 00:27:45.065 
08:40:57 reactor_set_interrupt -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@164 -- # : 0 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@167 -- # : 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@169 -- # : 0 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@171 -- # : 0 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@178 -- # export 
LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:27:45.065 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@178 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:27:45.066 08:40:57 reactor_set_interrupt -- 
common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@200 -- # cat 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@238 -- # 
LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:27:45.066 08:40:57 reactor_set_interrupt -- 
common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@263 -- # export valgrind= 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@263 -- # valgrind= 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@269 -- # uname -s 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@279 -- # MAKE=make 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j96 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@299 -- # 
TEST_MODE= 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@318 -- # [[ -z 1595055 ]] 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@318 -- # kill -0 1595055 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@331 -- # local mount target_dir 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.mutQww 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.mutQww/tests/interrupt /tmp/spdk.mutQww 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs 
size use avail _ mount 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@327 -- # df -T 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=995373056 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4289056768 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=53678231552 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=60682932224 00:27:45.066 08:40:57 
reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=7004700672 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=30336651264 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=30341464064 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4812800 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:27:45.066 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:27:45.067 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:27:45.067 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=12126924800 00:27:45.067 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=12136587264 00:27:45.067 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=9662464 00:27:45.067 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:27:45.067 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:27:45.067 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:27:45.067 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=30338813952 00:27:45.067 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=30341468160 00:27:45.067 08:40:57 reactor_set_interrupt -- 
common/autotest_common.sh@363 -- # uses["$mount"]=2654208 00:27:45.067 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:27:45.067 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:27:45.067 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:27:45.067 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=6068285440 00:27:45.067 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=6068289536 00:27:45.067 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:27:45.067 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:27:45.067 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:27:45.067 * Looking for test storage... 00:27:45.067 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@368 -- # local target_space new_size 00:27:45.067 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:27:45.067 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:27:45.067 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:27:45.067 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@372 -- # mount=/ 00:27:45.067 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@374 -- # target_space=53678231552 00:27:45.067 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:27:45.067 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:27:45.067 08:40:57 reactor_set_interrupt 
-- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:27:45.067 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:27:45.067 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:27:45.067 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@381 -- # new_size=9219293184 00:27:45.342 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:27:45.342 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:27:45.342 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:27:45.342 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:27:45.342 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:27:45.342 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@389 -- # return 0 00:27:45.342 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@1682 -- # set -o errtrace 00:27:45.342 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:27:45.342 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:27:45.342 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:27:45.342 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@1687 -- # true 00:27:45.342 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@1689 -- # xtrace_fd 00:27:45.342 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:27:45.342 08:40:57 reactor_set_interrupt 
-- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:27:45.342 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@27 -- # exec 00:27:45.342 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@29 -- # exec 00:27:45.342 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@31 -- # xtrace_restore 00:27:45.342 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:27:45.342 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:27:45.342 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@18 -- # set -x 00:27:45.342 08:40:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:27:45.342 08:40:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:45.342 08:40:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:27:45.342 08:40:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:27:45.342 08:40:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:27:45.342 08:40:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:27:45.342 08:40:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:27:45.342 08:40:57 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:27:45.342 08:40:57 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@11 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:27:45.342 08:40:57 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@86 -- # start_intr_tgt 00:27:45.342 08:40:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:45.343 08:40:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:27:45.343 08:40:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=1595174 00:27:45.343 08:40:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:27:45.343 08:40:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:27:45.343 08:40:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 1595174 /var/tmp/spdk.sock 00:27:45.343 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@829 -- # '[' -z 1595174 ']' 00:27:45.343 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:45.343 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:45.343 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:45.343 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:27:45.343 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:45.343 08:40:57 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:27:45.343 [2024-07-23 08:40:57.643550] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:27:45.343 [2024-07-23 08:40:57.643657] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1595174 ] 00:27:45.343 [2024-07-23 08:40:57.775513] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:27:45.601 [2024-07-23 08:40:57.988364] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:27:45.601 [2024-07-23 08:40:57.988432] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:45.601 [2024-07-23 08:40:57.988437] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:27:45.860 [2024-07-23 08:40:58.321407] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:27:46.119 08:40:58 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:46.119 08:40:58 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0 00:27:46.119 08:40:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@87 -- # setup_bdev_mem 00:27:46.119 08:40:58 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:46.378 Malloc0 00:27:46.378 Malloc1 00:27:46.378 Malloc2 00:27:46.378 08:40:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@88 -- # setup_bdev_aio 00:27:46.378 08:40:58 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:27:46.378 08:40:58 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:27:46.378 08:40:58 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:27:46.378 5000+0 records in 00:27:46.378 5000+0 records out 00:27:46.378 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0105829 s, 968 MB/s 00:27:46.378 08:40:58 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:27:46.638 AIO0 00:27:46.638 08:40:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@90 -- # reactor_set_mode_without_threads 1595174 00:27:46.638 08:40:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@76 -- # reactor_set_intr_mode 1595174 without_thd 00:27:46.638 08:40:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=1595174 00:27:46.638 08:40:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd=without_thd 00:27:46.638 08:40:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 
00:27:46.638 08:40:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1
00:27:46.638 08:40:58 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1
00:27:46.638 08:40:58 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str
00:27:46.638 08:40:58 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1
00:27:46.638 08:40:58 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'
00:27:46.638 08:40:58 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats
00:27:46.638 08:40:58 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'
00:27:46.638 08:40:59 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1
00:27:46.638 08:40:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask))
00:27:46.638 08:40:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4
00:27:46.638 08:40:59 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4
00:27:46.638 08:40:59 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str
00:27:46.638 08:40:59 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4
00:27:46.638 08:40:59 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'
00:27:46.897 08:40:59 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'
00:27:46.897 08:40:59 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats
00:27:46.897 08:40:59 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo ''
00:27:46.897 08:40:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]]
00:27:46.897 08:40:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.'
00:27:46.897 spdk_thread ids are 1 on reactor0.
00:27:46.897 08:40:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2}
00:27:46.897 08:40:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1595174 0
00:27:46.897 08:40:59 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1595174 0 idle
00:27:46.897 08:40:59 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1595174
00:27:46.897 08:40:59 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0
00:27:46.897 08:40:59 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle
00:27:46.897 08:40:59 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]]
00:27:46.897 08:40:59 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]]
00:27:46.897 08:40:59 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top
00:27:46.897 08:40:59 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 ))
00:27:46.897 08:40:59 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 ))
00:27:46.897 08:40:59 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1595174 -w 256
00:27:46.897 08:40:59 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0
00:27:47.157 08:40:59 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1595174 root 20 0 20.1t 198144 35328 S 0.0 0.3 0:00.86 reactor_0'
00:27:47.157 08:40:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1595174 root 20 0 20.1t 198144 35328 S 0.0 0.3 0:00.86 reactor_0
00:27:47.157 08:40:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g'
00:27:47.157 08:40:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}'
00:27:47.157 08:40:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0
00:27:47.157 08:40:59 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0
00:27:47.157 08:40:59 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]]
00:27:47.157 08:40:59 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]]
00:27:47.157 08:40:59 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]]
00:27:47.157 08:40:59 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0
00:27:47.157 08:40:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2}
00:27:47.157 08:40:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1595174 1
00:27:47.157 08:40:59 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1595174 1 idle
00:27:47.157 08:40:59 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1595174
00:27:47.157 08:40:59 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1
00:27:47.157 08:40:59 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle
00:27:47.157 08:40:59 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]]
00:27:47.157 08:40:59 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]]
00:27:47.157 08:40:59 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top
00:27:47.157 08:40:59 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 ))
00:27:47.157 08:40:59 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 ))
00:27:47.157 08:40:59 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1595174 -w 256
00:27:47.157 08:40:59 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1
00:27:47.157 08:40:59 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1595264 root 20 0 20.1t 198144 35328 S 0.0 0.3 0:00.00 reactor_1'
00:27:47.157 08:40:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1595264 root 20 0 20.1t 198144 35328 S 0.0 0.3 0:00.00 reactor_1
00:27:47.157 08:40:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g'
00:27:47.157 08:40:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}'
00:27:47.416 08:40:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0
00:27:47.416 08:40:59 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0
00:27:47.416 08:40:59 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]]
00:27:47.416 08:40:59 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]]
00:27:47.416 08:40:59 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]]
00:27:47.416 08:40:59 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0
00:27:47.416 08:40:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2}
00:27:47.416 08:40:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1595174 2
00:27:47.417 08:40:59 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1595174 2 idle
00:27:47.417 08:40:59 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1595174
00:27:47.417 08:40:59 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2
00:27:47.417 08:40:59 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle
00:27:47.417 08:40:59 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]]
00:27:47.417 08:40:59 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]]
00:27:47.417 08:40:59 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top
00:27:47.417 08:40:59 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 ))
00:27:47.417 08:40:59 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 ))
00:27:47.417 08:40:59 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1595174 -w 256
00:27:47.417 08:40:59 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2
00:27:47.417 08:40:59 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1595266 root 20 0 20.1t 198144 35328 S 0.0 0.3 0:00.00 reactor_2'
00:27:47.417 08:40:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1595266 root 20 0 20.1t 198144 35328 S 0.0 0.3 0:00.00 reactor_2
00:27:47.417 08:40:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g'
00:27:47.417 08:40:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}'
00:27:47.417 08:40:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0
00:27:47.417 08:40:59 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0
00:27:47.417 08:40:59 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]]
00:27:47.417 08:40:59 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]]
00:27:47.417 08:40:59 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]]
00:27:47.417 08:40:59 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0
00:27:47.417 08:40:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' without_thdx '!=' x ']'
00:27:47.417 08:40:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@35 -- # for i in "${thd0_ids[@]}"
00:27:47.417 08:40:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@36 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x2
00:27:47.675 [2024-07-23 08:41:00.009364] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode.
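Editor's aside: the `reactor_is_busy_or_idle` checks above grep one thread row out of `top -bHn 1` output, take field 9 (%CPU) with `awk '{print $9}'`, truncate it to an integer, and then require at least 70% for "busy" and at most 30% for "idle" (thresholds read off the `[[ 99 -lt 70 ]]` and `[[ 0 -gt 30 ]]` comparisons in the log). A small Python sketch of that classification, using a top line copied from the log; the `classify` helper and the "indeterminate" bucket are illustrative, not part of the test scripts:

```python
def classify(top_line: str) -> str:
    # awk '{print $9}': field 9 of a `top -bHn 1` row is %CPU;
    # bash then truncates it to an integer (cpu_rate=99.9 -> 99).
    cpu_rate = int(float(top_line.split()[8]))
    if cpu_rate >= 70:       # busy unless [[ cpu_rate -lt 70 ]]
        return "busy"
    if cpu_rate <= 30:       # idle unless [[ cpu_rate -gt 30 ]]
        return "idle"
    return "indeterminate"   # the test would retry (up to j=10 times)

line = "1595174 root 20 0 20.1t 201216 35328 R 99.9 0.3 0:01.23 reactor_0"
print(classify(line))  # → busy
```

The 31-69% band is deliberately neither busy nor idle; in the log this shows up as the `(( j != 0 ))` retry loop sampling `top` again.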
00:27:47.675 08:41:00 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d
00:27:47.675 [2024-07-23 08:41:00.185177] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0.
00:27:47.675 [2024-07-23 08:41:00.185426] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch
00:27:47.935 08:41:00 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d
00:27:47.935 [2024-07-23 08:41:00.373070] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2.
00:27:47.935 [2024-07-23 08:41:00.373244] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch
00:27:47.935 08:41:00 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2
00:27:47.935 08:41:00 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1595174 0
00:27:47.935 08:41:00 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1595174 0 busy
00:27:47.935 08:41:00 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1595174
00:27:47.935 08:41:00 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0
00:27:47.935 08:41:00 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy
00:27:47.935 08:41:00 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]]
00:27:47.935 08:41:00 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top
00:27:47.935 08:41:00 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 ))
00:27:47.935 08:41:00 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 ))
00:27:47.935 08:41:00 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1595174 -w 256
00:27:47.935 08:41:00 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0
00:27:48.194 08:41:00 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1595174 root 20 0 20.1t 201216 35328 R 99.9 0.3 0:01.23 reactor_0'
00:27:48.194 08:41:00 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1595174 root 20 0 20.1t 201216 35328 R 99.9 0.3 0:01.23 reactor_0
00:27:48.194 08:41:00 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g'
00:27:48.194 08:41:00 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}'
00:27:48.194 08:41:00 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9
00:27:48.194 08:41:00 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99
00:27:48.194 08:41:00 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]]
00:27:48.194 08:41:00 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]]
00:27:48.194 08:41:00 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]]
00:27:48.194 08:41:00 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0
00:27:48.194 08:41:00 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2
00:27:48.194 08:41:00 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1595174 2
00:27:48.194 08:41:00 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1595174 2 busy
00:27:48.194 08:41:00 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1595174
00:27:48.194 08:41:00 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2
00:27:48.194 08:41:00 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy
00:27:48.194 08:41:00 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]]
00:27:48.194 08:41:00 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top
00:27:48.194 08:41:00 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 ))
00:27:48.194 08:41:00 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 ))
00:27:48.194 08:41:00 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1595174 -w 256
00:27:48.194 08:41:00 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2
00:27:48.454 08:41:00 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1595266 root 20 0 20.1t 201216 35328 R 93.8 0.3 0:00.35 reactor_2'
00:27:48.454 08:41:00 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1595266 root 20 0 20.1t 201216 35328 R 93.8 0.3 0:00.35 reactor_2
00:27:48.454 08:41:00 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}'
00:27:48.454 08:41:00 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g'
00:27:48.454 08:41:00 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=93.8
00:27:48.454 08:41:00 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=93
00:27:48.454 08:41:00 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]]
00:27:48.454 08:41:00 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 93 -lt 70 ]]
00:27:48.454 08:41:00 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]]
00:27:48.454 08:41:00 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0
00:27:48.454 08:41:00 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2
00:27:48.454 [2024-07-23 08:41:00.901078] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2.
00:27:48.454 [2024-07-23 08:41:00.901205] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch
00:27:48.454 08:41:00 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' without_thdx '!=' x ']'
00:27:48.454 08:41:00 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 1595174 2
00:27:48.454 08:41:00 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1595174 2 idle
00:27:48.454 08:41:00 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1595174
00:27:48.454 08:41:00 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2
00:27:48.454 08:41:00 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle
00:27:48.454 08:41:00 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]]
00:27:48.454 08:41:00 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]]
00:27:48.454 08:41:00 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top
00:27:48.454 08:41:00 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 ))
00:27:48.454 08:41:00 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 ))
00:27:48.454 08:41:00 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2
00:27:48.454 08:41:00 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1595174 -w 256
00:27:48.714 08:41:01 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1595266 root 20 0 20.1t 201216 35328 S 0.0 0.3 0:00.52 reactor_2'
00:27:48.714 08:41:01 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1595266 root 20 0 20.1t 201216 35328 S 0.0 0.3 0:00.52 reactor_2
00:27:48.714 08:41:01 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g'
00:27:48.714 08:41:01 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}'
00:27:48.714 08:41:01 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0
00:27:48.714 08:41:01 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0
00:27:48.714 08:41:01 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]]
00:27:48.714 08:41:01 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]]
00:27:48.714 08:41:01 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]]
00:27:48.714 08:41:01 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0
00:27:48.714 08:41:01 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0
00:27:48.973 [2024-07-23 08:41:01.249067] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0.
00:27:48.973 [2024-07-23 08:41:01.249208] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch
00:27:48.973 08:41:01 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' without_thdx '!=' x ']'
00:27:48.973 08:41:01 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@65 -- # for i in "${thd0_ids[@]}"
00:27:48.973 08:41:01 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@66 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x1
00:27:48.973 [2024-07-23 08:41:01.429329] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode.
00:27:48.973 08:41:01 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 1595174 0
00:27:48.973 08:41:01 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1595174 0 idle
00:27:48.973 08:41:01 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1595174
00:27:48.973 08:41:01 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0
00:27:48.973 08:41:01 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle
00:27:48.973 08:41:01 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]]
00:27:48.973 08:41:01 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]]
00:27:48.973 08:41:01 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top
00:27:48.973 08:41:01 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 ))
00:27:48.973 08:41:01 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 ))
00:27:48.973 08:41:01 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1595174 -w 256
00:27:48.973 08:41:01 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0
00:27:49.232 08:41:01 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1595174 root 20 0 20.1t 201216 35328 S 0.0 0.3 0:01.92 reactor_0'
00:27:49.232 08:41:01 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1595174 root 20 0 20.1t 201216 35328 S 0.0 0.3 0:01.92 reactor_0
00:27:49.232 08:41:01 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g'
00:27:49.232 08:41:01 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}'
00:27:49.232 08:41:01 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0
00:27:49.233 08:41:01 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0
00:27:49.233 08:41:01 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]]
00:27:49.233 08:41:01 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]]
00:27:49.233 08:41:01 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]]
00:27:49.233 08:41:01 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0
00:27:49.233 08:41:01 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0
00:27:49.233 08:41:01 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@77 -- # return 0
00:27:49.233 08:41:01 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@92 -- # trap - SIGINT SIGTERM EXIT
00:27:49.233 08:41:01 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@93 -- # killprocess 1595174
00:27:49.233 08:41:01 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 1595174 ']'
00:27:49.233 08:41:01 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 1595174
00:27:49.233 08:41:01 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname
00:27:49.233 08:41:01 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:27:49.233 08:41:01 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1595174
00:27:49.233 08:41:01 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:27:49.233 08:41:01 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:27:49.233 08:41:01 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1595174'
00:27:49.233 killing process with pid 1595174
00:27:49.233 08:41:01 reactor_set_interrupt -- common/autotest_common.sh@967 -- # kill 1595174
00:27:49.233 08:41:01 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 1595174
00:27:51.138 08:41:03 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@94 -- # cleanup
00:27:51.138 08:41:03 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile
00:27:51.138 08:41:03 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@97 -- # start_intr_tgt
00:27:51.138 08:41:03 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock
00:27:51.138 08:41:03 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07
00:27:51.138 08:41:03 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=1596346
00:27:51.138 08:41:03 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g
00:27:51.138 08:41:03 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT
00:27:51.138 08:41:03 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 1596346 /var/tmp/spdk.sock
00:27:51.138 08:41:03 reactor_set_interrupt -- common/autotest_common.sh@829 -- # '[' -z 1596346 ']'
00:27:51.138 08:41:03 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:27:51.138 08:41:03 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100
00:27:51.138 08:41:03 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:27:51.138 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:27:51.138 08:41:03 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable
00:27:51.138 08:41:03 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x
00:27:51.138 [2024-07-23 08:41:03.287144] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization...
00:27:51.138 [2024-07-23 08:41:03.287240] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1596346 ]
00:27:51.138 [2024-07-23 08:41:03.407243] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3
00:27:51.138 [2024-07-23 08:41:03.641371] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:27:51.138 [2024-07-23 08:41:03.641381] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:27:51.138 [2024-07-23 08:41:03.641386] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2
00:27:51.705 [2024-07-23 08:41:04.006844] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode.
00:27:51.705 08:41:04 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:27:51.705 08:41:04 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0
00:27:51.705 08:41:04 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@98 -- # setup_bdev_mem
00:27:51.705 08:41:04 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:27:51.964 Malloc0
00:27:51.964 Malloc1
00:27:51.964 Malloc2
00:27:51.964 08:41:04 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@99 -- # setup_bdev_aio
00:27:51.964 08:41:04 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s
00:27:51.964 08:41:04 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]]
00:27:51.964 08:41:04 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000
00:27:51.964 5000+0 records in
00:27:51.964 5000+0 records out
00:27:51.964 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0175183 s, 585 MB/s
00:27:51.964 08:41:04 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048
00:27:52.222 AIO0
00:27:52.222 08:41:04 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@101 -- # reactor_set_mode_with_threads 1596346
00:27:52.222 08:41:04 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@81 -- # reactor_set_intr_mode 1596346
00:27:52.222 08:41:04 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=1596346
00:27:52.222 08:41:04 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd=
00:27:52.222 08:41:04 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask))
00:27:52.222 08:41:04 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1
00:27:52.222 08:41:04 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1
00:27:52.222 08:41:04 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str
00:27:52.222 08:41:04 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1
00:27:52.222 08:41:04 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'
00:27:52.222 08:41:04 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats
00:27:52.222 08:41:04 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'
00:27:52.481 08:41:04 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1
00:27:52.481 08:41:04 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask))
00:27:52.481 08:41:04 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4
00:27:52.481 08:41:04 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4
00:27:52.481 08:41:04 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str
00:27:52.481 08:41:04 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4
00:27:52.481 08:41:04 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'
00:27:52.481 08:41:04 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats
00:27:52.481 08:41:04 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo ''
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]]
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.'
00:27:52.740 spdk_thread ids are 1 on reactor0.
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2}
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1596346 0
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1596346 0 idle
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1596346
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]]
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]]
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 ))
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 ))
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1596346 -w 256
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1596346 root 20 0 20.1t 199680 36864 S 0.0 0.3 0:00.90 reactor_0'
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1596346 root 20 0 20.1t 199680 36864 S 0.0 0.3 0:00.90 reactor_0
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g'
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}'
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]]
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]]
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]]
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2}
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1596346 1
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1596346 1 idle
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1596346
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]]
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]]
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 ))
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 ))
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1
00:27:52.740 08:41:05 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1596346 -w 256
00:27:52.999 08:41:05 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1596404 root 20 0 20.1t 199680 36864 S 0.0 0.3 0:00.00 reactor_1'
00:27:52.999 08:41:05 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1596404 root 20 0 20.1t 199680 36864 S 0.0 0.3 0:00.00 reactor_1
00:27:52.999 08:41:05 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}'
00:27:52.999 08:41:05 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g'
00:27:52.999 08:41:05 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0
00:27:52.999 08:41:05 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0
00:27:52.999 08:41:05 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]]
00:27:52.999 08:41:05 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]]
00:27:52.999 08:41:05 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]]
00:27:52.999 08:41:05 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0
00:27:52.999 08:41:05 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2}
00:27:52.999 08:41:05 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1596346 2
00:27:52.999 08:41:05 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1596346 2 idle
00:27:53.000 08:41:05 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1596346
00:27:53.000 08:41:05 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2
00:27:53.000 08:41:05 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle
00:27:53.000 08:41:05 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]]
00:27:53.000 08:41:05 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]]
00:27:53.000 08:41:05 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top
00:27:53.000 08:41:05 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 ))
00:27:53.000 08:41:05 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 ))
00:27:53.000 08:41:05 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1596346 -w 256
00:27:53.000 08:41:05 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2
00:27:53.258 08:41:05 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1596405 root 20 0 20.1t 199680 36864 S 0.0 0.3 0:00.00 reactor_2'
00:27:53.258 08:41:05 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1596405 root 20 0 20.1t 199680 36864 S 0.0 0.3 0:00.00 reactor_2
00:27:53.258 08:41:05 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}'
00:27:53.258 08:41:05 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g'
00:27:53.258 08:41:05 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0
00:27:53.258 08:41:05 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0
00:27:53.258 08:41:05 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]]
00:27:53.258 08:41:05 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]]
00:27:53.258 08:41:05 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]]
00:27:53.258 08:41:05 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0
00:27:53.258 08:41:05 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' x '!=' x ']'
00:27:53.258 08:41:05 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d
00:27:53.258 [2024-07-23 08:41:05.682129] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0.
00:27:53.258 [2024-07-23 08:41:05.682282] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to poll mode from intr mode.
00:27:53.258 [2024-07-23 08:41:05.682514] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:27:53.258 08:41:05 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:27:53.518 [2024-07-23 08:41:05.870552] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:27:53.518 [2024-07-23 08:41:05.870843] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:27:53.518 08:41:05 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:27:53.518 08:41:05 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1596346 0 00:27:53.518 08:41:05 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1596346 0 busy 00:27:53.518 08:41:05 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1596346 00:27:53.518 08:41:05 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:27:53.518 08:41:05 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:27:53.518 08:41:05 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:27:53.518 08:41:05 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:27:53.518 08:41:05 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:27:53.518 08:41:05 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:27:53.518 08:41:05 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1596346 -w 256 00:27:53.518 08:41:05 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:27:53.777 08:41:06 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1596346 root 20 0 20.1t 203520 36864 R 99.9 0.3 0:01.27 reactor_0' 00:27:53.777 08:41:06 reactor_set_interrupt -- 
interrupt/common.sh@25 -- # echo 1596346 root 20 0 20.1t 203520 36864 R 99.9 0.3 0:01.27 reactor_0 00:27:53.777 08:41:06 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:27:53.777 08:41:06 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:27:53.777 08:41:06 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:27:53.777 08:41:06 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:27:53.777 08:41:06 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:27:53.777 08:41:06 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:27:53.777 08:41:06 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:27:53.777 08:41:06 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:27:53.777 08:41:06 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:27:53.777 08:41:06 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1596346 2 00:27:53.777 08:41:06 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1596346 2 busy 00:27:53.777 08:41:06 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1596346 00:27:53.777 08:41:06 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:27:53.777 08:41:06 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:27:53.777 08:41:06 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:27:53.777 08:41:06 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:27:53.777 08:41:06 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:27:53.777 08:41:06 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:27:53.777 08:41:06 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1596346 -w 256 00:27:53.777 08:41:06 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:27:53.777 
08:41:06 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1596405 root 20 0 20.1t 203520 36864 R 99.9 0.3 0:00.35 reactor_2' 00:27:53.777 08:41:06 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1596405 root 20 0 20.1t 203520 36864 R 99.9 0.3 0:00.35 reactor_2 00:27:53.777 08:41:06 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:27:53.777 08:41:06 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:27:53.777 08:41:06 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:27:53.777 08:41:06 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:27:53.777 08:41:06 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:27:53.777 08:41:06 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:27:53.777 08:41:06 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:27:53.777 08:41:06 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:27:53.777 08:41:06 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:27:54.037 [2024-07-23 08:41:06.395992] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
00:27:54.037 [2024-07-23 08:41:06.396138] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:27:54.037 08:41:06 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' x '!=' x ']' 00:27:54.037 08:41:06 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 1596346 2 00:27:54.037 08:41:06 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1596346 2 idle 00:27:54.037 08:41:06 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1596346 00:27:54.037 08:41:06 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:27:54.037 08:41:06 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:27:54.037 08:41:06 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:27:54.037 08:41:06 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:27:54.037 08:41:06 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:27:54.037 08:41:06 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:27:54.037 08:41:06 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:27:54.037 08:41:06 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1596346 -w 256 00:27:54.037 08:41:06 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:27:54.296 08:41:06 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1596405 root 20 0 20.1t 203520 36864 S 0.0 0.3 0:00.52 reactor_2' 00:27:54.296 08:41:06 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:27:54.296 08:41:06 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1596405 root 20 0 20.1t 203520 36864 S 0.0 0.3 0:00.52 reactor_2 00:27:54.296 08:41:06 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:27:54.296 08:41:06 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:27:54.296 08:41:06 
reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:27:54.296 08:41:06 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:27:54.296 08:41:06 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:27:54.296 08:41:06 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:27:54.297 08:41:06 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:27:54.297 08:41:06 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:27:54.297 [2024-07-23 08:41:06.748913] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:27:54.297 [2024-07-23 08:41:06.749154] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from poll mode. 00:27:54.297 [2024-07-23 08:41:06.749181] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:27:54.297 08:41:06 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' x '!=' x ']' 00:27:54.297 08:41:06 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 1596346 0 00:27:54.297 08:41:06 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1596346 0 idle 00:27:54.297 08:41:06 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1596346 00:27:54.297 08:41:06 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:27:54.297 08:41:06 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:27:54.297 08:41:06 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:27:54.297 08:41:06 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:27:54.297 08:41:06 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:27:54.297 08:41:06 
reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:27:54.297 08:41:06 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:27:54.297 08:41:06 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1596346 -w 256 00:27:54.297 08:41:06 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:27:54.556 08:41:06 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1596346 root 20 0 20.1t 203520 36864 S 0.0 0.3 0:01.96 reactor_0' 00:27:54.556 08:41:06 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1596346 root 20 0 20.1t 203520 36864 S 0.0 0.3 0:01.96 reactor_0 00:27:54.556 08:41:06 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:27:54.556 08:41:06 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:27:54.556 08:41:06 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:27:54.556 08:41:06 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:27:54.556 08:41:06 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:27:54.556 08:41:06 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:27:54.556 08:41:06 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:27:54.556 08:41:06 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:27:54.556 08:41:06 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:27:54.556 08:41:06 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@82 -- # return 0 00:27:54.556 08:41:06 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@103 -- # trap - SIGINT SIGTERM EXIT 00:27:54.556 08:41:06 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@104 -- # killprocess 1596346 00:27:54.556 08:41:06 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 1596346 ']' 00:27:54.556 08:41:06 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 
1596346 00:27:54.556 08:41:06 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname 00:27:54.556 08:41:06 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:54.556 08:41:06 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1596346 00:27:54.556 08:41:06 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:54.556 08:41:06 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:54.556 08:41:06 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1596346' 00:27:54.556 killing process with pid 1596346 00:27:54.556 08:41:06 reactor_set_interrupt -- common/autotest_common.sh@967 -- # kill 1596346 00:27:54.556 08:41:06 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 1596346 00:27:56.466 08:41:08 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@105 -- # cleanup 00:27:56.466 08:41:08 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:27:56.466 00:27:56.466 real 0m11.209s 00:27:56.466 user 0m11.120s 00:27:56.466 sys 0m1.733s 00:27:56.466 08:41:08 reactor_set_interrupt -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:56.466 08:41:08 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:27:56.466 ************************************ 00:27:56.466 END TEST reactor_set_interrupt 00:27:56.466 ************************************ 00:27:56.466 08:41:08 -- common/autotest_common.sh@1142 -- # return 0 00:27:56.466 08:41:08 -- spdk/autotest.sh@194 -- # run_test reap_unregistered_poller /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:27:56.466 08:41:08 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:27:56.466 08:41:08 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:56.466 08:41:08 -- 
common/autotest_common.sh@10 -- # set +x 00:27:56.466 ************************************ 00:27:56.466 START TEST reap_unregistered_poller 00:27:56.466 ************************************ 00:27:56.466 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:27:56.466 * Looking for test storage... 00:27:56.466 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:27:56.466 08:41:08 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:27:56.466 08:41:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:27:56.466 08:41:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:27:56.466 08:41:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:27:56.466 08:41:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
00:27:56.466 08:41:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:27:56.466 08:41:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:27:56.466 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:27:56.466 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@34 -- # set -e 00:27:56.466 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:27:56.466 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@36 -- # shopt -s extglob 00:27:56.466 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:27:56.466 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:27:56.466 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:27:56.466 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:27:56.466 08:41:08 reap_unregistered_poller -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:27:56.466 08:41:08 reap_unregistered_poller -- common/build_config.sh@2 -- # CONFIG_ASAN=y 00:27:56.466 08:41:08 reap_unregistered_poller -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:27:56.466 08:41:08 reap_unregistered_poller -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:27:56.466 08:41:08 reap_unregistered_poller -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:27:56.466 08:41:08 reap_unregistered_poller -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:27:56.466 08:41:08 reap_unregistered_poller -- common/build_config.sh@7 -- # 
CONFIG_PREFIX=/usr/local 00:27:56.466 08:41:08 reap_unregistered_poller -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:27:56.466 08:41:08 reap_unregistered_poller -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:27:56.466 08:41:08 reap_unregistered_poller -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:27:56.466 08:41:08 reap_unregistered_poller -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:27:56.466 08:41:08 reap_unregistered_poller -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:27:56.466 08:41:08 reap_unregistered_poller -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:27:56.466 08:41:08 reap_unregistered_poller -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:27:56.466 08:41:08 reap_unregistered_poller -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:27:56.466 08:41:08 reap_unregistered_poller -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:27:56.466 08:41:08 reap_unregistered_poller -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:27:56.466 08:41:08 reap_unregistered_poller -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:27:56.466 08:41:08 reap_unregistered_poller -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:27:56.466 08:41:08 reap_unregistered_poller -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:27:56.466 08:41:08 reap_unregistered_poller -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:27:56.466 08:41:08 reap_unregistered_poller -- common/build_config.sh@22 -- # CONFIG_CET=n 00:27:56.466 08:41:08 reap_unregistered_poller -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:27:56.466 08:41:08 reap_unregistered_poller -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:27:56.466 08:41:08 reap_unregistered_poller -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:27:56.466 08:41:08 reap_unregistered_poller -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:27:56.466 
08:41:08 reap_unregistered_poller -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:27:56.466 08:41:08 reap_unregistered_poller -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:27:56.466 08:41:08 reap_unregistered_poller -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:27:56.466 08:41:08 reap_unregistered_poller -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:27:56.466 08:41:08 reap_unregistered_poller -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:27:56.466 08:41:08 reap_unregistered_poller -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:27:56.466 08:41:08 reap_unregistered_poller -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:27:56.466 08:41:08 reap_unregistered_poller -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:27:56.466 08:41:08 reap_unregistered_poller -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:27:56.466 08:41:08 reap_unregistered_poller -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:27:56.466 08:41:08 reap_unregistered_poller -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:27:56.466 08:41:08 reap_unregistered_poller -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:27:56.467 08:41:08 reap_unregistered_poller -- 
common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@65 
-- # CONFIG_APPS=y 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@70 -- # CONFIG_FC=n 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:27:56.467 08:41:08 reap_unregistered_poller -- common/build_config.sh@83 -- # CONFIG_URING=n 00:27:56.467 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:27:56.467 08:41:08 
reap_unregistered_poller -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:27:56.467 08:41:08 reap_unregistered_poller -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:27:56.467 08:41:08 reap_unregistered_poller -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:27:56.467 08:41:08 reap_unregistered_poller -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:27:56.467 08:41:08 reap_unregistered_poller -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:27:56.467 08:41:08 reap_unregistered_poller -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:27:56.467 08:41:08 reap_unregistered_poller -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:27:56.467 08:41:08 reap_unregistered_poller -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:27:56.467 08:41:08 reap_unregistered_poller -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:27:56.467 08:41:08 reap_unregistered_poller -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:27:56.467 08:41:08 reap_unregistered_poller -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:27:56.467 08:41:08 reap_unregistered_poller -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:27:56.467 08:41:08 reap_unregistered_poller -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:27:56.467 08:41:08 reap_unregistered_poller -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:27:56.467 08:41:08 reap_unregistered_poller -- common/applications.sh@23 -- # [[ #ifndef 
SPDK_CONFIG_H 00:27:56.467 #define SPDK_CONFIG_H 00:27:56.467 #define SPDK_CONFIG_APPS 1 00:27:56.467 #define SPDK_CONFIG_ARCH native 00:27:56.467 #define SPDK_CONFIG_ASAN 1 00:27:56.467 #undef SPDK_CONFIG_AVAHI 00:27:56.467 #undef SPDK_CONFIG_CET 00:27:56.467 #define SPDK_CONFIG_COVERAGE 1 00:27:56.467 #define SPDK_CONFIG_CROSS_PREFIX 00:27:56.467 #define SPDK_CONFIG_CRYPTO 1 00:27:56.467 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:27:56.467 #undef SPDK_CONFIG_CUSTOMOCF 00:27:56.467 #undef SPDK_CONFIG_DAOS 00:27:56.467 #define SPDK_CONFIG_DAOS_DIR 00:27:56.467 #define SPDK_CONFIG_DEBUG 1 00:27:56.467 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:27:56.467 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:27:56.467 #define SPDK_CONFIG_DPDK_INC_DIR 00:27:56.467 #define SPDK_CONFIG_DPDK_LIB_DIR 00:27:56.467 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:27:56.467 #undef SPDK_CONFIG_DPDK_UADK 00:27:56.467 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:27:56.467 #define SPDK_CONFIG_EXAMPLES 1 00:27:56.467 #undef SPDK_CONFIG_FC 00:27:56.467 #define SPDK_CONFIG_FC_PATH 00:27:56.467 #define SPDK_CONFIG_FIO_PLUGIN 1 00:27:56.467 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:27:56.467 #undef SPDK_CONFIG_FUSE 00:27:56.467 #undef SPDK_CONFIG_FUZZER 00:27:56.467 #define SPDK_CONFIG_FUZZER_LIB 00:27:56.467 #undef SPDK_CONFIG_GOLANG 00:27:56.467 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:27:56.467 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:27:56.467 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:27:56.467 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:27:56.467 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:27:56.467 #undef SPDK_CONFIG_HAVE_LIBBSD 00:27:56.467 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:27:56.467 #define SPDK_CONFIG_IDXD 1 00:27:56.467 #define SPDK_CONFIG_IDXD_KERNEL 1 00:27:56.467 #define SPDK_CONFIG_IPSEC_MB 1 00:27:56.467 #define SPDK_CONFIG_IPSEC_MB_DIR 
/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:27:56.467 #define SPDK_CONFIG_ISAL 1 00:27:56.467 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:27:56.467 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:27:56.467 #define SPDK_CONFIG_LIBDIR 00:27:56.467 #undef SPDK_CONFIG_LTO 00:27:56.467 #define SPDK_CONFIG_MAX_LCORES 128 00:27:56.467 #define SPDK_CONFIG_NVME_CUSE 1 00:27:56.467 #undef SPDK_CONFIG_OCF 00:27:56.467 #define SPDK_CONFIG_OCF_PATH 00:27:56.467 #define SPDK_CONFIG_OPENSSL_PATH 00:27:56.467 #undef SPDK_CONFIG_PGO_CAPTURE 00:27:56.467 #define SPDK_CONFIG_PGO_DIR 00:27:56.467 #undef SPDK_CONFIG_PGO_USE 00:27:56.467 #define SPDK_CONFIG_PREFIX /usr/local 00:27:56.467 #undef SPDK_CONFIG_RAID5F 00:27:56.467 #undef SPDK_CONFIG_RBD 00:27:56.467 #define SPDK_CONFIG_RDMA 1 00:27:56.467 #define SPDK_CONFIG_RDMA_PROV verbs 00:27:56.467 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:27:56.467 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:27:56.467 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:27:56.467 #define SPDK_CONFIG_SHARED 1 00:27:56.467 #undef SPDK_CONFIG_SMA 00:27:56.467 #define SPDK_CONFIG_TESTS 1 00:27:56.467 #undef SPDK_CONFIG_TSAN 00:27:56.467 #define SPDK_CONFIG_UBLK 1 00:27:56.467 #define SPDK_CONFIG_UBSAN 1 00:27:56.467 #undef SPDK_CONFIG_UNIT_TESTS 00:27:56.467 #undef SPDK_CONFIG_URING 00:27:56.467 #define SPDK_CONFIG_URING_PATH 00:27:56.467 #undef SPDK_CONFIG_URING_ZNS 00:27:56.467 #undef SPDK_CONFIG_USDT 00:27:56.467 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:27:56.467 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:27:56.467 #undef SPDK_CONFIG_VFIO_USER 00:27:56.467 #define SPDK_CONFIG_VFIO_USER_DIR 00:27:56.467 #define SPDK_CONFIG_VHOST 1 00:27:56.467 #define SPDK_CONFIG_VIRTIO 1 00:27:56.467 #undef SPDK_CONFIG_VTUNE 00:27:56.467 #define SPDK_CONFIG_VTUNE_DIR 00:27:56.467 #define SPDK_CONFIG_WERROR 1 00:27:56.467 #define SPDK_CONFIG_WPDK_DIR 00:27:56.467 #undef SPDK_CONFIG_XNVME 00:27:56.467 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ 
\S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:27:56.467 08:41:08 reap_unregistered_poller -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:27:56.467 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:27:56.467 08:41:08 reap_unregistered_poller -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:27:56.467 08:41:08 reap_unregistered_poller -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:27:56.467 08:41:08 reap_unregistered_poller -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:27:56.468 08:41:08 reap_unregistered_poller -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:27:56.468 08:41:08 reap_unregistered_poller -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:27:56.468 08:41:08 reap_unregistered_poller -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:27:56.468 08:41:08 reap_unregistered_poller -- paths/export.sh@5 -- # export PATH 00:27:56.468 08:41:08 reap_unregistered_poller -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:27:56.468 08:41:08 reap_unregistered_poller -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:27:56.468 08:41:08 reap_unregistered_poller -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:27:56.468 08:41:08 reap_unregistered_poller -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:27:56.468 08:41:08 reap_unregistered_poller -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:27:56.468 08:41:08 reap_unregistered_poller -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:27:56.468 08:41:08 reap_unregistered_poller -- pm/common@64 -- # TEST_TAG=N/A 00:27:56.468 08:41:08 reap_unregistered_poller -- pm/common@65 -- # 
TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:27:56.468 08:41:08 reap_unregistered_poller -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:27:56.468 08:41:08 reap_unregistered_poller -- pm/common@68 -- # uname -s 00:27:56.468 08:41:08 reap_unregistered_poller -- pm/common@68 -- # PM_OS=Linux 00:27:56.468 08:41:08 reap_unregistered_poller -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:27:56.468 08:41:08 reap_unregistered_poller -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:27:56.468 08:41:08 reap_unregistered_poller -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:27:56.468 08:41:08 reap_unregistered_poller -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:27:56.468 08:41:08 reap_unregistered_poller -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:27:56.468 08:41:08 reap_unregistered_poller -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:27:56.468 08:41:08 reap_unregistered_poller -- pm/common@76 -- # SUDO[0]= 00:27:56.468 08:41:08 reap_unregistered_poller -- pm/common@76 -- # SUDO[1]='sudo -E' 00:27:56.468 08:41:08 reap_unregistered_poller -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:27:56.468 08:41:08 reap_unregistered_poller -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:27:56.468 08:41:08 reap_unregistered_poller -- pm/common@81 -- # [[ Linux == Linux ]] 00:27:56.468 08:41:08 reap_unregistered_poller -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:27:56.468 08:41:08 reap_unregistered_poller -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:27:56.468 08:41:08 reap_unregistered_poller -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:27:56.468 08:41:08 reap_unregistered_poller -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:27:56.468 08:41:08 reap_unregistered_poller -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@58 -- # : 1 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@62 -- # : 0 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@64 -- # : 0 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@66 -- # : 1 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@68 -- # : 0 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@70 -- # : 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@72 -- # : 0 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@74 -- # : 1 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@76 -- # : 0 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@78 -- # : 0 00:27:56.468 08:41:08 reap_unregistered_poller -- 
common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@80 -- # : 0 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@82 -- # : 0 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@84 -- # : 0 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@86 -- # : 0 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@88 -- # : 0 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@90 -- # : 0 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@92 -- # : 0 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@94 -- # : 0 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@96 -- # : 0 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@98 -- # : 0 00:27:56.468 08:41:08 reap_unregistered_poller -- 
common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@100 -- # : 0 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@102 -- # : rdma 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@104 -- # : 0 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@106 -- # : 0 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@108 -- # : 1 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@110 -- # : 0 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@112 -- # : 0 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@114 -- # : 0 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@116 -- # : 0 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@118 -- # : 1 00:27:56.468 08:41:08 reap_unregistered_poller -- 
common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@120 -- # : 1 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@122 -- # : 1 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@124 -- # : 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@126 -- # : 0 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@128 -- # : 1 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@130 -- # : 0 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@132 -- # : 0 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@134 -- # : 0 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@136 -- # : 0 00:27:56.468 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@138 -- # : 00:27:56.469 08:41:08 reap_unregistered_poller -- 
common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@140 -- # : true 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@142 -- # : 0 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@144 -- # : 0 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@146 -- # : 0 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@148 -- # : 0 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@150 -- # : 0 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@152 -- # : 0 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@154 -- # : 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@156 -- # : 0 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@158 -- # : 0 00:27:56.469 08:41:08 reap_unregistered_poller -- 
common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@160 -- # : 0 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@162 -- # : 0 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@164 -- # : 0 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@167 -- # : 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@169 -- # : 0 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@171 -- # : 0 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@177 -- # 
export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@178 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@178 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:27:56.469 08:41:08 reap_unregistered_poller -- 
common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:27:56.469 08:41:08 
reap_unregistered_poller -- common/autotest_common.sh@200 -- # cat 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@254 -- # export 
VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@263 -- # export valgrind= 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@263 -- # valgrind= 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@269 -- # uname -s 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@279 -- 
# MAKE=make 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j96 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@299 -- # TEST_MODE= 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@318 -- # [[ -z 1597404 ]] 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@318 -- # kill -0 1597404 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:27:56.469 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@331 -- # local mount target_dir 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.zMCMll 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:27:56.470 08:41:08 reap_unregistered_poller -- 
common/autotest_common.sh@345 -- # [[ -n '' ]] 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.zMCMll/tests/interrupt /tmp/spdk.zMCMll 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@327 -- # df -T 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=995373056 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@363 -- # 
uses["$mount"]=4289056768 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=53678059520 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=60682932224 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=7004872704 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=30336651264 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=30341464064 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4812800 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=12126920704 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=12136587264 00:27:56.470 08:41:08 reap_unregistered_poller -- 
common/autotest_common.sh@363 -- # uses["$mount"]=9666560 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=30338813952 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=30341468160 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=2654208 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=6068285440 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=6068289536 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:27:56.470 * Looking for test storage... 
00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@368 -- # local target_space new_size 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@372 -- # mount=/ 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@374 -- # target_space=53678059520 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@381 -- # new_size=9219465216 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:27:56.470 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@389 -- # return 0 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@1682 -- # set -o errtrace 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@1687 -- # true 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@1689 -- # xtrace_fd 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@27 -- # exec 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@29 -- # exec 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@31 -- # xtrace_restore 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:27:56.470 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@18 -- # set -x 00:27:56.470 08:41:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:27:56.470 08:41:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:56.470 08:41:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:27:56.470 08:41:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:27:56.470 08:41:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:27:56.470 08:41:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:27:56.470 08:41:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:27:56.470 08:41:08 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:27:56.470 08:41:08 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # 
PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:27:56.471 08:41:08 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@17 -- # start_intr_tgt 00:27:56.471 08:41:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:56.471 08:41:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:27:56.471 08:41:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=1597518 00:27:56.471 08:41:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:27:56.471 08:41:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@26 -- # waitforlisten 1597518 /var/tmp/spdk.sock 00:27:56.471 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@829 -- # '[' -z 1597518 ']' 00:27:56.471 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:56.471 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:56.471 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:56.471 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:27:56.471 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:56.471 08:41:08 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:27:56.471 08:41:08 reap_unregistered_poller -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:27:56.471 [2024-07-23 08:41:08.827582] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:27:56.471 [2024-07-23 08:41:08.827676] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1597518 ] 00:27:56.471 [2024-07-23 08:41:08.950651] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:27:56.731 [2024-07-23 08:41:09.159823] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:27:56.731 [2024-07-23 08:41:09.159893] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:56.731 [2024-07-23 08:41:09.159899] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:27:56.990 [2024-07-23 08:41:09.492519] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
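The `thread_get_pollers` / `jq` sequence traced below follows a simple capture-and-extract pattern: pull the first thread's JSON, then collect active and timed poller names into one string for a later before/after comparison. A hedged standalone sketch, using a hard-coded JSON sample in place of the live `rpc_cmd thread_get_pollers | jq -r '.threads[0]'` output (the sample values are assumptions copied from this log, not queried live):

```shell
#!/usr/bin/env bash
# Standalone sketch of the poller-name extraction traced here
# (interrupt/reap_unregistered_poller.sh@20-23). app_thread below is a
# hard-coded stand-in for the live rpc output shown in this log.
app_thread='{
  "name": "app_thread",
  "active_pollers": [],
  "timed_pollers": [ { "name": "rpc_subsystem_poll_servers" } ],
  "paused_pollers": []
}'
native_pollers=$(echo "$app_thread" | jq -r '.active_pollers[].name')  # empty: no active pollers
native_pollers+=' '
native_pollers+=$(echo "$app_thread" | jq -r '.timed_pollers[].name')
echo "$native_pollers"   # prints " rpc_subsystem_poll_servers" (leading space is the separator)
```

The test then re-queries after the poller is unregistered and string-compares the two captures, which is what the `[[ … == \ \r\p\c… ]]` match further down performs.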
00:27:57.250 08:41:09 reap_unregistered_poller -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:57.250 08:41:09 reap_unregistered_poller -- common/autotest_common.sh@862 -- # return 0 00:27:57.250 08:41:09 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # rpc_cmd thread_get_pollers 00:27:57.250 08:41:09 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # jq -r '.threads[0]' 00:27:57.250 08:41:09 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:57.250 08:41:09 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:27:57.250 08:41:09 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:57.250 08:41:09 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # app_thread='{ 00:27:57.250 "name": "app_thread", 00:27:57.250 "id": 1, 00:27:57.250 "active_pollers": [], 00:27:57.250 "timed_pollers": [ 00:27:57.250 { 00:27:57.250 "name": "rpc_subsystem_poll_servers", 00:27:57.250 "id": 1, 00:27:57.250 "state": "waiting", 00:27:57.250 "run_count": 0, 00:27:57.250 "busy_count": 0, 00:27:57.250 "period_ticks": 8400000 00:27:57.250 } 00:27:57.250 ], 00:27:57.250 "paused_pollers": [] 00:27:57.250 }' 00:27:57.250 08:41:09 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # jq -r '.active_pollers[].name' 00:27:57.250 08:41:09 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # native_pollers= 00:27:57.250 08:41:09 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@22 -- # native_pollers+=' ' 00:27:57.250 08:41:09 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # jq -r '.timed_pollers[].name' 00:27:57.250 08:41:09 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # native_pollers+=rpc_subsystem_poll_servers 00:27:57.250 08:41:09 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@28 -- # setup_bdev_aio 00:27:57.250 
08:41:09 reap_unregistered_poller -- interrupt/common.sh@75 -- # uname -s 00:27:57.250 08:41:09 reap_unregistered_poller -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:27:57.250 08:41:09 reap_unregistered_poller -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:27:57.250 5000+0 records in 00:27:57.250 5000+0 records out 00:27:57.250 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0176071 s, 582 MB/s 00:27:57.250 08:41:09 reap_unregistered_poller -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:27:57.508 AIO0 00:27:57.508 08:41:09 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:57.767 08:41:10 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@34 -- # sleep 0.1 00:27:57.767 08:41:10 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # rpc_cmd thread_get_pollers 00:27:57.767 08:41:10 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # jq -r '.threads[0]' 00:27:57.767 08:41:10 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:57.767 08:41:10 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:27:57.767 08:41:10 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:57.767 08:41:10 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # app_thread='{ 00:27:57.767 "name": "app_thread", 00:27:57.767 "id": 1, 00:27:57.767 "active_pollers": [], 00:27:57.767 "timed_pollers": [ 00:27:57.767 { 00:27:57.767 "name": "rpc_subsystem_poll_servers", 00:27:57.767 "id": 1, 00:27:57.767 "state": "waiting", 00:27:57.767 "run_count": 0, 00:27:57.767 "busy_count": 0, 
00:27:57.767 "period_ticks": 8400000 00:27:57.767 } 00:27:57.767 ], 00:27:57.767 "paused_pollers": [] 00:27:57.767 }' 00:27:57.767 08:41:10 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # jq -r '.active_pollers[].name' 00:27:58.026 08:41:10 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # remaining_pollers= 00:27:58.026 08:41:10 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@39 -- # remaining_pollers+=' ' 00:27:58.026 08:41:10 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # jq -r '.timed_pollers[].name' 00:27:58.026 08:41:10 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # remaining_pollers+=rpc_subsystem_poll_servers 00:27:58.026 08:41:10 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@44 -- # [[ rpc_subsystem_poll_servers == \ \r\p\c\_\s\u\b\s\y\s\t\e\m\_\p\o\l\l\_\s\e\r\v\e\r\s ]] 00:27:58.026 08:41:10 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@46 -- # trap - SIGINT SIGTERM EXIT 00:27:58.026 08:41:10 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@47 -- # killprocess 1597518 00:27:58.026 08:41:10 reap_unregistered_poller -- common/autotest_common.sh@948 -- # '[' -z 1597518 ']' 00:27:58.026 08:41:10 reap_unregistered_poller -- common/autotest_common.sh@952 -- # kill -0 1597518 00:27:58.026 08:41:10 reap_unregistered_poller -- common/autotest_common.sh@953 -- # uname 00:27:58.026 08:41:10 reap_unregistered_poller -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:58.026 08:41:10 reap_unregistered_poller -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1597518 00:27:58.026 08:41:10 reap_unregistered_poller -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:58.026 08:41:10 reap_unregistered_poller -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:58.026 08:41:10 reap_unregistered_poller -- common/autotest_common.sh@966 -- # 
echo 'killing process with pid 1597518' 00:27:58.026 killing process with pid 1597518 00:27:58.026 08:41:10 reap_unregistered_poller -- common/autotest_common.sh@967 -- # kill 1597518 00:27:58.026 08:41:10 reap_unregistered_poller -- common/autotest_common.sh@972 -- # wait 1597518 00:27:59.404 08:41:11 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@48 -- # cleanup 00:27:59.404 08:41:11 reap_unregistered_poller -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:27:59.404 00:27:59.404 real 0m3.065s 00:27:59.404 user 0m2.629s 00:27:59.404 sys 0m0.512s 00:27:59.404 08:41:11 reap_unregistered_poller -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:59.404 08:41:11 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:27:59.404 ************************************ 00:27:59.404 END TEST reap_unregistered_poller 00:27:59.404 ************************************ 00:27:59.404 08:41:11 -- common/autotest_common.sh@1142 -- # return 0 00:27:59.404 08:41:11 -- spdk/autotest.sh@198 -- # uname -s 00:27:59.404 08:41:11 -- spdk/autotest.sh@198 -- # [[ Linux == Linux ]] 00:27:59.404 08:41:11 -- spdk/autotest.sh@199 -- # [[ 1 -eq 1 ]] 00:27:59.404 08:41:11 -- spdk/autotest.sh@205 -- # [[ 1 -eq 0 ]] 00:27:59.404 08:41:11 -- spdk/autotest.sh@211 -- # '[' 0 -eq 1 ']' 00:27:59.404 08:41:11 -- spdk/autotest.sh@256 -- # '[' 0 -eq 1 ']' 00:27:59.404 08:41:11 -- spdk/autotest.sh@260 -- # timing_exit lib 00:27:59.404 08:41:11 -- common/autotest_common.sh@728 -- # xtrace_disable 00:27:59.404 08:41:11 -- common/autotest_common.sh@10 -- # set +x 00:27:59.404 08:41:11 -- spdk/autotest.sh@262 -- # '[' 0 -eq 1 ']' 00:27:59.404 08:41:11 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:27:59.404 08:41:11 -- spdk/autotest.sh@279 -- # '[' 0 -eq 1 ']' 00:27:59.404 08:41:11 -- spdk/autotest.sh@308 -- # '[' 0 -eq 1 ']' 00:27:59.404 08:41:11 -- spdk/autotest.sh@312 -- # '[' 0 -eq 1 ']' 00:27:59.404 
08:41:11 -- spdk/autotest.sh@316 -- # '[' 0 -eq 1 ']' 00:27:59.404 08:41:11 -- spdk/autotest.sh@321 -- # '[' 0 -eq 1 ']' 00:27:59.404 08:41:11 -- spdk/autotest.sh@330 -- # '[' 0 -eq 1 ']' 00:27:59.404 08:41:11 -- spdk/autotest.sh@335 -- # '[' 0 -eq 1 ']' 00:27:59.404 08:41:11 -- spdk/autotest.sh@339 -- # '[' 0 -eq 1 ']' 00:27:59.404 08:41:11 -- spdk/autotest.sh@343 -- # '[' 0 -eq 1 ']' 00:27:59.404 08:41:11 -- spdk/autotest.sh@347 -- # '[' 1 -eq 1 ']' 00:27:59.404 08:41:11 -- spdk/autotest.sh@348 -- # run_test compress_compdev /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:27:59.404 08:41:11 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:27:59.404 08:41:11 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:59.404 08:41:11 -- common/autotest_common.sh@10 -- # set +x 00:27:59.404 ************************************ 00:27:59.404 START TEST compress_compdev 00:27:59.404 ************************************ 00:27:59.404 08:41:11 compress_compdev -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:27:59.404 * Looking for test storage... 
00:27:59.404 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:27:59.404 08:41:11 compress_compdev -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:27:59.404 08:41:11 compress_compdev -- nvmf/common.sh@7 -- # uname -s 00:27:59.404 08:41:11 compress_compdev -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:27:59.404 08:41:11 compress_compdev -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:27:59.404 08:41:11 compress_compdev -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:27:59.404 08:41:11 compress_compdev -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:27:59.404 08:41:11 compress_compdev -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:27:59.404 08:41:11 compress_compdev -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:27:59.404 08:41:11 compress_compdev -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:27:59.404 08:41:11 compress_compdev -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:27:59.404 08:41:11 compress_compdev -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:27:59.404 08:41:11 compress_compdev -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:27:59.404 08:41:11 compress_compdev -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:800e967b-538f-e911-906e-001635649f5c 00:27:59.404 08:41:11 compress_compdev -- nvmf/common.sh@18 -- # NVME_HOSTID=800e967b-538f-e911-906e-001635649f5c 00:27:59.404 08:41:11 compress_compdev -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:27:59.404 08:41:11 compress_compdev -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:27:59.404 08:41:11 compress_compdev -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:27:59.404 08:41:11 compress_compdev -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:27:59.404 08:41:11 compress_compdev -- nvmf/common.sh@45 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:27:59.404 08:41:11 compress_compdev -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:27:59.404 08:41:11 compress_compdev -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:27:59.404 08:41:11 compress_compdev -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:27:59.404 08:41:11 compress_compdev -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:27:59.404 08:41:11 compress_compdev -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:27:59.405 08:41:11 compress_compdev -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:27:59.405 08:41:11 compress_compdev -- 
paths/export.sh@5 -- # export PATH 00:27:59.405 08:41:11 compress_compdev -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:27:59.405 08:41:11 compress_compdev -- nvmf/common.sh@47 -- # : 0 00:27:59.405 08:41:11 compress_compdev -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:27:59.405 08:41:11 compress_compdev -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:27:59.405 08:41:11 compress_compdev -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:27:59.405 08:41:11 compress_compdev -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:27:59.405 08:41:11 compress_compdev -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:27:59.405 08:41:11 compress_compdev -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:27:59.405 08:41:11 compress_compdev -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:27:59.405 08:41:11 compress_compdev -- nvmf/common.sh@51 -- # have_pci_nics=0 00:27:59.405 08:41:11 compress_compdev -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:59.405 08:41:11 compress_compdev -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:27:59.405 08:41:11 compress_compdev -- compress/compress.sh@82 -- # test_type=compdev 00:27:59.405 08:41:11 compress_compdev -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:27:59.405 08:41:11 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:27:59.405 08:41:11 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1598161 00:27:59.405 08:41:11 compress_compdev -- 
compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:27:59.405 08:41:11 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 1598161 00:27:59.405 08:41:11 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 1598161 ']' 00:27:59.405 08:41:11 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:59.405 08:41:11 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:59.405 08:41:11 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:59.405 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:27:59.405 08:41:11 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:59.405 08:41:11 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:27:59.405 08:41:11 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:27:59.663 [2024-07-23 08:41:11.969782] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:27:59.663 [2024-07-23 08:41:11.969895] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1598161 ] 00:27:59.663 [2024-07-23 08:41:12.093811] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:27:59.922 [2024-07-23 08:41:12.320272] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:27:59.922 [2024-07-23 08:41:12.320279] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:28:00.923 [2024-07-23 08:41:13.289081] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:28:00.923 08:41:13 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:00.923 08:41:13 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:28:00.923 08:41:13 compress_compdev -- compress/compress.sh@74 -- # create_vols 00:28:00.923 08:41:13 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:28:00.923 08:41:13 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:28:04.207 [2024-07-23 08:41:16.461876] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e00001b8c0 PMD being used: compress_qat 00:28:04.207 08:41:16 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:28:04.207 08:41:16 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:28:04.207 08:41:16 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:04.207 08:41:16 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:28:04.207 08:41:16 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:04.207 08:41:16 compress_compdev -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:28:04.207 08:41:16 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:04.207 08:41:16 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:28:04.466 [ 00:28:04.466 { 00:28:04.466 "name": "Nvme0n1", 00:28:04.466 "aliases": [ 00:28:04.466 "971a13a6-4210-4cc1-9081-9d61a2087150" 00:28:04.466 ], 00:28:04.466 "product_name": "NVMe disk", 00:28:04.466 "block_size": 512, 00:28:04.466 "num_blocks": 7814037168, 00:28:04.466 "uuid": "971a13a6-4210-4cc1-9081-9d61a2087150", 00:28:04.466 "assigned_rate_limits": { 00:28:04.466 "rw_ios_per_sec": 0, 00:28:04.466 "rw_mbytes_per_sec": 0, 00:28:04.466 "r_mbytes_per_sec": 0, 00:28:04.466 "w_mbytes_per_sec": 0 00:28:04.466 }, 00:28:04.466 "claimed": false, 00:28:04.466 "zoned": false, 00:28:04.466 "supported_io_types": { 00:28:04.466 "read": true, 00:28:04.466 "write": true, 00:28:04.466 "unmap": true, 00:28:04.466 "flush": true, 00:28:04.466 "reset": true, 00:28:04.466 "nvme_admin": true, 00:28:04.466 "nvme_io": true, 00:28:04.466 "nvme_io_md": false, 00:28:04.466 "write_zeroes": true, 00:28:04.466 "zcopy": false, 00:28:04.466 "get_zone_info": false, 00:28:04.466 "zone_management": false, 00:28:04.466 "zone_append": false, 00:28:04.466 "compare": false, 00:28:04.466 "compare_and_write": false, 00:28:04.466 "abort": true, 00:28:04.466 "seek_hole": false, 00:28:04.466 "seek_data": false, 00:28:04.466 "copy": false, 00:28:04.466 "nvme_iov_md": false 00:28:04.466 }, 00:28:04.466 "driver_specific": { 00:28:04.466 "nvme": [ 00:28:04.466 { 00:28:04.466 "pci_address": "0000:60:00.0", 00:28:04.466 "trid": { 00:28:04.466 "trtype": "PCIe", 00:28:04.466 "traddr": "0000:60:00.0" 00:28:04.466 }, 00:28:04.466 "ctrlr_data": { 00:28:04.466 "cntlid": 0, 00:28:04.466 "vendor_id": "0x8086", 00:28:04.466 "model_number": "INTEL SSDPE2KX040T8", 
00:28:04.466 "serial_number": "BTLJ81850BB64P0DGN", 00:28:04.466 "firmware_revision": "VDV1Y295", 00:28:04.466 "oacs": { 00:28:04.466 "security": 0, 00:28:04.466 "format": 1, 00:28:04.466 "firmware": 1, 00:28:04.466 "ns_manage": 1 00:28:04.466 }, 00:28:04.466 "multi_ctrlr": false, 00:28:04.466 "ana_reporting": false 00:28:04.466 }, 00:28:04.466 "vs": { 00:28:04.466 "nvme_version": "1.2" 00:28:04.466 }, 00:28:04.466 "ns_data": { 00:28:04.466 "id": 1, 00:28:04.466 "can_share": false 00:28:04.466 } 00:28:04.466 } 00:28:04.466 ], 00:28:04.466 "mp_policy": "active_passive" 00:28:04.466 } 00:28:04.466 } 00:28:04.466 ] 00:28:04.466 08:41:16 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:28:04.466 08:41:16 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:28:04.725 [2024-07-23 08:41:17.004748] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e00001ba80 PMD being used: compress_qat 00:28:06.630 7032290d-3a4b-4764-b79a-2eacd08f1f90 00:28:06.630 08:41:18 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:28:06.630 74be9bff-3dd5-4326-9bb5-0e377bca3a6c 00:28:06.630 08:41:18 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:28:06.630 08:41:18 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:28:06.630 08:41:18 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:06.630 08:41:18 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:28:06.630 08:41:18 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:06.630 08:41:18 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:06.630 08:41:18 compress_compdev -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:06.630 08:41:19 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:28:06.889 [ 00:28:06.889 { 00:28:06.889 "name": "74be9bff-3dd5-4326-9bb5-0e377bca3a6c", 00:28:06.889 "aliases": [ 00:28:06.889 "lvs0/lv0" 00:28:06.889 ], 00:28:06.889 "product_name": "Logical Volume", 00:28:06.889 "block_size": 512, 00:28:06.889 "num_blocks": 204800, 00:28:06.889 "uuid": "74be9bff-3dd5-4326-9bb5-0e377bca3a6c", 00:28:06.889 "assigned_rate_limits": { 00:28:06.889 "rw_ios_per_sec": 0, 00:28:06.889 "rw_mbytes_per_sec": 0, 00:28:06.889 "r_mbytes_per_sec": 0, 00:28:06.889 "w_mbytes_per_sec": 0 00:28:06.889 }, 00:28:06.889 "claimed": false, 00:28:06.889 "zoned": false, 00:28:06.889 "supported_io_types": { 00:28:06.889 "read": true, 00:28:06.889 "write": true, 00:28:06.889 "unmap": true, 00:28:06.889 "flush": false, 00:28:06.889 "reset": true, 00:28:06.889 "nvme_admin": false, 00:28:06.889 "nvme_io": false, 00:28:06.889 "nvme_io_md": false, 00:28:06.889 "write_zeroes": true, 00:28:06.889 "zcopy": false, 00:28:06.889 "get_zone_info": false, 00:28:06.889 "zone_management": false, 00:28:06.889 "zone_append": false, 00:28:06.889 "compare": false, 00:28:06.889 "compare_and_write": false, 00:28:06.889 "abort": false, 00:28:06.889 "seek_hole": true, 00:28:06.889 "seek_data": true, 00:28:06.889 "copy": false, 00:28:06.889 "nvme_iov_md": false 00:28:06.889 }, 00:28:06.889 "driver_specific": { 00:28:06.889 "lvol": { 00:28:06.889 "lvol_store_uuid": "7032290d-3a4b-4764-b79a-2eacd08f1f90", 00:28:06.889 "base_bdev": "Nvme0n1", 00:28:06.889 "thin_provision": true, 00:28:06.889 "num_allocated_clusters": 0, 00:28:06.889 "snapshot": false, 00:28:06.889 "clone": false, 00:28:06.889 "esnap_clone": false 00:28:06.889 } 00:28:06.889 } 00:28:06.889 } 00:28:06.889 ] 00:28:06.889 08:41:19 compress_compdev -- 
common/autotest_common.sh@905 -- # return 0 00:28:06.889 08:41:19 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:28:06.889 08:41:19 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:28:06.889 [2024-07-23 08:41:19.369967] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:28:06.889 COMP_lvs0/lv0 00:28:06.889 08:41:19 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:28:06.889 08:41:19 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:28:06.889 08:41:19 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:06.889 08:41:19 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:28:06.889 08:41:19 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:06.889 08:41:19 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:06.889 08:41:19 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:07.148 08:41:19 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:28:07.406 [ 00:28:07.406 { 00:28:07.406 "name": "COMP_lvs0/lv0", 00:28:07.406 "aliases": [ 00:28:07.406 "41e83e80-43a9-5451-96b6-52f27bc0654c" 00:28:07.406 ], 00:28:07.406 "product_name": "compress", 00:28:07.406 "block_size": 512, 00:28:07.406 "num_blocks": 200704, 00:28:07.406 "uuid": "41e83e80-43a9-5451-96b6-52f27bc0654c", 00:28:07.406 "assigned_rate_limits": { 00:28:07.406 "rw_ios_per_sec": 0, 00:28:07.406 "rw_mbytes_per_sec": 0, 00:28:07.406 "r_mbytes_per_sec": 0, 00:28:07.406 "w_mbytes_per_sec": 0 00:28:07.406 }, 00:28:07.406 "claimed": false, 00:28:07.406 "zoned": false, 
00:28:07.406 "supported_io_types": { 00:28:07.406 "read": true, 00:28:07.406 "write": true, 00:28:07.406 "unmap": false, 00:28:07.406 "flush": false, 00:28:07.406 "reset": false, 00:28:07.406 "nvme_admin": false, 00:28:07.406 "nvme_io": false, 00:28:07.406 "nvme_io_md": false, 00:28:07.406 "write_zeroes": true, 00:28:07.406 "zcopy": false, 00:28:07.406 "get_zone_info": false, 00:28:07.406 "zone_management": false, 00:28:07.406 "zone_append": false, 00:28:07.406 "compare": false, 00:28:07.406 "compare_and_write": false, 00:28:07.406 "abort": false, 00:28:07.406 "seek_hole": false, 00:28:07.406 "seek_data": false, 00:28:07.406 "copy": false, 00:28:07.406 "nvme_iov_md": false 00:28:07.406 }, 00:28:07.406 "driver_specific": { 00:28:07.406 "compress": { 00:28:07.406 "name": "COMP_lvs0/lv0", 00:28:07.406 "base_bdev_name": "74be9bff-3dd5-4326-9bb5-0e377bca3a6c", 00:28:07.406 "pm_path": "/tmp/pmem/38445004-60a1-4637-a659-c340b80b2ec5" 00:28:07.406 } 00:28:07.406 } 00:28:07.406 } 00:28:07.406 ] 00:28:07.406 08:41:19 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:28:07.406 08:41:19 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:28:07.406 [2024-07-23 08:41:19.817105] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e0000101e0 PMD being used: compress_qat 00:28:07.406 [2024-07-23 08:41:19.819785] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e00001bc40 PMD being used: compress_qat 00:28:07.406 Running I/O for 3 seconds... 
00:28:10.696 00:28:10.696 Latency(us) 00:28:10.696 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:10.696 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:28:10.696 Verification LBA range: start 0x0 length 0x3100 00:28:10.696 COMP_lvs0/lv0 : 3.00 3536.13 13.81 0.00 0.00 9011.93 138.48 17226.61 00:28:10.696 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:28:10.696 Verification LBA range: start 0x3100 length 0x3100 00:28:10.696 COMP_lvs0/lv0 : 3.01 3602.70 14.07 0.00 0.00 8845.54 128.73 16976.94 00:28:10.696 =================================================================================================================== 00:28:10.696 Total : 7138.83 27.89 0.00 0.00 8927.94 128.73 17226.61 00:28:10.696 0 00:28:10.696 08:41:22 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:28:10.696 08:41:22 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:28:10.696 08:41:23 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:28:10.956 08:41:23 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:28:10.956 08:41:23 compress_compdev -- compress/compress.sh@78 -- # killprocess 1598161 00:28:10.956 08:41:23 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 1598161 ']' 00:28:10.956 08:41:23 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 1598161 00:28:10.956 08:41:23 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:28:10.956 08:41:23 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:10.956 08:41:23 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1598161 00:28:10.956 08:41:23 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 
00:28:10.956 08:41:23 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:28:10.956 08:41:23 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1598161' 00:28:10.956 killing process with pid 1598161 00:28:10.956 08:41:23 compress_compdev -- common/autotest_common.sh@967 -- # kill 1598161 00:28:10.956 Received shutdown signal, test time was about 3.000000 seconds 00:28:10.956 00:28:10.956 Latency(us) 00:28:10.956 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:10.956 =================================================================================================================== 00:28:10.956 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:28:10.956 08:41:23 compress_compdev -- common/autotest_common.sh@972 -- # wait 1598161 00:28:16.230 08:41:27 compress_compdev -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:28:16.230 08:41:27 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:28:16.230 08:41:27 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1600966 00:28:16.230 08:41:27 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:28:16.231 08:41:27 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:28:16.231 08:41:27 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 1600966 00:28:16.231 08:41:27 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 1600966 ']' 00:28:16.231 08:41:27 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:16.231 08:41:27 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:16.231 08:41:27 compress_compdev -- common/autotest_common.sh@836 
-- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:16.231 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:16.231 08:41:27 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:16.231 08:41:27 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:28:16.231 [2024-07-23 08:41:28.030293] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:28:16.231 [2024-07-23 08:41:28.030393] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1600966 ] 00:28:16.231 [2024-07-23 08:41:28.156945] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:28:16.231 [2024-07-23 08:41:28.368233] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:28:16.231 [2024-07-23 08:41:28.368240] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:28:16.798 [2024-07-23 08:41:29.291909] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:28:17.057 08:41:29 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:17.057 08:41:29 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:28:17.057 08:41:29 compress_compdev -- compress/compress.sh@74 -- # create_vols 512 00:28:17.057 08:41:29 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:28:17.057 08:41:29 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:28:20.343 [2024-07-23 08:41:32.485154] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e00001b8c0 PMD being used: compress_qat 00:28:20.343 08:41:32 compress_compdev -- 
compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:28:20.343 08:41:32 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:28:20.343 08:41:32 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:20.343 08:41:32 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:28:20.343 08:41:32 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:20.343 08:41:32 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:20.343 08:41:32 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:20.343 08:41:32 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:28:20.343 [ 00:28:20.343 { 00:28:20.343 "name": "Nvme0n1", 00:28:20.343 "aliases": [ 00:28:20.343 "6236901b-80f6-462d-8a54-ccf5371f4635" 00:28:20.343 ], 00:28:20.343 "product_name": "NVMe disk", 00:28:20.343 "block_size": 512, 00:28:20.343 "num_blocks": 7814037168, 00:28:20.343 "uuid": "6236901b-80f6-462d-8a54-ccf5371f4635", 00:28:20.343 "assigned_rate_limits": { 00:28:20.343 "rw_ios_per_sec": 0, 00:28:20.343 "rw_mbytes_per_sec": 0, 00:28:20.343 "r_mbytes_per_sec": 0, 00:28:20.343 "w_mbytes_per_sec": 0 00:28:20.343 }, 00:28:20.343 "claimed": false, 00:28:20.343 "zoned": false, 00:28:20.343 "supported_io_types": { 00:28:20.343 "read": true, 00:28:20.343 "write": true, 00:28:20.343 "unmap": true, 00:28:20.343 "flush": true, 00:28:20.343 "reset": true, 00:28:20.343 "nvme_admin": true, 00:28:20.343 "nvme_io": true, 00:28:20.343 "nvme_io_md": false, 00:28:20.343 "write_zeroes": true, 00:28:20.343 "zcopy": false, 00:28:20.343 "get_zone_info": false, 00:28:20.343 "zone_management": false, 00:28:20.343 "zone_append": false, 00:28:20.343 "compare": false, 00:28:20.343 "compare_and_write": false, 00:28:20.343 "abort": true, 00:28:20.343 
"seek_hole": false, 00:28:20.343 "seek_data": false, 00:28:20.343 "copy": false, 00:28:20.343 "nvme_iov_md": false 00:28:20.343 }, 00:28:20.343 "driver_specific": { 00:28:20.343 "nvme": [ 00:28:20.343 { 00:28:20.343 "pci_address": "0000:60:00.0", 00:28:20.343 "trid": { 00:28:20.343 "trtype": "PCIe", 00:28:20.343 "traddr": "0000:60:00.0" 00:28:20.343 }, 00:28:20.343 "ctrlr_data": { 00:28:20.343 "cntlid": 0, 00:28:20.343 "vendor_id": "0x8086", 00:28:20.343 "model_number": "INTEL SSDPE2KX040T8", 00:28:20.343 "serial_number": "BTLJ81850BB64P0DGN", 00:28:20.343 "firmware_revision": "VDV1Y295", 00:28:20.343 "oacs": { 00:28:20.343 "security": 0, 00:28:20.343 "format": 1, 00:28:20.343 "firmware": 1, 00:28:20.343 "ns_manage": 1 00:28:20.343 }, 00:28:20.343 "multi_ctrlr": false, 00:28:20.343 "ana_reporting": false 00:28:20.343 }, 00:28:20.344 "vs": { 00:28:20.344 "nvme_version": "1.2" 00:28:20.344 }, 00:28:20.344 "ns_data": { 00:28:20.344 "id": 1, 00:28:20.344 "can_share": false 00:28:20.344 } 00:28:20.344 } 00:28:20.344 ], 00:28:20.344 "mp_policy": "active_passive" 00:28:20.344 } 00:28:20.344 } 00:28:20.344 ] 00:28:20.602 08:41:32 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:28:20.602 08:41:32 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:28:20.602 [2024-07-23 08:41:33.028385] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e00001ba80 PMD being used: compress_qat 00:28:22.505 58a1fafb-e621-4e5a-a69c-0bf1b8039f00 00:28:22.505 08:41:34 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:28:22.505 df20a139-e3e7-4510-a4b7-05133aa06918 00:28:22.505 08:41:34 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:28:22.505 08:41:34 compress_compdev -- common/autotest_common.sh@897 -- # local 
bdev_name=lvs0/lv0 00:28:22.505 08:41:34 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:22.505 08:41:34 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:28:22.505 08:41:34 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:22.505 08:41:34 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:22.505 08:41:34 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:22.764 08:41:35 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:28:22.764 [ 00:28:22.764 { 00:28:22.764 "name": "df20a139-e3e7-4510-a4b7-05133aa06918", 00:28:22.764 "aliases": [ 00:28:22.764 "lvs0/lv0" 00:28:22.764 ], 00:28:22.764 "product_name": "Logical Volume", 00:28:22.764 "block_size": 512, 00:28:22.764 "num_blocks": 204800, 00:28:22.764 "uuid": "df20a139-e3e7-4510-a4b7-05133aa06918", 00:28:22.764 "assigned_rate_limits": { 00:28:22.764 "rw_ios_per_sec": 0, 00:28:22.764 "rw_mbytes_per_sec": 0, 00:28:22.764 "r_mbytes_per_sec": 0, 00:28:22.764 "w_mbytes_per_sec": 0 00:28:22.764 }, 00:28:22.764 "claimed": false, 00:28:22.764 "zoned": false, 00:28:22.764 "supported_io_types": { 00:28:22.764 "read": true, 00:28:22.764 "write": true, 00:28:22.764 "unmap": true, 00:28:22.764 "flush": false, 00:28:22.764 "reset": true, 00:28:22.764 "nvme_admin": false, 00:28:22.764 "nvme_io": false, 00:28:22.764 "nvme_io_md": false, 00:28:22.764 "write_zeroes": true, 00:28:22.764 "zcopy": false, 00:28:22.764 "get_zone_info": false, 00:28:22.764 "zone_management": false, 00:28:22.764 "zone_append": false, 00:28:22.764 "compare": false, 00:28:22.764 "compare_and_write": false, 00:28:22.764 "abort": false, 00:28:22.764 "seek_hole": true, 00:28:22.764 "seek_data": true, 00:28:22.764 "copy": false, 00:28:22.764 "nvme_iov_md": false 
00:28:22.764 }, 00:28:22.764 "driver_specific": { 00:28:22.764 "lvol": { 00:28:22.764 "lvol_store_uuid": "58a1fafb-e621-4e5a-a69c-0bf1b8039f00", 00:28:22.764 "base_bdev": "Nvme0n1", 00:28:22.764 "thin_provision": true, 00:28:22.764 "num_allocated_clusters": 0, 00:28:22.764 "snapshot": false, 00:28:22.764 "clone": false, 00:28:22.764 "esnap_clone": false 00:28:22.764 } 00:28:22.764 } 00:28:22.764 } 00:28:22.764 ] 00:28:22.764 08:41:35 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:28:22.764 08:41:35 compress_compdev -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:28:22.764 08:41:35 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:28:23.024 [2024-07-23 08:41:35.377924] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:28:23.024 COMP_lvs0/lv0 00:28:23.024 08:41:35 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:28:23.024 08:41:35 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:28:23.024 08:41:35 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:23.024 08:41:35 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:28:23.024 08:41:35 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:23.024 08:41:35 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:23.024 08:41:35 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:23.285 08:41:35 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:28:23.285 [ 00:28:23.285 { 00:28:23.285 "name": "COMP_lvs0/lv0", 00:28:23.285 "aliases": [ 00:28:23.285 
"ae95c2fb-9fa1-5194-a60d-c6447aaa7e9d" 00:28:23.285 ], 00:28:23.285 "product_name": "compress", 00:28:23.285 "block_size": 512, 00:28:23.285 "num_blocks": 200704, 00:28:23.285 "uuid": "ae95c2fb-9fa1-5194-a60d-c6447aaa7e9d", 00:28:23.285 "assigned_rate_limits": { 00:28:23.285 "rw_ios_per_sec": 0, 00:28:23.285 "rw_mbytes_per_sec": 0, 00:28:23.285 "r_mbytes_per_sec": 0, 00:28:23.285 "w_mbytes_per_sec": 0 00:28:23.285 }, 00:28:23.285 "claimed": false, 00:28:23.285 "zoned": false, 00:28:23.285 "supported_io_types": { 00:28:23.285 "read": true, 00:28:23.285 "write": true, 00:28:23.285 "unmap": false, 00:28:23.285 "flush": false, 00:28:23.285 "reset": false, 00:28:23.285 "nvme_admin": false, 00:28:23.285 "nvme_io": false, 00:28:23.285 "nvme_io_md": false, 00:28:23.285 "write_zeroes": true, 00:28:23.285 "zcopy": false, 00:28:23.285 "get_zone_info": false, 00:28:23.285 "zone_management": false, 00:28:23.285 "zone_append": false, 00:28:23.285 "compare": false, 00:28:23.285 "compare_and_write": false, 00:28:23.285 "abort": false, 00:28:23.285 "seek_hole": false, 00:28:23.285 "seek_data": false, 00:28:23.285 "copy": false, 00:28:23.285 "nvme_iov_md": false 00:28:23.285 }, 00:28:23.285 "driver_specific": { 00:28:23.285 "compress": { 00:28:23.285 "name": "COMP_lvs0/lv0", 00:28:23.285 "base_bdev_name": "df20a139-e3e7-4510-a4b7-05133aa06918", 00:28:23.285 "pm_path": "/tmp/pmem/9bc872c2-eb49-4ba7-b7dd-ce243341dfbb" 00:28:23.285 } 00:28:23.285 } 00:28:23.285 } 00:28:23.285 ] 00:28:23.285 08:41:35 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:28:23.285 08:41:35 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:28:23.575 [2024-07-23 08:41:35.821313] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e0000101e0 PMD being used: compress_qat 00:28:23.575 [2024-07-23 08:41:35.823892] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e00001bb60 PMD 
being used: compress_qat 00:28:23.575 Running I/O for 3 seconds... 00:28:26.865 00:28:26.865 Latency(us) 00:28:26.865 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:26.865 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:28:26.865 Verification LBA range: start 0x0 length 0x3100 00:28:26.865 COMP_lvs0/lv0 : 3.01 3533.57 13.80 0.00 0.00 9014.61 136.53 16352.79 00:28:26.865 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:28:26.865 Verification LBA range: start 0x3100 length 0x3100 00:28:26.865 COMP_lvs0/lv0 : 3.01 3597.16 14.05 0.00 0.00 8857.66 127.76 15791.06 00:28:26.865 =================================================================================================================== 00:28:26.865 Total : 7130.73 27.85 0.00 0.00 8935.45 127.76 16352.79 00:28:26.865 0 00:28:26.865 08:41:38 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:28:26.865 08:41:38 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:28:26.865 08:41:39 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:28:26.865 08:41:39 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:28:26.865 08:41:39 compress_compdev -- compress/compress.sh@78 -- # killprocess 1600966 00:28:26.865 08:41:39 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 1600966 ']' 00:28:26.865 08:41:39 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 1600966 00:28:26.865 08:41:39 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:28:26.865 08:41:39 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:26.865 08:41:39 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1600966 00:28:26.865 08:41:39 
compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:28:26.865 08:41:39 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:28:26.865 08:41:39 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1600966' 00:28:26.865 killing process with pid 1600966 00:28:26.865 08:41:39 compress_compdev -- common/autotest_common.sh@967 -- # kill 1600966 00:28:26.865 Received shutdown signal, test time was about 3.000000 seconds 00:28:26.865 00:28:26.865 Latency(us) 00:28:26.865 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:26.865 =================================================================================================================== 00:28:26.865 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:28:26.865 08:41:39 compress_compdev -- common/autotest_common.sh@972 -- # wait 1600966 00:28:32.136 08:41:43 compress_compdev -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:28:32.136 08:41:43 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:28:32.136 08:41:43 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1603981 00:28:32.136 08:41:43 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:28:32.136 08:41:43 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:28:32.136 08:41:43 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 1603981 00:28:32.136 08:41:43 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 1603981 ']' 00:28:32.136 08:41:43 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:32.136 08:41:43 compress_compdev -- common/autotest_common.sh@834 -- # local 
max_retries=100 00:28:32.136 08:41:43 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:32.136 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:32.136 08:41:43 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:32.136 08:41:43 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:28:32.136 [2024-07-23 08:41:44.054839] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:28:32.136 [2024-07-23 08:41:44.054932] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1603981 ] 00:28:32.136 [2024-07-23 08:41:44.177173] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:28:32.136 [2024-07-23 08:41:44.391864] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:28:32.136 [2024-07-23 08:41:44.391870] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:28:33.074 [2024-07-23 08:41:45.314962] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:28:33.074 08:41:45 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:33.074 08:41:45 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:28:33.074 08:41:45 compress_compdev -- compress/compress.sh@74 -- # create_vols 4096 00:28:33.074 08:41:45 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:28:33.074 08:41:45 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:28:36.370 [2024-07-23 08:41:48.492988] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 
0x60e00001b8c0 PMD being used: compress_qat 00:28:36.370 08:41:48 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:28:36.370 08:41:48 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:28:36.370 08:41:48 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:36.370 08:41:48 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:28:36.370 08:41:48 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:36.370 08:41:48 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:36.370 08:41:48 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:36.370 08:41:48 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:28:36.370 [ 00:28:36.370 { 00:28:36.370 "name": "Nvme0n1", 00:28:36.370 "aliases": [ 00:28:36.370 "0fc072aa-0950-4f30-b7dc-56b8e5cf0fa2" 00:28:36.370 ], 00:28:36.370 "product_name": "NVMe disk", 00:28:36.370 "block_size": 512, 00:28:36.370 "num_blocks": 7814037168, 00:28:36.370 "uuid": "0fc072aa-0950-4f30-b7dc-56b8e5cf0fa2", 00:28:36.370 "assigned_rate_limits": { 00:28:36.370 "rw_ios_per_sec": 0, 00:28:36.370 "rw_mbytes_per_sec": 0, 00:28:36.370 "r_mbytes_per_sec": 0, 00:28:36.370 "w_mbytes_per_sec": 0 00:28:36.370 }, 00:28:36.370 "claimed": false, 00:28:36.370 "zoned": false, 00:28:36.370 "supported_io_types": { 00:28:36.370 "read": true, 00:28:36.370 "write": true, 00:28:36.370 "unmap": true, 00:28:36.370 "flush": true, 00:28:36.370 "reset": true, 00:28:36.370 "nvme_admin": true, 00:28:36.370 "nvme_io": true, 00:28:36.370 "nvme_io_md": false, 00:28:36.370 "write_zeroes": true, 00:28:36.370 "zcopy": false, 00:28:36.370 "get_zone_info": false, 00:28:36.370 "zone_management": false, 00:28:36.370 "zone_append": false, 00:28:36.370 "compare": 
false, 00:28:36.370 "compare_and_write": false, 00:28:36.370 "abort": true, 00:28:36.370 "seek_hole": false, 00:28:36.370 "seek_data": false, 00:28:36.370 "copy": false, 00:28:36.370 "nvme_iov_md": false 00:28:36.370 }, 00:28:36.370 "driver_specific": { 00:28:36.370 "nvme": [ 00:28:36.370 { 00:28:36.370 "pci_address": "0000:60:00.0", 00:28:36.370 "trid": { 00:28:36.370 "trtype": "PCIe", 00:28:36.370 "traddr": "0000:60:00.0" 00:28:36.370 }, 00:28:36.370 "ctrlr_data": { 00:28:36.370 "cntlid": 0, 00:28:36.370 "vendor_id": "0x8086", 00:28:36.370 "model_number": "INTEL SSDPE2KX040T8", 00:28:36.370 "serial_number": "BTLJ81850BB64P0DGN", 00:28:36.370 "firmware_revision": "VDV1Y295", 00:28:36.370 "oacs": { 00:28:36.370 "security": 0, 00:28:36.370 "format": 1, 00:28:36.370 "firmware": 1, 00:28:36.370 "ns_manage": 1 00:28:36.370 }, 00:28:36.370 "multi_ctrlr": false, 00:28:36.370 "ana_reporting": false 00:28:36.370 }, 00:28:36.370 "vs": { 00:28:36.370 "nvme_version": "1.2" 00:28:36.370 }, 00:28:36.370 "ns_data": { 00:28:36.370 "id": 1, 00:28:36.370 "can_share": false 00:28:36.370 } 00:28:36.370 } 00:28:36.370 ], 00:28:36.370 "mp_policy": "active_passive" 00:28:36.370 } 00:28:36.370 } 00:28:36.370 ] 00:28:36.630 08:41:48 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:28:36.630 08:41:48 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:28:36.630 [2024-07-23 08:41:49.089409] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e00001ba80 PMD being used: compress_qat 00:28:38.536 b28f6816-1a8b-48af-bfc5-7b3e5f29061c 00:28:38.536 08:41:50 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:28:38.536 898981bd-3ac6-4371-9a06-f98f37db45ac 00:28:38.536 08:41:50 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 
00:28:38.536 08:41:50 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:28:38.536 08:41:50 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:38.536 08:41:50 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:28:38.536 08:41:50 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:38.536 08:41:50 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:38.536 08:41:50 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:38.795 08:41:51 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:28:39.054 [ 00:28:39.054 { 00:28:39.054 "name": "898981bd-3ac6-4371-9a06-f98f37db45ac", 00:28:39.054 "aliases": [ 00:28:39.054 "lvs0/lv0" 00:28:39.054 ], 00:28:39.054 "product_name": "Logical Volume", 00:28:39.054 "block_size": 512, 00:28:39.054 "num_blocks": 204800, 00:28:39.054 "uuid": "898981bd-3ac6-4371-9a06-f98f37db45ac", 00:28:39.054 "assigned_rate_limits": { 00:28:39.054 "rw_ios_per_sec": 0, 00:28:39.054 "rw_mbytes_per_sec": 0, 00:28:39.054 "r_mbytes_per_sec": 0, 00:28:39.054 "w_mbytes_per_sec": 0 00:28:39.054 }, 00:28:39.054 "claimed": false, 00:28:39.054 "zoned": false, 00:28:39.054 "supported_io_types": { 00:28:39.054 "read": true, 00:28:39.054 "write": true, 00:28:39.054 "unmap": true, 00:28:39.054 "flush": false, 00:28:39.054 "reset": true, 00:28:39.054 "nvme_admin": false, 00:28:39.054 "nvme_io": false, 00:28:39.054 "nvme_io_md": false, 00:28:39.054 "write_zeroes": true, 00:28:39.054 "zcopy": false, 00:28:39.054 "get_zone_info": false, 00:28:39.054 "zone_management": false, 00:28:39.054 "zone_append": false, 00:28:39.054 "compare": false, 00:28:39.054 "compare_and_write": false, 00:28:39.054 "abort": false, 00:28:39.054 "seek_hole": true, 00:28:39.054 
"seek_data": true, 00:28:39.054 "copy": false, 00:28:39.054 "nvme_iov_md": false 00:28:39.054 }, 00:28:39.054 "driver_specific": { 00:28:39.054 "lvol": { 00:28:39.054 "lvol_store_uuid": "b28f6816-1a8b-48af-bfc5-7b3e5f29061c", 00:28:39.054 "base_bdev": "Nvme0n1", 00:28:39.054 "thin_provision": true, 00:28:39.054 "num_allocated_clusters": 0, 00:28:39.054 "snapshot": false, 00:28:39.054 "clone": false, 00:28:39.054 "esnap_clone": false 00:28:39.054 } 00:28:39.054 } 00:28:39.054 } 00:28:39.054 ] 00:28:39.054 08:41:51 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:28:39.054 08:41:51 compress_compdev -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:28:39.054 08:41:51 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:28:39.054 [2024-07-23 08:41:51.517769] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:28:39.054 COMP_lvs0/lv0 00:28:39.054 08:41:51 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:28:39.054 08:41:51 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:28:39.054 08:41:51 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:39.054 08:41:51 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:28:39.054 08:41:51 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:39.054 08:41:51 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:39.054 08:41:51 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:39.313 08:41:51 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:28:39.572 [ 00:28:39.572 { 00:28:39.572 
"name": "COMP_lvs0/lv0", 00:28:39.572 "aliases": [ 00:28:39.572 "5cfa13c1-552b-56db-9603-bf3fb0fac693" 00:28:39.572 ], 00:28:39.572 "product_name": "compress", 00:28:39.572 "block_size": 4096, 00:28:39.572 "num_blocks": 25088, 00:28:39.572 "uuid": "5cfa13c1-552b-56db-9603-bf3fb0fac693", 00:28:39.572 "assigned_rate_limits": { 00:28:39.572 "rw_ios_per_sec": 0, 00:28:39.572 "rw_mbytes_per_sec": 0, 00:28:39.572 "r_mbytes_per_sec": 0, 00:28:39.572 "w_mbytes_per_sec": 0 00:28:39.572 }, 00:28:39.572 "claimed": false, 00:28:39.572 "zoned": false, 00:28:39.572 "supported_io_types": { 00:28:39.572 "read": true, 00:28:39.572 "write": true, 00:28:39.572 "unmap": false, 00:28:39.572 "flush": false, 00:28:39.572 "reset": false, 00:28:39.572 "nvme_admin": false, 00:28:39.572 "nvme_io": false, 00:28:39.572 "nvme_io_md": false, 00:28:39.572 "write_zeroes": true, 00:28:39.572 "zcopy": false, 00:28:39.572 "get_zone_info": false, 00:28:39.572 "zone_management": false, 00:28:39.572 "zone_append": false, 00:28:39.572 "compare": false, 00:28:39.572 "compare_and_write": false, 00:28:39.572 "abort": false, 00:28:39.572 "seek_hole": false, 00:28:39.572 "seek_data": false, 00:28:39.572 "copy": false, 00:28:39.572 "nvme_iov_md": false 00:28:39.572 }, 00:28:39.572 "driver_specific": { 00:28:39.572 "compress": { 00:28:39.572 "name": "COMP_lvs0/lv0", 00:28:39.572 "base_bdev_name": "898981bd-3ac6-4371-9a06-f98f37db45ac", 00:28:39.572 "pm_path": "/tmp/pmem/9d7eb5a6-e10e-4031-a0db-8d6b41007662" 00:28:39.572 } 00:28:39.572 } 00:28:39.572 } 00:28:39.572 ] 00:28:39.572 08:41:51 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:28:39.572 08:41:51 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:28:39.572 [2024-07-23 08:41:52.001318] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e0000101e0 PMD being used: compress_qat 00:28:39.572 [2024-07-23 08:41:52.004269] 
accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e00001bb60 PMD being used: compress_qat 00:28:39.572 Running I/O for 3 seconds... 00:28:42.864 00:28:42.864 Latency(us) 00:28:42.864 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:42.864 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:28:42.864 Verification LBA range: start 0x0 length 0x3100 00:28:42.864 COMP_lvs0/lv0 : 3.01 3333.50 13.02 0.00 0.00 9552.23 183.34 18100.42 00:28:42.864 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:28:42.864 Verification LBA range: start 0x3100 length 0x3100 00:28:42.864 COMP_lvs0/lv0 : 3.01 3413.64 13.33 0.00 0.00 9327.64 174.57 17601.10 00:28:42.864 =================================================================================================================== 00:28:42.864 Total : 6747.14 26.36 0.00 0.00 9438.56 174.57 18100.42 00:28:42.864 0 00:28:42.864 08:41:55 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:28:42.864 08:41:55 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:28:42.864 08:41:55 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:28:43.123 08:41:55 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:28:43.123 08:41:55 compress_compdev -- compress/compress.sh@78 -- # killprocess 1603981 00:28:43.123 08:41:55 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 1603981 ']' 00:28:43.123 08:41:55 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 1603981 00:28:43.123 08:41:55 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:28:43.123 08:41:55 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:43.123 08:41:55 compress_compdev -- 
common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1603981 00:28:43.123 08:41:55 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:28:43.123 08:41:55 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:28:43.123 08:41:55 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1603981' 00:28:43.123 killing process with pid 1603981 00:28:43.123 08:41:55 compress_compdev -- common/autotest_common.sh@967 -- # kill 1603981 00:28:43.123 Received shutdown signal, test time was about 3.000000 seconds 00:28:43.123 00:28:43.123 Latency(us) 00:28:43.123 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:43.123 =================================================================================================================== 00:28:43.124 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:28:43.124 08:41:55 compress_compdev -- common/autotest_common.sh@972 -- # wait 1603981 00:28:48.396 08:42:00 compress_compdev -- compress/compress.sh@89 -- # run_bdevio 00:28:48.396 08:42:00 compress_compdev -- compress/compress.sh@50 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:28:48.396 08:42:00 compress_compdev -- compress/compress.sh@55 -- # bdevio_pid=1606791 00:28:48.396 08:42:00 compress_compdev -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:28:48.396 08:42:00 compress_compdev -- compress/compress.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json -w 00:28:48.396 08:42:00 compress_compdev -- compress/compress.sh@57 -- # waitforlisten 1606791 00:28:48.396 08:42:00 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 1606791 ']' 00:28:48.396 08:42:00 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:48.396 08:42:00 compress_compdev -- 
common/autotest_common.sh@834 -- # local max_retries=100 00:28:48.396 08:42:00 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:48.396 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:48.396 08:42:00 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:48.396 08:42:00 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:28:48.396 [2024-07-23 08:42:00.249897] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:28:48.396 [2024-07-23 08:42:00.250001] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1606791 ] 00:28:48.396 [2024-07-23 08:42:00.373859] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:28:48.396 [2024-07-23 08:42:00.594205] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:28:48.396 [2024-07-23 08:42:00.594270] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:48.396 [2024-07-23 08:42:00.594277] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:28:49.360 [2024-07-23 08:42:01.544844] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:28:49.360 08:42:01 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:49.360 08:42:01 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:28:49.360 08:42:01 compress_compdev -- compress/compress.sh@58 -- # create_vols 00:28:49.360 08:42:01 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:28:49.360 08:42:01 compress_compdev -- compress/compress.sh@34 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:28:52.647 [2024-07-23 08:42:04.756377] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e000024920 PMD being used: compress_qat 00:28:52.647 08:42:04 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:28:52.647 08:42:04 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:28:52.647 08:42:04 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:52.647 08:42:04 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:28:52.647 08:42:04 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:52.647 08:42:04 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:52.647 08:42:04 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:52.647 08:42:04 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:28:52.647 [ 00:28:52.647 { 00:28:52.647 "name": "Nvme0n1", 00:28:52.647 "aliases": [ 00:28:52.647 "5aedf07b-5a27-4263-ab6a-a7e6d9e63e17" 00:28:52.647 ], 00:28:52.647 "product_name": "NVMe disk", 00:28:52.647 "block_size": 512, 00:28:52.647 "num_blocks": 7814037168, 00:28:52.647 "uuid": "5aedf07b-5a27-4263-ab6a-a7e6d9e63e17", 00:28:52.647 "assigned_rate_limits": { 00:28:52.647 "rw_ios_per_sec": 0, 00:28:52.647 "rw_mbytes_per_sec": 0, 00:28:52.647 "r_mbytes_per_sec": 0, 00:28:52.647 "w_mbytes_per_sec": 0 00:28:52.647 }, 00:28:52.647 "claimed": false, 00:28:52.647 "zoned": false, 00:28:52.647 "supported_io_types": { 00:28:52.647 "read": true, 00:28:52.647 "write": true, 00:28:52.647 "unmap": true, 00:28:52.647 "flush": true, 00:28:52.647 "reset": true, 00:28:52.647 "nvme_admin": true, 00:28:52.647 "nvme_io": true, 00:28:52.647 "nvme_io_md": false, 00:28:52.647 
"write_zeroes": true, 00:28:52.647 "zcopy": false, 00:28:52.647 "get_zone_info": false, 00:28:52.647 "zone_management": false, 00:28:52.647 "zone_append": false, 00:28:52.647 "compare": false, 00:28:52.647 "compare_and_write": false, 00:28:52.647 "abort": true, 00:28:52.647 "seek_hole": false, 00:28:52.647 "seek_data": false, 00:28:52.647 "copy": false, 00:28:52.647 "nvme_iov_md": false 00:28:52.647 }, 00:28:52.647 "driver_specific": { 00:28:52.647 "nvme": [ 00:28:52.647 { 00:28:52.647 "pci_address": "0000:60:00.0", 00:28:52.647 "trid": { 00:28:52.647 "trtype": "PCIe", 00:28:52.647 "traddr": "0000:60:00.0" 00:28:52.647 }, 00:28:52.647 "ctrlr_data": { 00:28:52.647 "cntlid": 0, 00:28:52.647 "vendor_id": "0x8086", 00:28:52.647 "model_number": "INTEL SSDPE2KX040T8", 00:28:52.647 "serial_number": "BTLJ81850BB64P0DGN", 00:28:52.647 "firmware_revision": "VDV1Y295", 00:28:52.647 "oacs": { 00:28:52.647 "security": 0, 00:28:52.647 "format": 1, 00:28:52.647 "firmware": 1, 00:28:52.647 "ns_manage": 1 00:28:52.647 }, 00:28:52.647 "multi_ctrlr": false, 00:28:52.647 "ana_reporting": false 00:28:52.647 }, 00:28:52.647 "vs": { 00:28:52.647 "nvme_version": "1.2" 00:28:52.647 }, 00:28:52.647 "ns_data": { 00:28:52.647 "id": 1, 00:28:52.647 "can_share": false 00:28:52.647 } 00:28:52.647 } 00:28:52.647 ], 00:28:52.647 "mp_policy": "active_passive" 00:28:52.647 } 00:28:52.647 } 00:28:52.647 ] 00:28:52.647 08:42:05 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:28:52.647 08:42:05 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:28:52.905 [2024-07-23 08:42:05.280943] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e000024ae0 PMD being used: compress_qat 00:28:54.808 a45051c3-b203-48e1-b7a2-f99a2caaa3f2 00:28:54.808 08:42:06 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_lvol_create -t -l lvs0 lv0 100 00:28:54.808 44340f65-1698-4875-9859-f23c006a650e 00:28:54.808 08:42:07 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:28:54.808 08:42:07 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:28:54.808 08:42:07 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:54.808 08:42:07 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:28:54.808 08:42:07 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:54.808 08:42:07 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:54.808 08:42:07 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:54.808 08:42:07 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:28:55.066 [ 00:28:55.066 { 00:28:55.066 "name": "44340f65-1698-4875-9859-f23c006a650e", 00:28:55.066 "aliases": [ 00:28:55.066 "lvs0/lv0" 00:28:55.066 ], 00:28:55.066 "product_name": "Logical Volume", 00:28:55.066 "block_size": 512, 00:28:55.066 "num_blocks": 204800, 00:28:55.066 "uuid": "44340f65-1698-4875-9859-f23c006a650e", 00:28:55.066 "assigned_rate_limits": { 00:28:55.066 "rw_ios_per_sec": 0, 00:28:55.066 "rw_mbytes_per_sec": 0, 00:28:55.066 "r_mbytes_per_sec": 0, 00:28:55.066 "w_mbytes_per_sec": 0 00:28:55.066 }, 00:28:55.066 "claimed": false, 00:28:55.066 "zoned": false, 00:28:55.066 "supported_io_types": { 00:28:55.066 "read": true, 00:28:55.066 "write": true, 00:28:55.066 "unmap": true, 00:28:55.066 "flush": false, 00:28:55.066 "reset": true, 00:28:55.066 "nvme_admin": false, 00:28:55.066 "nvme_io": false, 00:28:55.066 "nvme_io_md": false, 00:28:55.066 "write_zeroes": true, 00:28:55.066 "zcopy": false, 00:28:55.066 "get_zone_info": false, 00:28:55.066 "zone_management": false, 
00:28:55.066 "zone_append": false, 00:28:55.066 "compare": false, 00:28:55.066 "compare_and_write": false, 00:28:55.066 "abort": false, 00:28:55.066 "seek_hole": true, 00:28:55.066 "seek_data": true, 00:28:55.066 "copy": false, 00:28:55.066 "nvme_iov_md": false 00:28:55.066 }, 00:28:55.066 "driver_specific": { 00:28:55.066 "lvol": { 00:28:55.066 "lvol_store_uuid": "a45051c3-b203-48e1-b7a2-f99a2caaa3f2", 00:28:55.066 "base_bdev": "Nvme0n1", 00:28:55.066 "thin_provision": true, 00:28:55.066 "num_allocated_clusters": 0, 00:28:55.066 "snapshot": false, 00:28:55.066 "clone": false, 00:28:55.066 "esnap_clone": false 00:28:55.066 } 00:28:55.066 } 00:28:55.066 } 00:28:55.066 ] 00:28:55.066 08:42:07 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:28:55.066 08:42:07 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:28:55.066 08:42:07 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:28:55.324 [2024-07-23 08:42:07.639222] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:28:55.324 COMP_lvs0/lv0 00:28:55.324 08:42:07 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:28:55.324 08:42:07 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:28:55.324 08:42:07 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:55.324 08:42:07 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:28:55.324 08:42:07 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:55.324 08:42:07 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:55.324 08:42:07 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:55.324 08:42:07 compress_compdev -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:28:55.582 [ 00:28:55.582 { 00:28:55.582 "name": "COMP_lvs0/lv0", 00:28:55.582 "aliases": [ 00:28:55.582 "76039538-0cee-5d6d-a854-d40efb94543c" 00:28:55.582 ], 00:28:55.582 "product_name": "compress", 00:28:55.582 "block_size": 512, 00:28:55.582 "num_blocks": 200704, 00:28:55.582 "uuid": "76039538-0cee-5d6d-a854-d40efb94543c", 00:28:55.582 "assigned_rate_limits": { 00:28:55.582 "rw_ios_per_sec": 0, 00:28:55.582 "rw_mbytes_per_sec": 0, 00:28:55.582 "r_mbytes_per_sec": 0, 00:28:55.582 "w_mbytes_per_sec": 0 00:28:55.582 }, 00:28:55.582 "claimed": false, 00:28:55.582 "zoned": false, 00:28:55.582 "supported_io_types": { 00:28:55.582 "read": true, 00:28:55.582 "write": true, 00:28:55.582 "unmap": false, 00:28:55.582 "flush": false, 00:28:55.582 "reset": false, 00:28:55.582 "nvme_admin": false, 00:28:55.582 "nvme_io": false, 00:28:55.582 "nvme_io_md": false, 00:28:55.582 "write_zeroes": true, 00:28:55.582 "zcopy": false, 00:28:55.582 "get_zone_info": false, 00:28:55.582 "zone_management": false, 00:28:55.582 "zone_append": false, 00:28:55.582 "compare": false, 00:28:55.582 "compare_and_write": false, 00:28:55.582 "abort": false, 00:28:55.582 "seek_hole": false, 00:28:55.582 "seek_data": false, 00:28:55.582 "copy": false, 00:28:55.582 "nvme_iov_md": false 00:28:55.582 }, 00:28:55.582 "driver_specific": { 00:28:55.582 "compress": { 00:28:55.582 "name": "COMP_lvs0/lv0", 00:28:55.582 "base_bdev_name": "44340f65-1698-4875-9859-f23c006a650e", 00:28:55.583 "pm_path": "/tmp/pmem/32b4373b-da27-42b5-90b4-03d106f32089" 00:28:55.583 } 00:28:55.583 } 00:28:55.583 } 00:28:55.583 ] 00:28:55.583 08:42:08 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:28:55.583 08:42:08 compress_compdev -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:28:55.583 [2024-07-23 
08:42:08.095555] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e0000171e0 PMD being used: compress_qat 00:28:55.583 I/O targets: 00:28:55.583 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:28:55.583 00:28:55.583 00:28:55.583 CUnit - A unit testing framework for C - Version 2.1-3 00:28:55.583 http://cunit.sourceforge.net/ 00:28:55.583 00:28:55.583 00:28:55.583 Suite: bdevio tests on: COMP_lvs0/lv0 00:28:55.583 Test: blockdev write read block ...passed 00:28:55.842 Test: blockdev write zeroes read block ...passed 00:28:55.842 Test: blockdev write zeroes read no split ...passed 00:28:55.842 Test: blockdev write zeroes read split ...passed 00:28:55.842 Test: blockdev write zeroes read split partial ...passed 00:28:55.842 Test: blockdev reset ...[2024-07-23 08:42:08.232759] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:28:55.842 passed 00:28:55.842 Test: blockdev write read 8 blocks ...passed 00:28:55.842 Test: blockdev write read size > 128k ...passed 00:28:55.842 Test: blockdev write read invalid size ...passed 00:28:55.842 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:28:55.842 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:28:55.842 Test: blockdev write read max offset ...passed 00:28:55.842 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:28:55.842 Test: blockdev writev readv 8 blocks ...passed 00:28:55.842 Test: blockdev writev readv 30 x 1block ...passed 00:28:55.842 Test: blockdev writev readv block ...passed 00:28:55.842 Test: blockdev writev readv size > 128k ...passed 00:28:55.842 Test: blockdev writev readv size > 128k in two iovs ...passed 00:28:55.842 Test: blockdev comparev and writev ...passed 00:28:55.842 Test: blockdev nvme passthru rw ...passed 00:28:55.842 Test: blockdev nvme passthru vendor specific ...passed 00:28:55.842 Test: blockdev nvme admin passthru ...passed 00:28:55.842 Test: blockdev copy 
...passed 00:28:55.842 00:28:55.842 Run Summary: Type Total Ran Passed Failed Inactive 00:28:55.842 suites 1 1 n/a 0 0 00:28:55.842 tests 23 23 23 0 0 00:28:55.842 asserts 130 130 130 0 n/a 00:28:55.842 00:28:55.842 Elapsed time = 0.417 seconds 00:28:55.842 0 00:28:55.842 08:42:08 compress_compdev -- compress/compress.sh@60 -- # destroy_vols 00:28:55.842 08:42:08 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:28:56.101 08:42:08 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:28:56.360 08:42:08 compress_compdev -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:28:56.360 08:42:08 compress_compdev -- compress/compress.sh@62 -- # killprocess 1606791 00:28:56.360 08:42:08 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 1606791 ']' 00:28:56.360 08:42:08 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 1606791 00:28:56.360 08:42:08 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:28:56.360 08:42:08 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:56.360 08:42:08 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1606791 00:28:56.360 08:42:08 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:56.360 08:42:08 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:56.360 08:42:08 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1606791' 00:28:56.360 killing process with pid 1606791 00:28:56.360 08:42:08 compress_compdev -- common/autotest_common.sh@967 -- # kill 1606791 00:28:56.360 08:42:08 compress_compdev -- common/autotest_common.sh@972 -- # wait 1606791 00:29:01.632 08:42:13 compress_compdev -- compress/compress.sh@91 -- # '[' 1 -eq 1 ']' 00:29:01.632 
08:42:13 compress_compdev -- compress/compress.sh@92 -- # run_bdevperf 64 16384 30 00:29:01.632 08:42:13 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:29:01.632 08:42:13 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1609309 00:29:01.632 08:42:13 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:01.632 08:42:13 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 64 -o 16384 -w verify -t 30 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:29:01.632 08:42:13 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 1609309 00:29:01.632 08:42:13 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 1609309 ']' 00:29:01.632 08:42:13 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:01.632 08:42:13 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:01.632 08:42:13 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:01.632 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:01.632 08:42:13 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:01.632 08:42:13 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:29:01.632 [2024-07-23 08:42:13.530432] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:29:01.632 [2024-07-23 08:42:13.530525] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1609309 ] 00:29:01.632 [2024-07-23 08:42:13.652191] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:29:01.632 [2024-07-23 08:42:13.863465] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:01.632 [2024-07-23 08:42:13.863472] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:29:02.570 [2024-07-23 08:42:14.792258] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:29:02.570 08:42:14 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:02.570 08:42:14 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:29:02.570 08:42:14 compress_compdev -- compress/compress.sh@74 -- # create_vols 00:29:02.570 08:42:14 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:29:02.570 08:42:14 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:29:05.858 [2024-07-23 08:42:17.980270] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e00001b8c0 PMD being used: compress_qat 00:29:05.858 08:42:18 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:29:05.858 08:42:18 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:29:05.858 08:42:18 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:05.858 08:42:18 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:05.858 08:42:18 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:05.858 08:42:18 compress_compdev -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:29:05.858 08:42:18 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:05.858 08:42:18 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:29:05.858 [ 00:29:05.858 { 00:29:05.858 "name": "Nvme0n1", 00:29:05.858 "aliases": [ 00:29:05.858 "7871b7f6-c88a-41ad-8e29-6bf24c44b76a" 00:29:05.858 ], 00:29:05.858 "product_name": "NVMe disk", 00:29:05.858 "block_size": 512, 00:29:05.858 "num_blocks": 7814037168, 00:29:05.858 "uuid": "7871b7f6-c88a-41ad-8e29-6bf24c44b76a", 00:29:05.858 "assigned_rate_limits": { 00:29:05.858 "rw_ios_per_sec": 0, 00:29:05.858 "rw_mbytes_per_sec": 0, 00:29:05.858 "r_mbytes_per_sec": 0, 00:29:05.858 "w_mbytes_per_sec": 0 00:29:05.858 }, 00:29:05.858 "claimed": false, 00:29:05.858 "zoned": false, 00:29:05.858 "supported_io_types": { 00:29:05.858 "read": true, 00:29:05.858 "write": true, 00:29:05.858 "unmap": true, 00:29:05.858 "flush": true, 00:29:05.858 "reset": true, 00:29:05.858 "nvme_admin": true, 00:29:05.858 "nvme_io": true, 00:29:05.858 "nvme_io_md": false, 00:29:05.858 "write_zeroes": true, 00:29:05.858 "zcopy": false, 00:29:05.858 "get_zone_info": false, 00:29:05.858 "zone_management": false, 00:29:05.858 "zone_append": false, 00:29:05.858 "compare": false, 00:29:05.858 "compare_and_write": false, 00:29:05.858 "abort": true, 00:29:05.858 "seek_hole": false, 00:29:05.858 "seek_data": false, 00:29:05.858 "copy": false, 00:29:05.858 "nvme_iov_md": false 00:29:05.858 }, 00:29:05.858 "driver_specific": { 00:29:05.858 "nvme": [ 00:29:05.858 { 00:29:05.858 "pci_address": "0000:60:00.0", 00:29:05.858 "trid": { 00:29:05.858 "trtype": "PCIe", 00:29:05.858 "traddr": "0000:60:00.0" 00:29:05.858 }, 00:29:05.858 "ctrlr_data": { 00:29:05.858 "cntlid": 0, 00:29:05.858 "vendor_id": "0x8086", 00:29:05.858 "model_number": "INTEL SSDPE2KX040T8", 
00:29:05.858 "serial_number": "BTLJ81850BB64P0DGN", 00:29:05.858 "firmware_revision": "VDV1Y295", 00:29:05.858 "oacs": { 00:29:05.858 "security": 0, 00:29:05.858 "format": 1, 00:29:05.858 "firmware": 1, 00:29:05.858 "ns_manage": 1 00:29:05.858 }, 00:29:05.858 "multi_ctrlr": false, 00:29:05.858 "ana_reporting": false 00:29:05.858 }, 00:29:05.858 "vs": { 00:29:05.858 "nvme_version": "1.2" 00:29:05.858 }, 00:29:05.858 "ns_data": { 00:29:05.858 "id": 1, 00:29:05.858 "can_share": false 00:29:05.859 } 00:29:05.859 } 00:29:05.859 ], 00:29:05.859 "mp_policy": "active_passive" 00:29:05.859 } 00:29:05.859 } 00:29:05.859 ] 00:29:05.859 08:42:18 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:05.859 08:42:18 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:29:06.117 [2024-07-23 08:42:18.527476] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e00001ba80 PMD being used: compress_qat 00:29:08.020 95d7e304-d076-42d2-8c4f-7fba74911141 00:29:08.020 08:42:20 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:29:08.020 2e36ae50-b673-4b5b-9f38-7d3405a96a3f 00:29:08.020 08:42:20 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:29:08.020 08:42:20 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:29:08.020 08:42:20 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:08.020 08:42:20 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:08.020 08:42:20 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:08.020 08:42:20 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:08.020 08:42:20 compress_compdev -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:08.279 08:42:20 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:29:08.279 [ 00:29:08.279 { 00:29:08.279 "name": "2e36ae50-b673-4b5b-9f38-7d3405a96a3f", 00:29:08.279 "aliases": [ 00:29:08.279 "lvs0/lv0" 00:29:08.279 ], 00:29:08.279 "product_name": "Logical Volume", 00:29:08.279 "block_size": 512, 00:29:08.279 "num_blocks": 204800, 00:29:08.279 "uuid": "2e36ae50-b673-4b5b-9f38-7d3405a96a3f", 00:29:08.279 "assigned_rate_limits": { 00:29:08.279 "rw_ios_per_sec": 0, 00:29:08.279 "rw_mbytes_per_sec": 0, 00:29:08.279 "r_mbytes_per_sec": 0, 00:29:08.279 "w_mbytes_per_sec": 0 00:29:08.279 }, 00:29:08.279 "claimed": false, 00:29:08.279 "zoned": false, 00:29:08.279 "supported_io_types": { 00:29:08.279 "read": true, 00:29:08.279 "write": true, 00:29:08.279 "unmap": true, 00:29:08.279 "flush": false, 00:29:08.279 "reset": true, 00:29:08.279 "nvme_admin": false, 00:29:08.279 "nvme_io": false, 00:29:08.279 "nvme_io_md": false, 00:29:08.279 "write_zeroes": true, 00:29:08.279 "zcopy": false, 00:29:08.279 "get_zone_info": false, 00:29:08.279 "zone_management": false, 00:29:08.279 "zone_append": false, 00:29:08.279 "compare": false, 00:29:08.279 "compare_and_write": false, 00:29:08.279 "abort": false, 00:29:08.279 "seek_hole": true, 00:29:08.279 "seek_data": true, 00:29:08.279 "copy": false, 00:29:08.279 "nvme_iov_md": false 00:29:08.279 }, 00:29:08.279 "driver_specific": { 00:29:08.279 "lvol": { 00:29:08.279 "lvol_store_uuid": "95d7e304-d076-42d2-8c4f-7fba74911141", 00:29:08.279 "base_bdev": "Nvme0n1", 00:29:08.279 "thin_provision": true, 00:29:08.279 "num_allocated_clusters": 0, 00:29:08.279 "snapshot": false, 00:29:08.279 "clone": false, 00:29:08.279 "esnap_clone": false 00:29:08.279 } 00:29:08.279 } 00:29:08.279 } 00:29:08.279 ] 00:29:08.279 08:42:20 compress_compdev -- 
common/autotest_common.sh@905 -- # return 0 00:29:08.279 08:42:20 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:29:08.279 08:42:20 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:29:08.537 [2024-07-23 08:42:20.927354] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:29:08.537 COMP_lvs0/lv0 00:29:08.537 08:42:20 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:29:08.537 08:42:20 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:29:08.537 08:42:20 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:08.537 08:42:20 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:08.537 08:42:20 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:08.537 08:42:20 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:08.537 08:42:20 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:08.796 08:42:21 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:29:08.796 [ 00:29:08.796 { 00:29:08.796 "name": "COMP_lvs0/lv0", 00:29:08.796 "aliases": [ 00:29:08.796 "37631524-132f-50de-96b4-fba11642190a" 00:29:08.796 ], 00:29:08.796 "product_name": "compress", 00:29:08.796 "block_size": 512, 00:29:08.796 "num_blocks": 200704, 00:29:08.796 "uuid": "37631524-132f-50de-96b4-fba11642190a", 00:29:08.796 "assigned_rate_limits": { 00:29:08.796 "rw_ios_per_sec": 0, 00:29:08.796 "rw_mbytes_per_sec": 0, 00:29:08.796 "r_mbytes_per_sec": 0, 00:29:08.796 "w_mbytes_per_sec": 0 00:29:08.796 }, 00:29:08.796 "claimed": false, 00:29:08.796 "zoned": false, 
00:29:08.796 "supported_io_types": { 00:29:08.796 "read": true, 00:29:08.796 "write": true, 00:29:08.796 "unmap": false, 00:29:08.796 "flush": false, 00:29:08.796 "reset": false, 00:29:08.796 "nvme_admin": false, 00:29:08.796 "nvme_io": false, 00:29:08.796 "nvme_io_md": false, 00:29:08.796 "write_zeroes": true, 00:29:08.796 "zcopy": false, 00:29:08.796 "get_zone_info": false, 00:29:08.796 "zone_management": false, 00:29:08.796 "zone_append": false, 00:29:08.796 "compare": false, 00:29:08.796 "compare_and_write": false, 00:29:08.796 "abort": false, 00:29:08.796 "seek_hole": false, 00:29:08.796 "seek_data": false, 00:29:08.796 "copy": false, 00:29:08.796 "nvme_iov_md": false 00:29:08.796 }, 00:29:08.796 "driver_specific": { 00:29:08.796 "compress": { 00:29:08.796 "name": "COMP_lvs0/lv0", 00:29:08.796 "base_bdev_name": "2e36ae50-b673-4b5b-9f38-7d3405a96a3f", 00:29:08.796 "pm_path": "/tmp/pmem/f007d67e-2c18-40c6-adeb-44f0f2dbbfa4" 00:29:08.796 } 00:29:08.796 } 00:29:08.796 } 00:29:08.796 ] 00:29:08.796 08:42:21 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:08.796 08:42:21 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:29:09.055 [2024-07-23 08:42:21.380757] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e0000101e0 PMD being used: compress_qat 00:29:09.055 [2024-07-23 08:42:21.383692] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x60e00001bc40 PMD being used: compress_qat 00:29:09.055 Running I/O for 30 seconds... 
00:29:41.150 00:29:41.150 Latency(us) 00:29:41.150 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:41.150 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 64, IO size: 16384) 00:29:41.150 Verification LBA range: start 0x0 length 0xc40 00:29:41.150 COMP_lvs0/lv0 : 30.01 1661.11 25.95 0.00 0.00 38327.20 358.89 37948.46 00:29:41.150 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 64, IO size: 16384) 00:29:41.150 Verification LBA range: start 0xc40 length 0xc40 00:29:41.150 COMP_lvs0/lv0 : 30.01 5073.10 79.27 0.00 0.00 12502.41 333.53 24217.11 00:29:41.150 =================================================================================================================== 00:29:41.150 Total : 6734.21 105.22 0.00 0.00 18872.90 333.53 37948.46 00:29:41.150 0 00:29:41.150 08:42:51 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:29:41.150 08:42:51 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:29:41.150 08:42:51 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:29:41.150 08:42:51 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:29:41.150 08:42:51 compress_compdev -- compress/compress.sh@78 -- # killprocess 1609309 00:29:41.150 08:42:51 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 1609309 ']' 00:29:41.150 08:42:51 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 1609309 00:29:41.150 08:42:51 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:29:41.150 08:42:51 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:41.150 08:42:51 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1609309 00:29:41.150 08:42:51 compress_compdev -- common/autotest_common.sh@954 -- # 
process_name=reactor_1 00:29:41.150 08:42:51 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:29:41.150 08:42:51 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1609309' 00:29:41.150 killing process with pid 1609309 00:29:41.150 08:42:51 compress_compdev -- common/autotest_common.sh@967 -- # kill 1609309 00:29:41.150 Received shutdown signal, test time was about 30.000000 seconds 00:29:41.150 00:29:41.150 Latency(us) 00:29:41.150 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:41.150 =================================================================================================================== 00:29:41.150 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:29:41.150 08:42:51 compress_compdev -- common/autotest_common.sh@972 -- # wait 1609309 00:29:44.439 08:42:56 compress_compdev -- compress/compress.sh@95 -- # export TEST_TRANSPORT=tcp 00:29:44.439 08:42:56 compress_compdev -- compress/compress.sh@95 -- # TEST_TRANSPORT=tcp 00:29:44.439 08:42:56 compress_compdev -- compress/compress.sh@96 -- # NET_TYPE=virt 00:29:44.439 08:42:56 compress_compdev -- compress/compress.sh@96 -- # nvmftestinit 00:29:44.439 08:42:56 compress_compdev -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:29:44.439 08:42:56 compress_compdev -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:29:44.439 08:42:56 compress_compdev -- nvmf/common.sh@448 -- # prepare_net_devs 00:29:44.439 08:42:56 compress_compdev -- nvmf/common.sh@410 -- # local -g is_hw=no 00:29:44.439 08:42:56 compress_compdev -- nvmf/common.sh@412 -- # remove_spdk_ns 00:29:44.439 08:42:56 compress_compdev -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:29:44.439 08:42:56 compress_compdev -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:29:44.439 08:42:56 compress_compdev -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:29:44.439 08:42:56 compress_compdev -- 
nvmf/common.sh@414 -- # [[ virt != virt ]] 00:29:44.439 08:42:56 compress_compdev -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:29:44.439 08:42:56 compress_compdev -- nvmf/common.sh@423 -- # [[ virt == phy ]] 00:29:44.439 08:42:56 compress_compdev -- nvmf/common.sh@426 -- # [[ virt == phy-fallback ]] 00:29:44.439 08:42:56 compress_compdev -- nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:29:44.439 08:42:56 compress_compdev -- nvmf/common.sh@432 -- # nvmf_veth_init 00:29:44.439 08:42:56 compress_compdev -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:29:44.439 08:42:56 compress_compdev -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:29:44.440 08:42:56 compress_compdev -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:29:44.440 08:42:56 compress_compdev -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:29:44.440 08:42:56 compress_compdev -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:29:44.440 08:42:56 compress_compdev -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:29:44.440 08:42:56 compress_compdev -- nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:29:44.440 08:42:56 compress_compdev -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:29:44.440 08:42:56 compress_compdev -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:29:44.440 08:42:56 compress_compdev -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:29:44.440 08:42:56 compress_compdev -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:29:44.440 08:42:56 compress_compdev -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:29:44.440 08:42:56 compress_compdev -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:29:44.440 08:42:56 compress_compdev -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:29:44.440 Cannot find device "nvmf_tgt_br" 00:29:44.440 08:42:56 compress_compdev -- nvmf/common.sh@155 -- # true 
00:29:44.440 08:42:56 compress_compdev -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:29:44.440 Cannot find device "nvmf_tgt_br2" 00:29:44.440 08:42:56 compress_compdev -- nvmf/common.sh@156 -- # true 00:29:44.440 08:42:56 compress_compdev -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:29:44.440 08:42:56 compress_compdev -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:29:44.440 Cannot find device "nvmf_tgt_br" 00:29:44.440 08:42:56 compress_compdev -- nvmf/common.sh@158 -- # true 00:29:44.440 08:42:56 compress_compdev -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:29:44.440 Cannot find device "nvmf_tgt_br2" 00:29:44.440 08:42:56 compress_compdev -- nvmf/common.sh@159 -- # true 00:29:44.440 08:42:56 compress_compdev -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:29:44.440 08:42:56 compress_compdev -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:29:44.440 08:42:56 compress_compdev -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:29:44.440 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:29:44.440 08:42:56 compress_compdev -- nvmf/common.sh@162 -- # true 00:29:44.440 08:42:56 compress_compdev -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:29:44.440 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:29:44.440 08:42:56 compress_compdev -- nvmf/common.sh@163 -- # true 00:29:44.440 08:42:56 compress_compdev -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:29:44.440 08:42:56 compress_compdev -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:29:44.440 08:42:56 compress_compdev -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:29:44.440 08:42:56 compress_compdev -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:29:44.440 08:42:56 
compress_compdev -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:29:44.440 08:42:56 compress_compdev -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:29:44.698 08:42:56 compress_compdev -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:29:44.698 08:42:56 compress_compdev -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:29:44.698 08:42:56 compress_compdev -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:29:44.698 08:42:57 compress_compdev -- nvmf/common.sh@183 -- # ip link set nvmf_init_if up 00:29:44.698 08:42:57 compress_compdev -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:29:44.698 08:42:57 compress_compdev -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:29:44.698 08:42:57 compress_compdev -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:29:44.698 08:42:57 compress_compdev -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:29:44.698 08:42:57 compress_compdev -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:29:44.698 08:42:57 compress_compdev -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:29:44.698 08:42:57 compress_compdev -- nvmf/common.sh@192 -- # ip link add nvmf_br type bridge 00:29:44.698 08:42:57 compress_compdev -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:29:44.698 08:42:57 compress_compdev -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:29:44.698 08:42:57 compress_compdev -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:29:44.698 08:42:57 compress_compdev -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:29:44.698 08:42:57 compress_compdev -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:29:44.698 08:42:57 
compress_compdev -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:29:44.698 08:42:57 compress_compdev -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:29:44.698 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:29:44.698 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.088 ms 00:29:44.698 00:29:44.698 --- 10.0.0.2 ping statistics --- 00:29:44.698 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:29:44.698 rtt min/avg/max/mdev = 0.088/0.088/0.088/0.000 ms 00:29:44.698 08:42:57 compress_compdev -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:29:44.956 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 00:29:44.956 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.044 ms 00:29:44.956 00:29:44.956 --- 10.0.0.3 ping statistics --- 00:29:44.956 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:29:44.956 rtt min/avg/max/mdev = 0.044/0.044/0.044/0.000 ms 00:29:44.956 08:42:57 compress_compdev -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:29:44.956 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 
00:29:44.956 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.028 ms 00:29:44.956 00:29:44.956 --- 10.0.0.1 ping statistics --- 00:29:44.956 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:29:44.956 rtt min/avg/max/mdev = 0.028/0.028/0.028/0.000 ms 00:29:44.956 08:42:57 compress_compdev -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:29:44.956 08:42:57 compress_compdev -- nvmf/common.sh@433 -- # return 0 00:29:44.956 08:42:57 compress_compdev -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:29:44.956 08:42:57 compress_compdev -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:29:44.956 08:42:57 compress_compdev -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:29:44.956 08:42:57 compress_compdev -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:29:44.957 08:42:57 compress_compdev -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:29:44.957 08:42:57 compress_compdev -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:29:44.957 08:42:57 compress_compdev -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:29:44.957 08:42:57 compress_compdev -- compress/compress.sh@97 -- # nvmfappstart -m 0x7 00:29:44.957 08:42:57 compress_compdev -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:29:44.957 08:42:57 compress_compdev -- common/autotest_common.sh@722 -- # xtrace_disable 00:29:44.957 08:42:57 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:29:44.957 08:42:57 compress_compdev -- nvmf/common.sh@481 -- # nvmfpid=1617424 00:29:44.957 08:42:57 compress_compdev -- nvmf/common.sh@482 -- # waitforlisten 1617424 00:29:44.957 08:42:57 compress_compdev -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x7 00:29:44.957 08:42:57 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 1617424 ']' 00:29:44.957 08:42:57 compress_compdev -- common/autotest_common.sh@833 -- # local 
rpc_addr=/var/tmp/spdk.sock 00:29:44.957 08:42:57 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:44.957 08:42:57 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:44.957 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:44.957 08:42:57 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:44.957 08:42:57 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:29:44.957 [2024-07-23 08:42:57.352959] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:29:44.957 [2024-07-23 08:42:57.353049] [ DPDK EAL parameters: nvmf -c 0x7 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:29:45.215 [2024-07-23 08:42:57.481583] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:29:45.215 [2024-07-23 08:42:57.708237] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:29:45.215 [2024-07-23 08:42:57.708279] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:29:45.215 [2024-07-23 08:42:57.708294] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:29:45.215 [2024-07-23 08:42:57.708319] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:29:45.215 [2024-07-23 08:42:57.708332] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
00:29:45.215 [2024-07-23 08:42:57.708415] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:45.215 [2024-07-23 08:42:57.708482] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:45.215 [2024-07-23 08:42:57.708490] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:29:45.780 08:42:58 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:45.780 08:42:58 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:29:45.780 08:42:58 compress_compdev -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:29:45.780 08:42:58 compress_compdev -- common/autotest_common.sh@728 -- # xtrace_disable 00:29:45.780 08:42:58 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:29:45.780 08:42:58 compress_compdev -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:29:45.780 08:42:58 compress_compdev -- compress/compress.sh@98 -- # trap 'nvmftestfini; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:45.780 08:42:58 compress_compdev -- compress/compress.sh@101 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py nvmf_create_transport -t tcp -u 8192 00:29:46.037 [2024-07-23 08:42:58.338401] tcp.c: 677:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:29:46.037 08:42:58 compress_compdev -- compress/compress.sh@102 -- # create_vols 00:29:46.037 08:42:58 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:29:46.037 08:42:58 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:29:49.313 08:43:01 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:29:49.313 08:43:01 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:29:49.313 08:43:01 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 
00:29:49.313 08:43:01 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:49.313 08:43:01 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:49.313 08:43:01 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:49.313 08:43:01 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:49.313 08:43:01 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:29:49.313 [ 00:29:49.313 { 00:29:49.313 "name": "Nvme0n1", 00:29:49.313 "aliases": [ 00:29:49.313 "e213337d-9621-4568-89bb-d5cb91e22a43" 00:29:49.313 ], 00:29:49.313 "product_name": "NVMe disk", 00:29:49.313 "block_size": 512, 00:29:49.313 "num_blocks": 7814037168, 00:29:49.313 "uuid": "e213337d-9621-4568-89bb-d5cb91e22a43", 00:29:49.313 "assigned_rate_limits": { 00:29:49.313 "rw_ios_per_sec": 0, 00:29:49.313 "rw_mbytes_per_sec": 0, 00:29:49.313 "r_mbytes_per_sec": 0, 00:29:49.313 "w_mbytes_per_sec": 0 00:29:49.313 }, 00:29:49.313 "claimed": false, 00:29:49.313 "zoned": false, 00:29:49.313 "supported_io_types": { 00:29:49.313 "read": true, 00:29:49.313 "write": true, 00:29:49.313 "unmap": true, 00:29:49.313 "flush": true, 00:29:49.313 "reset": true, 00:29:49.313 "nvme_admin": true, 00:29:49.313 "nvme_io": true, 00:29:49.313 "nvme_io_md": false, 00:29:49.313 "write_zeroes": true, 00:29:49.313 "zcopy": false, 00:29:49.313 "get_zone_info": false, 00:29:49.313 "zone_management": false, 00:29:49.313 "zone_append": false, 00:29:49.313 "compare": false, 00:29:49.313 "compare_and_write": false, 00:29:49.313 "abort": true, 00:29:49.313 "seek_hole": false, 00:29:49.313 "seek_data": false, 00:29:49.313 "copy": false, 00:29:49.313 "nvme_iov_md": false 00:29:49.313 }, 00:29:49.313 "driver_specific": { 00:29:49.313 "nvme": [ 00:29:49.313 { 00:29:49.313 "pci_address": "0000:60:00.0", 
00:29:49.313 "trid": { 00:29:49.313 "trtype": "PCIe", 00:29:49.313 "traddr": "0000:60:00.0" 00:29:49.313 }, 00:29:49.313 "ctrlr_data": { 00:29:49.313 "cntlid": 0, 00:29:49.313 "vendor_id": "0x8086", 00:29:49.313 "model_number": "INTEL SSDPE2KX040T8", 00:29:49.313 "serial_number": "BTLJ81850BB64P0DGN", 00:29:49.313 "firmware_revision": "VDV1Y295", 00:29:49.313 "oacs": { 00:29:49.313 "security": 0, 00:29:49.313 "format": 1, 00:29:49.313 "firmware": 1, 00:29:49.313 "ns_manage": 1 00:29:49.313 }, 00:29:49.313 "multi_ctrlr": false, 00:29:49.313 "ana_reporting": false 00:29:49.313 }, 00:29:49.313 "vs": { 00:29:49.313 "nvme_version": "1.2" 00:29:49.313 }, 00:29:49.313 "ns_data": { 00:29:49.313 "id": 1, 00:29:49.313 "can_share": false 00:29:49.313 } 00:29:49.313 } 00:29:49.313 ], 00:29:49.313 "mp_policy": "active_passive" 00:29:49.313 } 00:29:49.313 } 00:29:49.313 ] 00:29:49.313 08:43:01 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:49.313 08:43:01 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:29:51.207 1d1ec1d4-f568-4e45-8206-aee3bcd88791 00:29:51.207 08:43:03 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:29:51.465 acfd18b6-8644-4532-aff6-4b3fa2e1a38f 00:29:51.465 08:43:03 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:29:51.465 08:43:03 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:29:51.465 08:43:03 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:51.465 08:43:03 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:51.465 08:43:03 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:51.465 08:43:03 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:51.465 
08:43:03 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:51.722 08:43:04 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:29:51.723 [ 00:29:51.723 { 00:29:51.723 "name": "acfd18b6-8644-4532-aff6-4b3fa2e1a38f", 00:29:51.723 "aliases": [ 00:29:51.723 "lvs0/lv0" 00:29:51.723 ], 00:29:51.723 "product_name": "Logical Volume", 00:29:51.723 "block_size": 512, 00:29:51.723 "num_blocks": 204800, 00:29:51.723 "uuid": "acfd18b6-8644-4532-aff6-4b3fa2e1a38f", 00:29:51.723 "assigned_rate_limits": { 00:29:51.723 "rw_ios_per_sec": 0, 00:29:51.723 "rw_mbytes_per_sec": 0, 00:29:51.723 "r_mbytes_per_sec": 0, 00:29:51.723 "w_mbytes_per_sec": 0 00:29:51.723 }, 00:29:51.723 "claimed": false, 00:29:51.723 "zoned": false, 00:29:51.723 "supported_io_types": { 00:29:51.723 "read": true, 00:29:51.723 "write": true, 00:29:51.723 "unmap": true, 00:29:51.723 "flush": false, 00:29:51.723 "reset": true, 00:29:51.723 "nvme_admin": false, 00:29:51.723 "nvme_io": false, 00:29:51.723 "nvme_io_md": false, 00:29:51.723 "write_zeroes": true, 00:29:51.723 "zcopy": false, 00:29:51.723 "get_zone_info": false, 00:29:51.723 "zone_management": false, 00:29:51.723 "zone_append": false, 00:29:51.723 "compare": false, 00:29:51.723 "compare_and_write": false, 00:29:51.723 "abort": false, 00:29:51.723 "seek_hole": true, 00:29:51.723 "seek_data": true, 00:29:51.723 "copy": false, 00:29:51.723 "nvme_iov_md": false 00:29:51.723 }, 00:29:51.723 "driver_specific": { 00:29:51.723 "lvol": { 00:29:51.723 "lvol_store_uuid": "1d1ec1d4-f568-4e45-8206-aee3bcd88791", 00:29:51.723 "base_bdev": "Nvme0n1", 00:29:51.723 "thin_provision": true, 00:29:51.723 "num_allocated_clusters": 0, 00:29:51.723 "snapshot": false, 00:29:51.723 "clone": false, 00:29:51.723 "esnap_clone": false 00:29:51.723 } 00:29:51.723 } 00:29:51.723 } 
00:29:51.723 ] 00:29:51.723 08:43:04 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:51.723 08:43:04 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:29:51.723 08:43:04 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:29:51.980 [2024-07-23 08:43:04.338792] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:29:51.980 COMP_lvs0/lv0 00:29:51.980 08:43:04 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:29:51.980 08:43:04 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:29:51.980 08:43:04 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:51.980 08:43:04 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:51.980 08:43:04 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:51.980 08:43:04 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:51.980 08:43:04 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:52.237 08:43:04 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:29:52.237 [ 00:29:52.237 { 00:29:52.237 "name": "COMP_lvs0/lv0", 00:29:52.237 "aliases": [ 00:29:52.237 "6651cb70-e57b-557c-86b7-22bd522b2b04" 00:29:52.237 ], 00:29:52.237 "product_name": "compress", 00:29:52.237 "block_size": 512, 00:29:52.237 "num_blocks": 200704, 00:29:52.237 "uuid": "6651cb70-e57b-557c-86b7-22bd522b2b04", 00:29:52.237 "assigned_rate_limits": { 00:29:52.237 "rw_ios_per_sec": 0, 00:29:52.237 "rw_mbytes_per_sec": 0, 00:29:52.237 "r_mbytes_per_sec": 0, 00:29:52.237 "w_mbytes_per_sec": 0 00:29:52.237 }, 00:29:52.237 
"claimed": false, 00:29:52.237 "zoned": false, 00:29:52.237 "supported_io_types": { 00:29:52.237 "read": true, 00:29:52.237 "write": true, 00:29:52.237 "unmap": false, 00:29:52.237 "flush": false, 00:29:52.237 "reset": false, 00:29:52.237 "nvme_admin": false, 00:29:52.237 "nvme_io": false, 00:29:52.237 "nvme_io_md": false, 00:29:52.237 "write_zeroes": true, 00:29:52.237 "zcopy": false, 00:29:52.237 "get_zone_info": false, 00:29:52.237 "zone_management": false, 00:29:52.237 "zone_append": false, 00:29:52.237 "compare": false, 00:29:52.237 "compare_and_write": false, 00:29:52.237 "abort": false, 00:29:52.237 "seek_hole": false, 00:29:52.237 "seek_data": false, 00:29:52.237 "copy": false, 00:29:52.237 "nvme_iov_md": false 00:29:52.237 }, 00:29:52.237 "driver_specific": { 00:29:52.237 "compress": { 00:29:52.237 "name": "COMP_lvs0/lv0", 00:29:52.237 "base_bdev_name": "acfd18b6-8644-4532-aff6-4b3fa2e1a38f", 00:29:52.237 "pm_path": "/tmp/pmem/4f246b44-d4f8-456c-b61f-83ed6d5e3b98" 00:29:52.237 } 00:29:52.237 } 00:29:52.237 } 00:29:52.237 ] 00:29:52.237 08:43:04 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:52.237 08:43:04 compress_compdev -- compress/compress.sh@103 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py nvmf_create_subsystem nqn.2016-06.io.spdk:cnode0 -a -s SPDK0 00:29:52.495 08:43:04 compress_compdev -- compress/compress.sh@104 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode0 COMP_lvs0/lv0 00:29:52.779 08:43:05 compress_compdev -- compress/compress.sh@105 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode0 -t tcp -a 10.0.0.2 -s 4420 00:29:52.779 [2024-07-23 08:43:05.256473] tcp.c:1006:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:29:53.047 08:43:05 compress_compdev -- compress/compress.sh@109 -- # perf_pid=1618958 00:29:53.047 08:43:05 
compress_compdev -- compress/compress.sh@108 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_nvme_perf -r 'trtype:tcp adrfam:IPv4 traddr:10.0.0.2 trsvcid:4420' -o 4096 -q 64 -s 512 -w randrw -t 30 -c 0x18 -M 50 00:29:53.047 08:43:05 compress_compdev -- compress/compress.sh@112 -- # trap 'killprocess $perf_pid; compress_err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:53.047 08:43:05 compress_compdev -- compress/compress.sh@113 -- # wait 1618958 00:29:53.047 [2024-07-23 08:43:05.563764] subsystem.c:1572:spdk_nvmf_subsystem_listener_allowed: *WARNING*: Allowing connection to discovery subsystem on TCP/10.0.0.2/4420, even though this listener was not added to the discovery subsystem. This behavior is deprecated and will be removed in a future release. 00:30:25.112 Initializing NVMe Controllers 00:30:25.112 Attached to NVMe over Fabrics controller at 10.0.0.2:4420: nqn.2016-06.io.spdk:cnode0 00:30:25.112 Associating TCP (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode0) NSID 1 with lcore 3 00:30:25.112 Associating TCP (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode0) NSID 1 with lcore 4 00:30:25.112 Initialization complete. Launching workers. 
00:30:25.112 ======================================================== 00:30:25.112 Latency(us) 00:30:25.113 Device Information : IOPS MiB/s Average min max 00:30:25.113 TCP (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode0) NSID 1 from core 3: 5686.33 22.21 11255.99 1562.94 30544.89 00:30:25.113 TCP (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode0) NSID 1 from core 4: 3584.17 14.00 17858.52 2931.62 35831.66 00:30:25.113 ======================================================== 00:30:25.113 Total : 9270.50 36.21 13808.67 1562.94 35831.66 00:30:25.113 00:30:25.113 08:43:35 compress_compdev -- compress/compress.sh@114 -- # destroy_vols 00:30:25.113 08:43:35 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:25.113 08:43:35 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:25.113 08:43:36 compress_compdev -- compress/compress.sh@116 -- # trap - SIGINT SIGTERM EXIT 00:30:25.113 08:43:36 compress_compdev -- compress/compress.sh@117 -- # nvmftestfini 00:30:25.113 08:43:36 compress_compdev -- nvmf/common.sh@488 -- # nvmfcleanup 00:30:25.113 08:43:36 compress_compdev -- nvmf/common.sh@117 -- # sync 00:30:25.113 08:43:36 compress_compdev -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:30:25.113 08:43:36 compress_compdev -- nvmf/common.sh@120 -- # set +e 00:30:25.113 08:43:36 compress_compdev -- nvmf/common.sh@121 -- # for i in {1..20} 00:30:25.113 08:43:36 compress_compdev -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:30:25.113 rmmod nvme_tcp 00:30:25.113 rmmod nvme_fabrics 00:30:25.113 rmmod nvme_keyring 00:30:25.113 08:43:36 compress_compdev -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:30:25.113 08:43:36 compress_compdev -- nvmf/common.sh@124 -- # set -e 00:30:25.113 08:43:36 compress_compdev -- nvmf/common.sh@125 -- # return 0 00:30:25.113 08:43:36 
compress_compdev -- nvmf/common.sh@489 -- # '[' -n 1617424 ']' 00:30:25.113 08:43:36 compress_compdev -- nvmf/common.sh@490 -- # killprocess 1617424 00:30:25.113 08:43:36 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 1617424 ']' 00:30:25.113 08:43:36 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 1617424 00:30:25.113 08:43:36 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:30:25.113 08:43:36 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:25.113 08:43:36 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1617424 00:30:25.113 08:43:36 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:25.113 08:43:36 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:25.113 08:43:36 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1617424' 00:30:25.113 killing process with pid 1617424 00:30:25.113 08:43:36 compress_compdev -- common/autotest_common.sh@967 -- # kill 1617424 00:30:25.113 08:43:36 compress_compdev -- common/autotest_common.sh@972 -- # wait 1617424 00:30:29.296 08:43:41 compress_compdev -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:30:29.296 08:43:41 compress_compdev -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:30:29.296 08:43:41 compress_compdev -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:30:29.296 08:43:41 compress_compdev -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:30:29.296 08:43:41 compress_compdev -- nvmf/common.sh@278 -- # remove_spdk_ns 00:30:29.296 08:43:41 compress_compdev -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:30:29.296 08:43:41 compress_compdev -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:30:29.296 08:43:41 compress_compdev -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:30:29.296 08:43:41 compress_compdev -- nvmf/common.sh@279 -- # ip 
-4 addr flush nvmf_init_if 00:30:29.296 08:43:41 compress_compdev -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:30:29.296 00:30:29.296 real 2m29.305s 00:30:29.296 user 6m45.135s 00:30:29.296 sys 0m11.943s 00:30:29.296 08:43:41 compress_compdev -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:29.296 08:43:41 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:30:29.296 ************************************ 00:30:29.297 END TEST compress_compdev 00:30:29.297 ************************************ 00:30:29.297 08:43:41 -- common/autotest_common.sh@1142 -- # return 0 00:30:29.297 08:43:41 -- spdk/autotest.sh@349 -- # run_test compress_isal /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:30:29.297 08:43:41 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:30:29.297 08:43:41 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:29.297 08:43:41 -- common/autotest_common.sh@10 -- # set +x 00:30:29.297 ************************************ 00:30:29.297 START TEST compress_isal 00:30:29.297 ************************************ 00:30:29.297 08:43:41 compress_isal -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:30:29.297 * Looking for test storage... 
00:30:29.297 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:30:29.297 08:43:41 compress_isal -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:30:29.297 08:43:41 compress_isal -- nvmf/common.sh@7 -- # uname -s 00:30:29.297 08:43:41 compress_isal -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:30:29.297 08:43:41 compress_isal -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:30:29.297 08:43:41 compress_isal -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:30:29.297 08:43:41 compress_isal -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:30:29.297 08:43:41 compress_isal -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:30:29.297 08:43:41 compress_isal -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:30:29.297 08:43:41 compress_isal -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:30:29.297 08:43:41 compress_isal -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:30:29.297 08:43:41 compress_isal -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:30:29.297 08:43:41 compress_isal -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:30:29.297 08:43:41 compress_isal -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:800e967b-538f-e911-906e-001635649f5c 00:30:29.297 08:43:41 compress_isal -- nvmf/common.sh@18 -- # NVME_HOSTID=800e967b-538f-e911-906e-001635649f5c 00:30:29.297 08:43:41 compress_isal -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:30:29.297 08:43:41 compress_isal -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:30:29.297 08:43:41 compress_isal -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:30:29.297 08:43:41 compress_isal -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:30:29.297 08:43:41 compress_isal -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:30:29.297 08:43:41 compress_isal -- 
scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:30:29.297 08:43:41 compress_isal -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:30:29.297 08:43:41 compress_isal -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:30:29.297 08:43:41 compress_isal -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:29.297 08:43:41 compress_isal -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:29.297 08:43:41 compress_isal -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:29.297 08:43:41 compress_isal -- paths/export.sh@5 -- # export PATH 00:30:29.297 08:43:41 compress_isal -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:29.297 08:43:41 compress_isal -- nvmf/common.sh@47 -- # : 0 00:30:29.297 08:43:41 compress_isal -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:30:29.297 08:43:41 compress_isal -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:30:29.297 08:43:41 compress_isal -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:30:29.297 08:43:41 compress_isal -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:30:29.297 08:43:41 compress_isal -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:30:29.297 08:43:41 compress_isal -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:30:29.297 08:43:41 compress_isal -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:30:29.297 08:43:41 compress_isal -- nvmf/common.sh@51 -- # have_pci_nics=0 00:30:29.297 08:43:41 compress_isal -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:29.297 08:43:41 compress_isal -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:30:29.297 08:43:41 compress_isal -- compress/compress.sh@82 -- # test_type=isal 00:30:29.297 08:43:41 compress_isal -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:30:29.297 08:43:41 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:30:29.297 08:43:41 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1625304 00:30:29.297 08:43:41 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:29.297 08:43:41 compress_isal -- 
compress/compress.sh@73 -- # waitforlisten 1625304 00:30:29.297 08:43:41 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 1625304 ']' 00:30:29.297 08:43:41 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:30:29.297 08:43:41 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:29.297 08:43:41 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:29.297 08:43:41 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:29.297 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:29.297 08:43:41 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:29.297 08:43:41 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:30:29.297 [2024-07-23 08:43:41.433620] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:30:29.297 [2024-07-23 08:43:41.433717] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1625304 ] 00:30:29.297 [2024-07-23 08:43:41.559523] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:29.297 [2024-07-23 08:43:41.777602] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:29.297 [2024-07-23 08:43:41.777615] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:29.864 08:43:42 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:29.864 08:43:42 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:30:29.864 08:43:42 compress_isal -- compress/compress.sh@74 -- # create_vols 00:30:29.864 08:43:42 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:29.864 08:43:42 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:33.151 08:43:45 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:33.151 08:43:45 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:33.151 08:43:45 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:33.151 08:43:45 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:33.151 08:43:45 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:33.151 08:43:45 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:33.151 08:43:45 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:33.151 08:43:45 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:33.151 [ 00:30:33.151 { 00:30:33.151 "name": "Nvme0n1", 00:30:33.151 "aliases": [ 00:30:33.151 "a7e7c179-133c-4d42-b305-5783b78c099c" 00:30:33.151 ], 00:30:33.151 "product_name": "NVMe disk", 00:30:33.151 "block_size": 512, 00:30:33.151 "num_blocks": 7814037168, 00:30:33.151 "uuid": "a7e7c179-133c-4d42-b305-5783b78c099c", 00:30:33.151 "assigned_rate_limits": { 00:30:33.151 "rw_ios_per_sec": 0, 00:30:33.151 "rw_mbytes_per_sec": 0, 00:30:33.151 "r_mbytes_per_sec": 0, 00:30:33.151 "w_mbytes_per_sec": 0 00:30:33.151 }, 00:30:33.151 "claimed": false, 00:30:33.151 "zoned": false, 00:30:33.151 "supported_io_types": { 00:30:33.151 "read": true, 00:30:33.151 "write": true, 00:30:33.151 "unmap": true, 00:30:33.151 "flush": true, 00:30:33.151 "reset": true, 00:30:33.151 "nvme_admin": true, 00:30:33.151 "nvme_io": true, 00:30:33.151 "nvme_io_md": false, 00:30:33.151 "write_zeroes": true, 00:30:33.151 "zcopy": false, 00:30:33.151 "get_zone_info": false, 00:30:33.151 "zone_management": false, 00:30:33.151 "zone_append": false, 00:30:33.151 "compare": false, 00:30:33.151 "compare_and_write": false, 00:30:33.151 "abort": true, 00:30:33.151 "seek_hole": false, 00:30:33.151 "seek_data": false, 00:30:33.151 "copy": false, 00:30:33.151 "nvme_iov_md": false 00:30:33.151 }, 00:30:33.151 "driver_specific": { 00:30:33.151 "nvme": [ 00:30:33.151 { 00:30:33.151 "pci_address": "0000:60:00.0", 00:30:33.151 "trid": { 00:30:33.151 "trtype": "PCIe", 00:30:33.151 "traddr": "0000:60:00.0" 00:30:33.151 }, 00:30:33.151 "ctrlr_data": { 00:30:33.151 "cntlid": 0, 00:30:33.151 "vendor_id": "0x8086", 00:30:33.151 "model_number": "INTEL SSDPE2KX040T8", 00:30:33.151 "serial_number": "BTLJ81850BB64P0DGN", 00:30:33.151 "firmware_revision": "VDV1Y295", 00:30:33.151 "oacs": { 00:30:33.151 "security": 0, 00:30:33.151 "format": 1, 00:30:33.151 "firmware": 1, 00:30:33.151 "ns_manage": 1 00:30:33.151 }, 00:30:33.151 "multi_ctrlr": false, 00:30:33.151 "ana_reporting": false 
00:30:33.151 }, 00:30:33.151 "vs": { 00:30:33.151 "nvme_version": "1.2" 00:30:33.151 }, 00:30:33.151 "ns_data": { 00:30:33.151 "id": 1, 00:30:33.151 "can_share": false 00:30:33.151 } 00:30:33.151 } 00:30:33.151 ], 00:30:33.151 "mp_policy": "active_passive" 00:30:33.151 } 00:30:33.151 } 00:30:33.151 ] 00:30:33.151 08:43:45 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:33.151 08:43:45 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:35.050 45ef956f-6ed3-4c4c-a795-79aa03098088 00:30:35.050 08:43:47 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:35.308 0d9b67fc-1c13-4229-8bea-950612f14a24 00:30:35.308 08:43:47 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:35.308 08:43:47 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:35.308 08:43:47 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:35.308 08:43:47 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:35.308 08:43:47 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:35.308 08:43:47 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:35.308 08:43:47 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:35.308 08:43:47 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:35.566 [ 00:30:35.566 { 00:30:35.566 "name": "0d9b67fc-1c13-4229-8bea-950612f14a24", 00:30:35.566 "aliases": [ 00:30:35.566 "lvs0/lv0" 00:30:35.566 ], 00:30:35.566 "product_name": "Logical Volume", 00:30:35.566 "block_size": 512, 00:30:35.566 "num_blocks": 204800, 00:30:35.566 
"uuid": "0d9b67fc-1c13-4229-8bea-950612f14a24", 00:30:35.566 "assigned_rate_limits": { 00:30:35.566 "rw_ios_per_sec": 0, 00:30:35.566 "rw_mbytes_per_sec": 0, 00:30:35.566 "r_mbytes_per_sec": 0, 00:30:35.566 "w_mbytes_per_sec": 0 00:30:35.566 }, 00:30:35.566 "claimed": false, 00:30:35.566 "zoned": false, 00:30:35.566 "supported_io_types": { 00:30:35.566 "read": true, 00:30:35.566 "write": true, 00:30:35.566 "unmap": true, 00:30:35.566 "flush": false, 00:30:35.566 "reset": true, 00:30:35.566 "nvme_admin": false, 00:30:35.566 "nvme_io": false, 00:30:35.566 "nvme_io_md": false, 00:30:35.566 "write_zeroes": true, 00:30:35.566 "zcopy": false, 00:30:35.566 "get_zone_info": false, 00:30:35.566 "zone_management": false, 00:30:35.566 "zone_append": false, 00:30:35.566 "compare": false, 00:30:35.566 "compare_and_write": false, 00:30:35.566 "abort": false, 00:30:35.566 "seek_hole": true, 00:30:35.566 "seek_data": true, 00:30:35.566 "copy": false, 00:30:35.566 "nvme_iov_md": false 00:30:35.566 }, 00:30:35.566 "driver_specific": { 00:30:35.566 "lvol": { 00:30:35.566 "lvol_store_uuid": "45ef956f-6ed3-4c4c-a795-79aa03098088", 00:30:35.566 "base_bdev": "Nvme0n1", 00:30:35.566 "thin_provision": true, 00:30:35.566 "num_allocated_clusters": 0, 00:30:35.566 "snapshot": false, 00:30:35.566 "clone": false, 00:30:35.566 "esnap_clone": false 00:30:35.566 } 00:30:35.566 } 00:30:35.566 } 00:30:35.566 ] 00:30:35.566 08:43:47 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:35.566 08:43:47 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:30:35.566 08:43:47 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:30:35.825 [2024-07-23 08:43:48.123370] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:35.825 COMP_lvs0/lv0 00:30:35.825 08:43:48 compress_isal -- compress/compress.sh@46 -- # 
waitforbdev COMP_lvs0/lv0 00:30:35.825 08:43:48 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:35.825 08:43:48 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:35.825 08:43:48 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:35.825 08:43:48 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:35.825 08:43:48 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:35.825 08:43:48 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:35.825 08:43:48 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:36.083 [ 00:30:36.083 { 00:30:36.083 "name": "COMP_lvs0/lv0", 00:30:36.083 "aliases": [ 00:30:36.083 "666ff0bc-0e9d-5a17-a67e-096cfa8453e8" 00:30:36.083 ], 00:30:36.083 "product_name": "compress", 00:30:36.083 "block_size": 512, 00:30:36.083 "num_blocks": 200704, 00:30:36.084 "uuid": "666ff0bc-0e9d-5a17-a67e-096cfa8453e8", 00:30:36.084 "assigned_rate_limits": { 00:30:36.084 "rw_ios_per_sec": 0, 00:30:36.084 "rw_mbytes_per_sec": 0, 00:30:36.084 "r_mbytes_per_sec": 0, 00:30:36.084 "w_mbytes_per_sec": 0 00:30:36.084 }, 00:30:36.084 "claimed": false, 00:30:36.084 "zoned": false, 00:30:36.084 "supported_io_types": { 00:30:36.084 "read": true, 00:30:36.084 "write": true, 00:30:36.084 "unmap": false, 00:30:36.084 "flush": false, 00:30:36.084 "reset": false, 00:30:36.084 "nvme_admin": false, 00:30:36.084 "nvme_io": false, 00:30:36.084 "nvme_io_md": false, 00:30:36.084 "write_zeroes": true, 00:30:36.084 "zcopy": false, 00:30:36.084 "get_zone_info": false, 00:30:36.084 "zone_management": false, 00:30:36.084 "zone_append": false, 00:30:36.084 "compare": false, 00:30:36.084 "compare_and_write": false, 00:30:36.084 "abort": false, 00:30:36.084 "seek_hole": false, 
00:30:36.084 "seek_data": false, 00:30:36.084 "copy": false, 00:30:36.084 "nvme_iov_md": false 00:30:36.084 }, 00:30:36.084 "driver_specific": { 00:30:36.084 "compress": { 00:30:36.084 "name": "COMP_lvs0/lv0", 00:30:36.084 "base_bdev_name": "0d9b67fc-1c13-4229-8bea-950612f14a24", 00:30:36.084 "pm_path": "/tmp/pmem/333a8a48-04a4-4576-a08f-98b94584e263" 00:30:36.084 } 00:30:36.084 } 00:30:36.084 } 00:30:36.084 ] 00:30:36.084 08:43:48 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:36.084 08:43:48 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:36.084 Running I/O for 3 seconds... 00:30:39.405 00:30:39.405 Latency(us) 00:30:39.405 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:39.405 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:30:39.405 Verification LBA range: start 0x0 length 0x3100 00:30:39.405 COMP_lvs0/lv0 : 3.01 2953.21 11.54 0.00 0.00 10791.77 59.00 18599.74 00:30:39.405 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:30:39.405 Verification LBA range: start 0x3100 length 0x3100 00:30:39.405 COMP_lvs0/lv0 : 3.01 2968.31 11.59 0.00 0.00 10731.46 60.22 19223.89 00:30:39.405 =================================================================================================================== 00:30:39.405 Total : 5921.52 23.13 0.00 0.00 10761.53 59.00 19223.89 00:30:39.405 0 00:30:39.405 08:43:51 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:30:39.405 08:43:51 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:39.405 08:43:51 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:39.664 08:43:52 compress_isal -- compress/compress.sh@77 -- # trap - 
SIGINT SIGTERM EXIT 00:30:39.664 08:43:52 compress_isal -- compress/compress.sh@78 -- # killprocess 1625304 00:30:39.664 08:43:52 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 1625304 ']' 00:30:39.664 08:43:52 compress_isal -- common/autotest_common.sh@952 -- # kill -0 1625304 00:30:39.664 08:43:52 compress_isal -- common/autotest_common.sh@953 -- # uname 00:30:39.664 08:43:52 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:39.664 08:43:52 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1625304 00:30:39.664 08:43:52 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:30:39.664 08:43:52 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:30:39.664 08:43:52 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1625304' 00:30:39.664 killing process with pid 1625304 00:30:39.664 08:43:52 compress_isal -- common/autotest_common.sh@967 -- # kill 1625304 00:30:39.664 Received shutdown signal, test time was about 3.000000 seconds 00:30:39.664 00:30:39.664 Latency(us) 00:30:39.664 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:39.664 =================================================================================================================== 00:30:39.664 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:39.664 08:43:52 compress_isal -- common/autotest_common.sh@972 -- # wait 1625304 00:30:44.926 08:43:56 compress_isal -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:30:44.926 08:43:56 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:30:44.926 08:43:56 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1628178 00:30:44.926 08:43:56 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:44.926 08:43:56 compress_isal -- compress/compress.sh@69 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:30:44.926 08:43:56 compress_isal -- compress/compress.sh@73 -- # waitforlisten 1628178 00:30:44.926 08:43:56 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 1628178 ']' 00:30:44.926 08:43:56 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:44.926 08:43:56 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:44.926 08:43:56 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:44.926 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:44.926 08:43:56 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:44.926 08:43:56 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:30:44.926 [2024-07-23 08:43:57.078664] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:30:44.926 [2024-07-23 08:43:57.078762] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1628178 ] 00:30:44.926 [2024-07-23 08:43:57.202932] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:44.926 [2024-07-23 08:43:57.427118] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:44.926 [2024-07-23 08:43:57.427131] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:45.495 08:43:57 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:45.495 08:43:57 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:30:45.495 08:43:57 compress_isal -- compress/compress.sh@74 -- # create_vols 512 00:30:45.495 08:43:57 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:45.495 08:43:57 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:48.771 08:44:00 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:48.771 08:44:00 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:48.771 08:44:00 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:48.771 08:44:00 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:48.771 08:44:00 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:48.771 08:44:00 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:48.771 08:44:00 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:48.771 08:44:01 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:48.771 [ 00:30:48.771 { 00:30:48.771 "name": "Nvme0n1", 00:30:48.771 "aliases": [ 00:30:48.771 "b774d7fe-4753-4590-9f00-0aabc978b5f7" 00:30:48.771 ], 00:30:48.771 "product_name": "NVMe disk", 00:30:48.771 "block_size": 512, 00:30:48.771 "num_blocks": 7814037168, 00:30:48.771 "uuid": "b774d7fe-4753-4590-9f00-0aabc978b5f7", 00:30:48.771 "assigned_rate_limits": { 00:30:48.771 "rw_ios_per_sec": 0, 00:30:48.771 "rw_mbytes_per_sec": 0, 00:30:48.771 "r_mbytes_per_sec": 0, 00:30:48.771 "w_mbytes_per_sec": 0 00:30:48.771 }, 00:30:48.771 "claimed": false, 00:30:48.771 "zoned": false, 00:30:48.771 "supported_io_types": { 00:30:48.771 "read": true, 00:30:48.771 "write": true, 00:30:48.771 "unmap": true, 00:30:48.771 "flush": true, 00:30:48.771 "reset": true, 00:30:48.771 "nvme_admin": true, 00:30:48.771 "nvme_io": true, 00:30:48.771 "nvme_io_md": false, 00:30:48.771 "write_zeroes": true, 00:30:48.771 "zcopy": false, 00:30:48.772 "get_zone_info": false, 00:30:48.772 "zone_management": false, 00:30:48.772 "zone_append": false, 00:30:48.772 "compare": false, 00:30:48.772 "compare_and_write": false, 00:30:48.772 "abort": true, 00:30:48.772 "seek_hole": false, 00:30:48.772 "seek_data": false, 00:30:48.772 "copy": false, 00:30:48.772 "nvme_iov_md": false 00:30:48.772 }, 00:30:48.772 "driver_specific": { 00:30:48.772 "nvme": [ 00:30:48.772 { 00:30:48.772 "pci_address": "0000:60:00.0", 00:30:48.772 "trid": { 00:30:48.772 "trtype": "PCIe", 00:30:48.772 "traddr": "0000:60:00.0" 00:30:48.772 }, 00:30:48.772 "ctrlr_data": { 00:30:48.772 "cntlid": 0, 00:30:48.772 "vendor_id": "0x8086", 00:30:48.772 "model_number": "INTEL SSDPE2KX040T8", 00:30:48.772 "serial_number": "BTLJ81850BB64P0DGN", 00:30:48.772 "firmware_revision": "VDV1Y295", 00:30:48.772 "oacs": { 00:30:48.772 "security": 0, 00:30:48.772 "format": 1, 00:30:48.772 "firmware": 1, 00:30:48.772 "ns_manage": 1 00:30:48.772 }, 00:30:48.772 "multi_ctrlr": false, 00:30:48.772 "ana_reporting": false 
00:30:48.772 }, 00:30:48.772 "vs": { 00:30:48.772 "nvme_version": "1.2" 00:30:48.772 }, 00:30:48.772 "ns_data": { 00:30:48.772 "id": 1, 00:30:48.772 "can_share": false 00:30:48.772 } 00:30:48.772 } 00:30:48.772 ], 00:30:48.772 "mp_policy": "active_passive" 00:30:48.772 } 00:30:48.772 } 00:30:48.772 ] 00:30:48.772 08:44:01 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:48.772 08:44:01 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:50.674 a4105f52-db70-4ff7-a9ec-7d034b692d23 00:30:50.674 08:44:03 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:50.933 3e16d3f6-eaca-4f0c-a4cc-d22aacd60289 00:30:50.933 08:44:03 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:50.933 08:44:03 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:50.933 08:44:03 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:50.933 08:44:03 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:50.933 08:44:03 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:50.933 08:44:03 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:50.933 08:44:03 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:50.933 08:44:03 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:51.192 [ 00:30:51.192 { 00:30:51.192 "name": "3e16d3f6-eaca-4f0c-a4cc-d22aacd60289", 00:30:51.192 "aliases": [ 00:30:51.192 "lvs0/lv0" 00:30:51.192 ], 00:30:51.192 "product_name": "Logical Volume", 00:30:51.192 "block_size": 512, 00:30:51.192 "num_blocks": 204800, 00:30:51.192 
"uuid": "3e16d3f6-eaca-4f0c-a4cc-d22aacd60289", 00:30:51.192 "assigned_rate_limits": { 00:30:51.192 "rw_ios_per_sec": 0, 00:30:51.192 "rw_mbytes_per_sec": 0, 00:30:51.192 "r_mbytes_per_sec": 0, 00:30:51.192 "w_mbytes_per_sec": 0 00:30:51.192 }, 00:30:51.192 "claimed": false, 00:30:51.192 "zoned": false, 00:30:51.192 "supported_io_types": { 00:30:51.192 "read": true, 00:30:51.192 "write": true, 00:30:51.192 "unmap": true, 00:30:51.192 "flush": false, 00:30:51.192 "reset": true, 00:30:51.192 "nvme_admin": false, 00:30:51.192 "nvme_io": false, 00:30:51.192 "nvme_io_md": false, 00:30:51.192 "write_zeroes": true, 00:30:51.192 "zcopy": false, 00:30:51.192 "get_zone_info": false, 00:30:51.192 "zone_management": false, 00:30:51.192 "zone_append": false, 00:30:51.192 "compare": false, 00:30:51.192 "compare_and_write": false, 00:30:51.192 "abort": false, 00:30:51.192 "seek_hole": true, 00:30:51.192 "seek_data": true, 00:30:51.192 "copy": false, 00:30:51.192 "nvme_iov_md": false 00:30:51.192 }, 00:30:51.192 "driver_specific": { 00:30:51.192 "lvol": { 00:30:51.192 "lvol_store_uuid": "a4105f52-db70-4ff7-a9ec-7d034b692d23", 00:30:51.192 "base_bdev": "Nvme0n1", 00:30:51.192 "thin_provision": true, 00:30:51.192 "num_allocated_clusters": 0, 00:30:51.192 "snapshot": false, 00:30:51.192 "clone": false, 00:30:51.192 "esnap_clone": false 00:30:51.192 } 00:30:51.192 } 00:30:51.192 } 00:30:51.192 ] 00:30:51.192 08:44:03 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:51.192 08:44:03 compress_isal -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:30:51.192 08:44:03 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:30:51.450 [2024-07-23 08:44:03.774047] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:51.450 COMP_lvs0/lv0 00:30:51.450 08:44:03 compress_isal -- compress/compress.sh@46 -- # 
waitforbdev COMP_lvs0/lv0 00:30:51.450 08:44:03 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:51.450 08:44:03 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:51.450 08:44:03 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:51.450 08:44:03 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:51.450 08:44:03 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:51.450 08:44:03 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:51.450 08:44:03 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:51.709 [ 00:30:51.709 { 00:30:51.709 "name": "COMP_lvs0/lv0", 00:30:51.709 "aliases": [ 00:30:51.709 "ce2480b6-01a4-5c3f-9cfb-0da2d2ee960a" 00:30:51.709 ], 00:30:51.709 "product_name": "compress", 00:30:51.709 "block_size": 512, 00:30:51.709 "num_blocks": 200704, 00:30:51.709 "uuid": "ce2480b6-01a4-5c3f-9cfb-0da2d2ee960a", 00:30:51.709 "assigned_rate_limits": { 00:30:51.709 "rw_ios_per_sec": 0, 00:30:51.709 "rw_mbytes_per_sec": 0, 00:30:51.709 "r_mbytes_per_sec": 0, 00:30:51.709 "w_mbytes_per_sec": 0 00:30:51.709 }, 00:30:51.709 "claimed": false, 00:30:51.709 "zoned": false, 00:30:51.709 "supported_io_types": { 00:30:51.709 "read": true, 00:30:51.709 "write": true, 00:30:51.709 "unmap": false, 00:30:51.709 "flush": false, 00:30:51.709 "reset": false, 00:30:51.709 "nvme_admin": false, 00:30:51.709 "nvme_io": false, 00:30:51.709 "nvme_io_md": false, 00:30:51.709 "write_zeroes": true, 00:30:51.709 "zcopy": false, 00:30:51.709 "get_zone_info": false, 00:30:51.709 "zone_management": false, 00:30:51.709 "zone_append": false, 00:30:51.709 "compare": false, 00:30:51.709 "compare_and_write": false, 00:30:51.709 "abort": false, 00:30:51.709 "seek_hole": false, 
00:30:51.709 "seek_data": false, 00:30:51.709 "copy": false, 00:30:51.709 "nvme_iov_md": false 00:30:51.709 }, 00:30:51.709 "driver_specific": { 00:30:51.709 "compress": { 00:30:51.709 "name": "COMP_lvs0/lv0", 00:30:51.709 "base_bdev_name": "3e16d3f6-eaca-4f0c-a4cc-d22aacd60289", 00:30:51.709 "pm_path": "/tmp/pmem/73bb6cc7-063b-4ed8-b22a-ce71796c2b5d" 00:30:51.709 } 00:30:51.709 } 00:30:51.709 } 00:30:51.709 ] 00:30:51.709 08:44:04 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:51.709 08:44:04 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:51.709 Running I/O for 3 seconds... 00:30:54.993 00:30:54.993 Latency(us) 00:30:54.993 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:54.993 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:30:54.993 Verification LBA range: start 0x0 length 0x3100 00:30:54.993 COMP_lvs0/lv0 : 3.02 3011.17 11.76 0.00 0.00 10558.61 58.76 18474.91 00:30:54.993 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:30:54.993 Verification LBA range: start 0x3100 length 0x3100 00:30:54.993 COMP_lvs0/lv0 : 3.01 3049.52 11.91 0.00 0.00 10430.73 57.78 19223.89 00:30:54.993 =================================================================================================================== 00:30:54.993 Total : 6060.68 23.67 0.00 0.00 10494.31 57.78 19223.89 00:30:54.993 0 00:30:54.993 08:44:07 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:30:54.993 08:44:07 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:54.993 08:44:07 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:55.251 08:44:07 compress_isal -- compress/compress.sh@77 -- # trap - 
SIGINT SIGTERM EXIT 00:30:55.251 08:44:07 compress_isal -- compress/compress.sh@78 -- # killprocess 1628178 00:30:55.251 08:44:07 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 1628178 ']' 00:30:55.251 08:44:07 compress_isal -- common/autotest_common.sh@952 -- # kill -0 1628178 00:30:55.251 08:44:07 compress_isal -- common/autotest_common.sh@953 -- # uname 00:30:55.251 08:44:07 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:55.251 08:44:07 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1628178 00:30:55.251 08:44:07 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:30:55.251 08:44:07 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:30:55.251 08:44:07 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1628178' 00:30:55.251 killing process with pid 1628178 00:30:55.251 08:44:07 compress_isal -- common/autotest_common.sh@967 -- # kill 1628178 00:30:55.251 Received shutdown signal, test time was about 3.000000 seconds 00:30:55.251 00:30:55.251 Latency(us) 00:30:55.251 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:55.251 =================================================================================================================== 00:30:55.251 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:55.251 08:44:07 compress_isal -- common/autotest_common.sh@972 -- # wait 1628178 00:31:00.514 08:44:12 compress_isal -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:31:00.514 08:44:12 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:31:00.514 08:44:12 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1631072 00:31:00.514 08:44:12 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:00.514 08:44:12 compress_isal -- compress/compress.sh@69 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:31:00.514 08:44:12 compress_isal -- compress/compress.sh@73 -- # waitforlisten 1631072 00:31:00.514 08:44:12 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 1631072 ']' 00:31:00.514 08:44:12 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:00.514 08:44:12 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:00.514 08:44:12 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:00.514 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:00.514 08:44:12 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:00.514 08:44:12 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:31:00.514 [2024-07-23 08:44:12.700323] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:31:00.514 [2024-07-23 08:44:12.700417] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1631072 ] 00:31:00.514 [2024-07-23 08:44:12.821711] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:00.772 [2024-07-23 08:44:13.043704] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:00.772 [2024-07-23 08:44:13.043710] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:01.030 08:44:13 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:01.030 08:44:13 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:31:01.031 08:44:13 compress_isal -- compress/compress.sh@74 -- # create_vols 4096 00:31:01.031 08:44:13 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:01.031 08:44:13 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:04.344 08:44:16 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:04.344 08:44:16 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:31:04.344 08:44:16 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:04.344 08:44:16 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:04.344 08:44:16 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:04.344 08:44:16 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:04.344 08:44:16 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:04.344 08:44:16 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:04.603 [ 00:31:04.603 { 00:31:04.603 "name": "Nvme0n1", 00:31:04.603 "aliases": [ 00:31:04.603 "37f8f964-af99-4ce5-82e5-85438a8e61b3" 00:31:04.603 ], 00:31:04.603 "product_name": "NVMe disk", 00:31:04.603 "block_size": 512, 00:31:04.603 "num_blocks": 7814037168, 00:31:04.603 "uuid": "37f8f964-af99-4ce5-82e5-85438a8e61b3", 00:31:04.603 "assigned_rate_limits": { 00:31:04.603 "rw_ios_per_sec": 0, 00:31:04.603 "rw_mbytes_per_sec": 0, 00:31:04.603 "r_mbytes_per_sec": 0, 00:31:04.603 "w_mbytes_per_sec": 0 00:31:04.603 }, 00:31:04.603 "claimed": false, 00:31:04.603 "zoned": false, 00:31:04.603 "supported_io_types": { 00:31:04.603 "read": true, 00:31:04.603 "write": true, 00:31:04.603 "unmap": true, 00:31:04.603 "flush": true, 00:31:04.603 "reset": true, 00:31:04.603 "nvme_admin": true, 00:31:04.603 "nvme_io": true, 00:31:04.603 "nvme_io_md": false, 00:31:04.603 "write_zeroes": true, 00:31:04.603 "zcopy": false, 00:31:04.603 "get_zone_info": false, 00:31:04.603 "zone_management": false, 00:31:04.603 "zone_append": false, 00:31:04.603 "compare": false, 00:31:04.603 "compare_and_write": false, 00:31:04.603 "abort": true, 00:31:04.603 "seek_hole": false, 00:31:04.603 "seek_data": false, 00:31:04.603 "copy": false, 00:31:04.603 "nvme_iov_md": false 00:31:04.603 }, 00:31:04.603 "driver_specific": { 00:31:04.603 "nvme": [ 00:31:04.603 { 00:31:04.603 "pci_address": "0000:60:00.0", 00:31:04.603 "trid": { 00:31:04.603 "trtype": "PCIe", 00:31:04.603 "traddr": "0000:60:00.0" 00:31:04.603 }, 00:31:04.603 "ctrlr_data": { 00:31:04.603 "cntlid": 0, 00:31:04.603 "vendor_id": "0x8086", 00:31:04.603 "model_number": "INTEL SSDPE2KX040T8", 00:31:04.603 "serial_number": "BTLJ81850BB64P0DGN", 00:31:04.603 "firmware_revision": "VDV1Y295", 00:31:04.603 "oacs": { 00:31:04.603 "security": 0, 00:31:04.603 "format": 1, 00:31:04.603 "firmware": 1, 00:31:04.603 "ns_manage": 1 00:31:04.603 }, 00:31:04.603 "multi_ctrlr": false, 00:31:04.603 "ana_reporting": false 
00:31:04.603 }, 00:31:04.603 "vs": { 00:31:04.603 "nvme_version": "1.2" 00:31:04.603 }, 00:31:04.603 "ns_data": { 00:31:04.603 "id": 1, 00:31:04.603 "can_share": false 00:31:04.603 } 00:31:04.603 } 00:31:04.603 ], 00:31:04.603 "mp_policy": "active_passive" 00:31:04.603 } 00:31:04.603 } 00:31:04.603 ] 00:31:04.603 08:44:16 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:04.603 08:44:16 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:06.501 c9918de2-7276-4f1d-93af-12fcb95e4fbe 00:31:06.501 08:44:18 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:06.501 abfe3709-a2cf-4b3a-84b8-cf99fceb17ce 00:31:06.501 08:44:18 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:06.501 08:44:18 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:31:06.501 08:44:18 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:06.501 08:44:18 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:06.501 08:44:18 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:06.501 08:44:18 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:06.501 08:44:18 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:06.759 08:44:19 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:06.759 [ 00:31:06.759 { 00:31:06.759 "name": "abfe3709-a2cf-4b3a-84b8-cf99fceb17ce", 00:31:06.759 "aliases": [ 00:31:06.759 "lvs0/lv0" 00:31:06.759 ], 00:31:06.759 "product_name": "Logical Volume", 00:31:06.759 "block_size": 512, 00:31:06.759 "num_blocks": 204800, 00:31:06.759 
"uuid": "abfe3709-a2cf-4b3a-84b8-cf99fceb17ce", 00:31:06.759 "assigned_rate_limits": { 00:31:06.759 "rw_ios_per_sec": 0, 00:31:06.759 "rw_mbytes_per_sec": 0, 00:31:06.759 "r_mbytes_per_sec": 0, 00:31:06.759 "w_mbytes_per_sec": 0 00:31:06.759 }, 00:31:06.759 "claimed": false, 00:31:06.759 "zoned": false, 00:31:06.759 "supported_io_types": { 00:31:06.760 "read": true, 00:31:06.760 "write": true, 00:31:06.760 "unmap": true, 00:31:06.760 "flush": false, 00:31:06.760 "reset": true, 00:31:06.760 "nvme_admin": false, 00:31:06.760 "nvme_io": false, 00:31:06.760 "nvme_io_md": false, 00:31:06.760 "write_zeroes": true, 00:31:06.760 "zcopy": false, 00:31:06.760 "get_zone_info": false, 00:31:06.760 "zone_management": false, 00:31:06.760 "zone_append": false, 00:31:06.760 "compare": false, 00:31:06.760 "compare_and_write": false, 00:31:06.760 "abort": false, 00:31:06.760 "seek_hole": true, 00:31:06.760 "seek_data": true, 00:31:06.760 "copy": false, 00:31:06.760 "nvme_iov_md": false 00:31:06.760 }, 00:31:06.760 "driver_specific": { 00:31:06.760 "lvol": { 00:31:06.760 "lvol_store_uuid": "c9918de2-7276-4f1d-93af-12fcb95e4fbe", 00:31:06.760 "base_bdev": "Nvme0n1", 00:31:06.760 "thin_provision": true, 00:31:06.760 "num_allocated_clusters": 0, 00:31:06.760 "snapshot": false, 00:31:06.760 "clone": false, 00:31:06.760 "esnap_clone": false 00:31:06.760 } 00:31:06.760 } 00:31:06.760 } 00:31:06.760 ] 00:31:06.760 08:44:19 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:06.760 08:44:19 compress_isal -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:31:06.760 08:44:19 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:31:07.018 [2024-07-23 08:44:19.383784] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:07.018 COMP_lvs0/lv0 00:31:07.018 08:44:19 compress_isal -- compress/compress.sh@46 -- # 
waitforbdev COMP_lvs0/lv0 00:31:07.018 08:44:19 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:31:07.018 08:44:19 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:07.018 08:44:19 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:07.018 08:44:19 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:07.018 08:44:19 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:07.018 08:44:19 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:07.276 08:44:19 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:07.276 [ 00:31:07.276 { 00:31:07.276 "name": "COMP_lvs0/lv0", 00:31:07.276 "aliases": [ 00:31:07.276 "4524ab39-3a31-5b3f-bcf0-783251149797" 00:31:07.276 ], 00:31:07.276 "product_name": "compress", 00:31:07.276 "block_size": 4096, 00:31:07.276 "num_blocks": 25088, 00:31:07.276 "uuid": "4524ab39-3a31-5b3f-bcf0-783251149797", 00:31:07.276 "assigned_rate_limits": { 00:31:07.276 "rw_ios_per_sec": 0, 00:31:07.276 "rw_mbytes_per_sec": 0, 00:31:07.276 "r_mbytes_per_sec": 0, 00:31:07.276 "w_mbytes_per_sec": 0 00:31:07.276 }, 00:31:07.276 "claimed": false, 00:31:07.276 "zoned": false, 00:31:07.276 "supported_io_types": { 00:31:07.276 "read": true, 00:31:07.276 "write": true, 00:31:07.276 "unmap": false, 00:31:07.276 "flush": false, 00:31:07.276 "reset": false, 00:31:07.276 "nvme_admin": false, 00:31:07.276 "nvme_io": false, 00:31:07.276 "nvme_io_md": false, 00:31:07.276 "write_zeroes": true, 00:31:07.276 "zcopy": false, 00:31:07.276 "get_zone_info": false, 00:31:07.276 "zone_management": false, 00:31:07.276 "zone_append": false, 00:31:07.276 "compare": false, 00:31:07.276 "compare_and_write": false, 00:31:07.276 "abort": false, 00:31:07.276 "seek_hole": false, 
00:31:07.276 "seek_data": false, 00:31:07.276 "copy": false, 00:31:07.276 "nvme_iov_md": false 00:31:07.276 }, 00:31:07.276 "driver_specific": { 00:31:07.276 "compress": { 00:31:07.276 "name": "COMP_lvs0/lv0", 00:31:07.276 "base_bdev_name": "abfe3709-a2cf-4b3a-84b8-cf99fceb17ce", 00:31:07.276 "pm_path": "/tmp/pmem/dd154e59-7533-452b-8349-c3f20a9cf9ec" 00:31:07.276 } 00:31:07.276 } 00:31:07.276 } 00:31:07.276 ] 00:31:07.276 08:44:19 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:07.276 08:44:19 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:07.533 Running I/O for 3 seconds... 00:31:10.814 00:31:10.814 Latency(us) 00:31:10.814 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:10.814 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:31:10.814 Verification LBA range: start 0x0 length 0x3100 00:31:10.814 COMP_lvs0/lv0 : 3.01 2945.72 11.51 0.00 0.00 10816.80 62.17 18100.42 00:31:10.814 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:31:10.814 Verification LBA range: start 0x3100 length 0x3100 00:31:10.814 COMP_lvs0/lv0 : 3.01 2961.64 11.57 0.00 0.00 10760.98 61.93 16976.94 00:31:10.814 =================================================================================================================== 00:31:10.814 Total : 5907.36 23.08 0.00 0.00 10788.82 61.93 18100.42 00:31:10.814 0 00:31:10.814 08:44:22 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:31:10.814 08:44:22 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:10.814 08:44:23 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:31:10.814 08:44:23 compress_isal -- compress/compress.sh@77 -- # trap - 
SIGINT SIGTERM EXIT 00:31:10.814 08:44:23 compress_isal -- compress/compress.sh@78 -- # killprocess 1631072 00:31:10.814 08:44:23 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 1631072 ']' 00:31:10.814 08:44:23 compress_isal -- common/autotest_common.sh@952 -- # kill -0 1631072 00:31:10.814 08:44:23 compress_isal -- common/autotest_common.sh@953 -- # uname 00:31:10.814 08:44:23 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:10.814 08:44:23 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1631072 00:31:10.814 08:44:23 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:31:10.814 08:44:23 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:31:10.814 08:44:23 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1631072' 00:31:10.814 killing process with pid 1631072 00:31:10.814 08:44:23 compress_isal -- common/autotest_common.sh@967 -- # kill 1631072 00:31:10.814 Received shutdown signal, test time was about 3.000000 seconds 00:31:10.814 00:31:10.814 Latency(us) 00:31:10.814 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:10.814 =================================================================================================================== 00:31:10.814 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:10.814 08:44:23 compress_isal -- common/autotest_common.sh@972 -- # wait 1631072 00:31:16.075 08:44:28 compress_isal -- compress/compress.sh@89 -- # run_bdevio 00:31:16.075 08:44:28 compress_isal -- compress/compress.sh@50 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:31:16.075 08:44:28 compress_isal -- compress/compress.sh@55 -- # bdevio_pid=1633902 00:31:16.075 08:44:28 compress_isal -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:16.075 08:44:28 compress_isal -- compress/compress.sh@53 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w 00:31:16.075 08:44:28 compress_isal -- compress/compress.sh@57 -- # waitforlisten 1633902 00:31:16.075 08:44:28 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 1633902 ']' 00:31:16.075 08:44:28 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:16.075 08:44:28 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:16.075 08:44:28 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:16.075 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:16.075 08:44:28 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:16.075 08:44:28 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:31:16.075 [2024-07-23 08:44:28.342411] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:31:16.075 [2024-07-23 08:44:28.342500] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1633902 ] 00:31:16.075 [2024-07-23 08:44:28.464558] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:31:16.334 [2024-07-23 08:44:28.676104] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:16.334 [2024-07-23 08:44:28.676170] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:16.334 [2024-07-23 08:44:28.676177] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:16.900 08:44:29 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:16.900 08:44:29 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:31:16.900 08:44:29 compress_isal -- compress/compress.sh@58 -- # create_vols 00:31:16.900 08:44:29 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:16.900 08:44:29 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:20.181 08:44:32 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:20.181 08:44:32 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:31:20.181 08:44:32 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:20.181 08:44:32 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:20.181 08:44:32 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:20.181 08:44:32 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:20.181 08:44:32 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:20.181 08:44:32 compress_isal -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:20.182 [ 00:31:20.182 { 00:31:20.182 "name": "Nvme0n1", 00:31:20.182 "aliases": [ 00:31:20.182 "f66f04fe-472c-4271-97c4-7f5c537f4a4f" 00:31:20.182 ], 00:31:20.182 "product_name": "NVMe disk", 00:31:20.182 "block_size": 512, 00:31:20.182 "num_blocks": 7814037168, 00:31:20.182 "uuid": "f66f04fe-472c-4271-97c4-7f5c537f4a4f", 00:31:20.182 "assigned_rate_limits": { 00:31:20.182 "rw_ios_per_sec": 0, 00:31:20.182 "rw_mbytes_per_sec": 0, 00:31:20.182 "r_mbytes_per_sec": 0, 00:31:20.182 "w_mbytes_per_sec": 0 00:31:20.182 }, 00:31:20.182 "claimed": false, 00:31:20.182 "zoned": false, 00:31:20.182 "supported_io_types": { 00:31:20.182 "read": true, 00:31:20.182 "write": true, 00:31:20.182 "unmap": true, 00:31:20.182 "flush": true, 00:31:20.182 "reset": true, 00:31:20.182 "nvme_admin": true, 00:31:20.182 "nvme_io": true, 00:31:20.182 "nvme_io_md": false, 00:31:20.182 "write_zeroes": true, 00:31:20.182 "zcopy": false, 00:31:20.182 "get_zone_info": false, 00:31:20.182 "zone_management": false, 00:31:20.182 "zone_append": false, 00:31:20.182 "compare": false, 00:31:20.182 "compare_and_write": false, 00:31:20.182 "abort": true, 00:31:20.182 "seek_hole": false, 00:31:20.182 "seek_data": false, 00:31:20.182 "copy": false, 00:31:20.182 "nvme_iov_md": false 00:31:20.182 }, 00:31:20.182 "driver_specific": { 00:31:20.182 "nvme": [ 00:31:20.182 { 00:31:20.182 "pci_address": "0000:60:00.0", 00:31:20.182 "trid": { 00:31:20.182 "trtype": "PCIe", 00:31:20.182 "traddr": "0000:60:00.0" 00:31:20.182 }, 00:31:20.182 "ctrlr_data": { 00:31:20.182 "cntlid": 0, 00:31:20.182 "vendor_id": "0x8086", 00:31:20.182 "model_number": "INTEL SSDPE2KX040T8", 00:31:20.182 "serial_number": "BTLJ81850BB64P0DGN", 00:31:20.182 "firmware_revision": "VDV1Y295", 00:31:20.182 "oacs": { 00:31:20.182 "security": 0, 00:31:20.182 "format": 1, 00:31:20.182 "firmware": 1, 00:31:20.182 
"ns_manage": 1 00:31:20.182 }, 00:31:20.182 "multi_ctrlr": false, 00:31:20.182 "ana_reporting": false 00:31:20.182 }, 00:31:20.182 "vs": { 00:31:20.182 "nvme_version": "1.2" 00:31:20.182 }, 00:31:20.182 "ns_data": { 00:31:20.182 "id": 1, 00:31:20.182 "can_share": false 00:31:20.182 } 00:31:20.182 } 00:31:20.182 ], 00:31:20.182 "mp_policy": "active_passive" 00:31:20.182 } 00:31:20.182 } 00:31:20.182 ] 00:31:20.182 08:44:32 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:20.182 08:44:32 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:22.081 c05bb765-9d51-4f2f-9ee0-6bc201fa5d31 00:31:22.081 08:44:34 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:22.081 023098c5-0f22-48bd-88e0-16109fef143d 00:31:22.339 08:44:34 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:22.339 08:44:34 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:31:22.339 08:44:34 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:22.339 08:44:34 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:22.339 08:44:34 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:22.339 08:44:34 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:22.339 08:44:34 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:22.339 08:44:34 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:22.597 [ 00:31:22.597 { 00:31:22.597 "name": "023098c5-0f22-48bd-88e0-16109fef143d", 00:31:22.597 "aliases": [ 00:31:22.597 "lvs0/lv0" 00:31:22.597 ], 00:31:22.597 
"product_name": "Logical Volume", 00:31:22.597 "block_size": 512, 00:31:22.597 "num_blocks": 204800, 00:31:22.597 "uuid": "023098c5-0f22-48bd-88e0-16109fef143d", 00:31:22.597 "assigned_rate_limits": { 00:31:22.597 "rw_ios_per_sec": 0, 00:31:22.597 "rw_mbytes_per_sec": 0, 00:31:22.597 "r_mbytes_per_sec": 0, 00:31:22.597 "w_mbytes_per_sec": 0 00:31:22.597 }, 00:31:22.597 "claimed": false, 00:31:22.597 "zoned": false, 00:31:22.597 "supported_io_types": { 00:31:22.597 "read": true, 00:31:22.597 "write": true, 00:31:22.597 "unmap": true, 00:31:22.597 "flush": false, 00:31:22.597 "reset": true, 00:31:22.597 "nvme_admin": false, 00:31:22.597 "nvme_io": false, 00:31:22.597 "nvme_io_md": false, 00:31:22.597 "write_zeroes": true, 00:31:22.597 "zcopy": false, 00:31:22.597 "get_zone_info": false, 00:31:22.597 "zone_management": false, 00:31:22.597 "zone_append": false, 00:31:22.597 "compare": false, 00:31:22.597 "compare_and_write": false, 00:31:22.597 "abort": false, 00:31:22.597 "seek_hole": true, 00:31:22.597 "seek_data": true, 00:31:22.597 "copy": false, 00:31:22.597 "nvme_iov_md": false 00:31:22.597 }, 00:31:22.597 "driver_specific": { 00:31:22.597 "lvol": { 00:31:22.597 "lvol_store_uuid": "c05bb765-9d51-4f2f-9ee0-6bc201fa5d31", 00:31:22.597 "base_bdev": "Nvme0n1", 00:31:22.597 "thin_provision": true, 00:31:22.597 "num_allocated_clusters": 0, 00:31:22.597 "snapshot": false, 00:31:22.597 "clone": false, 00:31:22.597 "esnap_clone": false 00:31:22.597 } 00:31:22.597 } 00:31:22.597 } 00:31:22.597 ] 00:31:22.597 08:44:34 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:22.597 08:44:34 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:31:22.597 08:44:34 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:31:22.891 [2024-07-23 08:44:35.154560] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: 
COMP_lvs0/lv0 00:31:22.891 COMP_lvs0/lv0 00:31:22.891 08:44:35 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:22.891 08:44:35 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:31:22.891 08:44:35 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:22.891 08:44:35 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:22.891 08:44:35 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:22.891 08:44:35 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:22.891 08:44:35 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:22.891 08:44:35 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:23.172 [ 00:31:23.172 { 00:31:23.172 "name": "COMP_lvs0/lv0", 00:31:23.172 "aliases": [ 00:31:23.172 "2f2a93d8-c2b7-5e65-a1a0-b9253d663a92" 00:31:23.172 ], 00:31:23.172 "product_name": "compress", 00:31:23.172 "block_size": 512, 00:31:23.172 "num_blocks": 200704, 00:31:23.172 "uuid": "2f2a93d8-c2b7-5e65-a1a0-b9253d663a92", 00:31:23.172 "assigned_rate_limits": { 00:31:23.172 "rw_ios_per_sec": 0, 00:31:23.172 "rw_mbytes_per_sec": 0, 00:31:23.172 "r_mbytes_per_sec": 0, 00:31:23.172 "w_mbytes_per_sec": 0 00:31:23.172 }, 00:31:23.172 "claimed": false, 00:31:23.172 "zoned": false, 00:31:23.172 "supported_io_types": { 00:31:23.172 "read": true, 00:31:23.172 "write": true, 00:31:23.172 "unmap": false, 00:31:23.172 "flush": false, 00:31:23.172 "reset": false, 00:31:23.172 "nvme_admin": false, 00:31:23.172 "nvme_io": false, 00:31:23.172 "nvme_io_md": false, 00:31:23.172 "write_zeroes": true, 00:31:23.172 "zcopy": false, 00:31:23.172 "get_zone_info": false, 00:31:23.172 "zone_management": false, 00:31:23.172 "zone_append": false, 00:31:23.172 "compare": 
false, 00:31:23.172 "compare_and_write": false, 00:31:23.172 "abort": false, 00:31:23.172 "seek_hole": false, 00:31:23.172 "seek_data": false, 00:31:23.172 "copy": false, 00:31:23.172 "nvme_iov_md": false 00:31:23.172 }, 00:31:23.172 "driver_specific": { 00:31:23.172 "compress": { 00:31:23.172 "name": "COMP_lvs0/lv0", 00:31:23.172 "base_bdev_name": "023098c5-0f22-48bd-88e0-16109fef143d", 00:31:23.172 "pm_path": "/tmp/pmem/0f1ebc41-a64d-46c5-806c-1ed19997bfc2" 00:31:23.172 } 00:31:23.172 } 00:31:23.172 } 00:31:23.172 ] 00:31:23.172 08:44:35 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:23.172 08:44:35 compress_isal -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:31:23.172 I/O targets: 00:31:23.172 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:31:23.172 00:31:23.172 00:31:23.172 CUnit - A unit testing framework for C - Version 2.1-3 00:31:23.172 http://cunit.sourceforge.net/ 00:31:23.172 00:31:23.172 00:31:23.172 Suite: bdevio tests on: COMP_lvs0/lv0 00:31:23.172 Test: blockdev write read block ...passed 00:31:23.172 Test: blockdev write zeroes read block ...passed 00:31:23.172 Test: blockdev write zeroes read no split ...passed 00:31:23.430 Test: blockdev write zeroes read split ...passed 00:31:23.430 Test: blockdev write zeroes read split partial ...passed 00:31:23.430 Test: blockdev reset ...[2024-07-23 08:44:35.774282] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:31:23.430 passed 00:31:23.430 Test: blockdev write read 8 blocks ...passed 00:31:23.430 Test: blockdev write read size > 128k ...passed 00:31:23.430 Test: blockdev write read invalid size ...passed 00:31:23.430 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:31:23.430 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:31:23.430 Test: blockdev write read max offset ...passed 00:31:23.430 Test: blockdev write read 2 
blocks on overlapped address offset ...passed 00:31:23.430 Test: blockdev writev readv 8 blocks ...passed 00:31:23.430 Test: blockdev writev readv 30 x 1block ...passed 00:31:23.430 Test: blockdev writev readv block ...passed 00:31:23.430 Test: blockdev writev readv size > 128k ...passed 00:31:23.430 Test: blockdev writev readv size > 128k in two iovs ...passed 00:31:23.430 Test: blockdev comparev and writev ...passed 00:31:23.430 Test: blockdev nvme passthru rw ...passed 00:31:23.430 Test: blockdev nvme passthru vendor specific ...passed 00:31:23.430 Test: blockdev nvme admin passthru ...passed 00:31:23.430 Test: blockdev copy ...passed 00:31:23.430 00:31:23.430 Run Summary: Type Total Ran Passed Failed Inactive 00:31:23.430 suites 1 1 n/a 0 0 00:31:23.430 tests 23 23 23 0 0 00:31:23.430 asserts 130 130 130 0 n/a 00:31:23.430 00:31:23.430 Elapsed time = 0.443 seconds 00:31:23.430 0 00:31:23.430 08:44:35 compress_isal -- compress/compress.sh@60 -- # destroy_vols 00:31:23.430 08:44:35 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:23.688 08:44:36 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:31:23.946 08:44:36 compress_isal -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:31:23.946 08:44:36 compress_isal -- compress/compress.sh@62 -- # killprocess 1633902 00:31:23.946 08:44:36 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 1633902 ']' 00:31:23.946 08:44:36 compress_isal -- common/autotest_common.sh@952 -- # kill -0 1633902 00:31:23.946 08:44:36 compress_isal -- common/autotest_common.sh@953 -- # uname 00:31:23.946 08:44:36 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:23.946 08:44:36 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1633902 00:31:23.946 08:44:36 compress_isal -- 
common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:23.946 08:44:36 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:23.946 08:44:36 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1633902' 00:31:23.946 killing process with pid 1633902 00:31:23.946 08:44:36 compress_isal -- common/autotest_common.sh@967 -- # kill 1633902 00:31:23.946 08:44:36 compress_isal -- common/autotest_common.sh@972 -- # wait 1633902 00:31:29.207 08:44:41 compress_isal -- compress/compress.sh@91 -- # '[' 1 -eq 1 ']' 00:31:29.207 08:44:41 compress_isal -- compress/compress.sh@92 -- # run_bdevperf 64 16384 30 00:31:29.207 08:44:41 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:31:29.207 08:44:41 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1636193 00:31:29.208 08:44:41 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:29.208 08:44:41 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 64 -o 16384 -w verify -t 30 -C -m 0x6 00:31:29.208 08:44:41 compress_isal -- compress/compress.sh@73 -- # waitforlisten 1636193 00:31:29.208 08:44:41 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 1636193 ']' 00:31:29.208 08:44:41 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:29.208 08:44:41 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:29.208 08:44:41 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:29.208 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:31:29.208 08:44:41 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:29.208 08:44:41 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:31:29.208 [2024-07-23 08:44:41.264723] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:31:29.208 [2024-07-23 08:44:41.264824] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1636193 ] 00:31:29.208 [2024-07-23 08:44:41.389199] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:29.208 [2024-07-23 08:44:41.599935] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:29.208 [2024-07-23 08:44:41.599941] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:29.773 08:44:42 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:29.773 08:44:42 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:31:29.773 08:44:42 compress_isal -- compress/compress.sh@74 -- # create_vols 00:31:29.773 08:44:42 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:29.773 08:44:42 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:33.054 08:44:45 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:33.054 08:44:45 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:31:33.054 08:44:45 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:33.054 08:44:45 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:33.054 08:44:45 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:33.054 08:44:45 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:33.054 
08:44:45 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:33.054 08:44:45 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:33.054 [ 00:31:33.054 { 00:31:33.054 "name": "Nvme0n1", 00:31:33.054 "aliases": [ 00:31:33.054 "b98e1455-8d2e-42c0-8598-37a4c3e66a8d" 00:31:33.054 ], 00:31:33.054 "product_name": "NVMe disk", 00:31:33.054 "block_size": 512, 00:31:33.054 "num_blocks": 7814037168, 00:31:33.054 "uuid": "b98e1455-8d2e-42c0-8598-37a4c3e66a8d", 00:31:33.054 "assigned_rate_limits": { 00:31:33.054 "rw_ios_per_sec": 0, 00:31:33.054 "rw_mbytes_per_sec": 0, 00:31:33.054 "r_mbytes_per_sec": 0, 00:31:33.054 "w_mbytes_per_sec": 0 00:31:33.054 }, 00:31:33.054 "claimed": false, 00:31:33.054 "zoned": false, 00:31:33.054 "supported_io_types": { 00:31:33.054 "read": true, 00:31:33.054 "write": true, 00:31:33.054 "unmap": true, 00:31:33.054 "flush": true, 00:31:33.054 "reset": true, 00:31:33.054 "nvme_admin": true, 00:31:33.054 "nvme_io": true, 00:31:33.054 "nvme_io_md": false, 00:31:33.054 "write_zeroes": true, 00:31:33.054 "zcopy": false, 00:31:33.054 "get_zone_info": false, 00:31:33.054 "zone_management": false, 00:31:33.054 "zone_append": false, 00:31:33.054 "compare": false, 00:31:33.054 "compare_and_write": false, 00:31:33.054 "abort": true, 00:31:33.054 "seek_hole": false, 00:31:33.054 "seek_data": false, 00:31:33.054 "copy": false, 00:31:33.054 "nvme_iov_md": false 00:31:33.054 }, 00:31:33.054 "driver_specific": { 00:31:33.054 "nvme": [ 00:31:33.054 { 00:31:33.054 "pci_address": "0000:60:00.0", 00:31:33.054 "trid": { 00:31:33.054 "trtype": "PCIe", 00:31:33.054 "traddr": "0000:60:00.0" 00:31:33.054 }, 00:31:33.054 "ctrlr_data": { 00:31:33.054 "cntlid": 0, 00:31:33.054 "vendor_id": "0x8086", 00:31:33.054 "model_number": "INTEL SSDPE2KX040T8", 00:31:33.054 "serial_number": 
"BTLJ81850BB64P0DGN", 00:31:33.054 "firmware_revision": "VDV1Y295", 00:31:33.054 "oacs": { 00:31:33.054 "security": 0, 00:31:33.054 "format": 1, 00:31:33.054 "firmware": 1, 00:31:33.054 "ns_manage": 1 00:31:33.054 }, 00:31:33.054 "multi_ctrlr": false, 00:31:33.054 "ana_reporting": false 00:31:33.054 }, 00:31:33.054 "vs": { 00:31:33.054 "nvme_version": "1.2" 00:31:33.054 }, 00:31:33.054 "ns_data": { 00:31:33.054 "id": 1, 00:31:33.054 "can_share": false 00:31:33.054 } 00:31:33.054 } 00:31:33.054 ], 00:31:33.054 "mp_policy": "active_passive" 00:31:33.054 } 00:31:33.054 } 00:31:33.054 ] 00:31:33.054 08:44:45 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:33.054 08:44:45 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:34.954 443f1a13-c566-400b-8ff1-efbef34b41ac 00:31:34.954 08:44:47 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:35.212 e9aebebb-e1ab-4bd0-babb-db3c3232a006 00:31:35.212 08:44:47 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:35.212 08:44:47 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:31:35.212 08:44:47 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:35.212 08:44:47 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:35.212 08:44:47 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:35.212 08:44:47 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:35.212 08:44:47 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:35.212 08:44:47 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 
2000 00:31:35.470 [ 00:31:35.470 { 00:31:35.470 "name": "e9aebebb-e1ab-4bd0-babb-db3c3232a006", 00:31:35.470 "aliases": [ 00:31:35.470 "lvs0/lv0" 00:31:35.470 ], 00:31:35.470 "product_name": "Logical Volume", 00:31:35.470 "block_size": 512, 00:31:35.470 "num_blocks": 204800, 00:31:35.470 "uuid": "e9aebebb-e1ab-4bd0-babb-db3c3232a006", 00:31:35.470 "assigned_rate_limits": { 00:31:35.470 "rw_ios_per_sec": 0, 00:31:35.470 "rw_mbytes_per_sec": 0, 00:31:35.470 "r_mbytes_per_sec": 0, 00:31:35.470 "w_mbytes_per_sec": 0 00:31:35.470 }, 00:31:35.470 "claimed": false, 00:31:35.470 "zoned": false, 00:31:35.470 "supported_io_types": { 00:31:35.470 "read": true, 00:31:35.470 "write": true, 00:31:35.470 "unmap": true, 00:31:35.470 "flush": false, 00:31:35.470 "reset": true, 00:31:35.470 "nvme_admin": false, 00:31:35.470 "nvme_io": false, 00:31:35.470 "nvme_io_md": false, 00:31:35.470 "write_zeroes": true, 00:31:35.470 "zcopy": false, 00:31:35.470 "get_zone_info": false, 00:31:35.470 "zone_management": false, 00:31:35.470 "zone_append": false, 00:31:35.470 "compare": false, 00:31:35.470 "compare_and_write": false, 00:31:35.470 "abort": false, 00:31:35.470 "seek_hole": true, 00:31:35.470 "seek_data": true, 00:31:35.470 "copy": false, 00:31:35.470 "nvme_iov_md": false 00:31:35.470 }, 00:31:35.470 "driver_specific": { 00:31:35.470 "lvol": { 00:31:35.470 "lvol_store_uuid": "443f1a13-c566-400b-8ff1-efbef34b41ac", 00:31:35.470 "base_bdev": "Nvme0n1", 00:31:35.470 "thin_provision": true, 00:31:35.470 "num_allocated_clusters": 0, 00:31:35.470 "snapshot": false, 00:31:35.470 "clone": false, 00:31:35.470 "esnap_clone": false 00:31:35.470 } 00:31:35.470 } 00:31:35.470 } 00:31:35.470 ] 00:31:35.470 08:44:47 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:35.470 08:44:47 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:31:35.470 08:44:47 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:31:35.727 [2024-07-23 08:44:48.032824] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:35.727 COMP_lvs0/lv0 00:31:35.727 08:44:48 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:35.727 08:44:48 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:31:35.727 08:44:48 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:35.727 08:44:48 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:35.727 08:44:48 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:35.727 08:44:48 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:35.727 08:44:48 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:35.727 08:44:48 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:35.985 [ 00:31:35.985 { 00:31:35.985 "name": "COMP_lvs0/lv0", 00:31:35.985 "aliases": [ 00:31:35.985 "032321b5-fec2-5b60-85a5-4e89d7b28e7b" 00:31:35.985 ], 00:31:35.985 "product_name": "compress", 00:31:35.985 "block_size": 512, 00:31:35.985 "num_blocks": 200704, 00:31:35.985 "uuid": "032321b5-fec2-5b60-85a5-4e89d7b28e7b", 00:31:35.985 "assigned_rate_limits": { 00:31:35.985 "rw_ios_per_sec": 0, 00:31:35.985 "rw_mbytes_per_sec": 0, 00:31:35.985 "r_mbytes_per_sec": 0, 00:31:35.985 "w_mbytes_per_sec": 0 00:31:35.985 }, 00:31:35.985 "claimed": false, 00:31:35.985 "zoned": false, 00:31:35.985 "supported_io_types": { 00:31:35.985 "read": true, 00:31:35.985 "write": true, 00:31:35.985 "unmap": false, 00:31:35.985 "flush": false, 00:31:35.985 "reset": false, 00:31:35.985 "nvme_admin": false, 00:31:35.985 "nvme_io": false, 00:31:35.985 "nvme_io_md": false, 00:31:35.985 
"write_zeroes": true, 00:31:35.985 "zcopy": false, 00:31:35.985 "get_zone_info": false, 00:31:35.985 "zone_management": false, 00:31:35.985 "zone_append": false, 00:31:35.985 "compare": false, 00:31:35.985 "compare_and_write": false, 00:31:35.985 "abort": false, 00:31:35.985 "seek_hole": false, 00:31:35.985 "seek_data": false, 00:31:35.985 "copy": false, 00:31:35.985 "nvme_iov_md": false 00:31:35.985 }, 00:31:35.985 "driver_specific": { 00:31:35.985 "compress": { 00:31:35.985 "name": "COMP_lvs0/lv0", 00:31:35.985 "base_bdev_name": "e9aebebb-e1ab-4bd0-babb-db3c3232a006", 00:31:35.985 "pm_path": "/tmp/pmem/4aa700a8-5aad-45d3-aaab-c45e3c5538f6" 00:31:35.985 } 00:31:35.985 } 00:31:35.985 } 00:31:35.985 ] 00:31:35.985 08:44:48 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:35.985 08:44:48 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:36.242 Running I/O for 30 seconds... 00:32:08.319 00:32:08.319 Latency(us) 00:32:08.319 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:08.319 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 64, IO size: 16384) 00:32:08.319 Verification LBA range: start 0x0 length 0xc40 00:32:08.319 COMP_lvs0/lv0 : 30.01 1486.96 23.23 0.00 0.00 42882.20 84.85 40445.07 00:32:08.319 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 64, IO size: 16384) 00:32:08.319 Verification LBA range: start 0xc40 length 0xc40 00:32:08.319 COMP_lvs0/lv0 : 30.01 4635.55 72.43 0.00 0.00 13705.97 604.65 29335.16 00:32:08.319 =================================================================================================================== 00:32:08.319 Total : 6122.52 95.66 0.00 0.00 20792.74 84.85 40445.07 00:32:08.319 0 00:32:08.319 08:45:18 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:32:08.319 08:45:18 compress_isal -- compress/compress.sh@29 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:32:08.319 08:45:18 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:32:08.319 08:45:18 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:32:08.319 08:45:18 compress_isal -- compress/compress.sh@78 -- # killprocess 1636193 00:32:08.319 08:45:18 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 1636193 ']' 00:32:08.319 08:45:18 compress_isal -- common/autotest_common.sh@952 -- # kill -0 1636193 00:32:08.319 08:45:18 compress_isal -- common/autotest_common.sh@953 -- # uname 00:32:08.319 08:45:19 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:08.319 08:45:19 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1636193 00:32:08.319 08:45:19 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:32:08.319 08:45:19 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:32:08.319 08:45:19 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1636193' 00:32:08.319 killing process with pid 1636193 00:32:08.319 08:45:19 compress_isal -- common/autotest_common.sh@967 -- # kill 1636193 00:32:08.319 Received shutdown signal, test time was about 30.000000 seconds 00:32:08.319 00:32:08.319 Latency(us) 00:32:08.319 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:08.319 =================================================================================================================== 00:32:08.319 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:32:08.319 08:45:19 compress_isal -- common/autotest_common.sh@972 -- # wait 1636193 00:32:11.616 08:45:23 compress_isal -- compress/compress.sh@95 -- # export TEST_TRANSPORT=tcp 00:32:11.616 08:45:23 compress_isal -- compress/compress.sh@95 -- # 
TEST_TRANSPORT=tcp 00:32:11.616 08:45:23 compress_isal -- compress/compress.sh@96 -- # NET_TYPE=virt 00:32:11.616 08:45:23 compress_isal -- compress/compress.sh@96 -- # nvmftestinit 00:32:11.616 08:45:23 compress_isal -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:32:11.616 08:45:23 compress_isal -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:32:11.616 08:45:23 compress_isal -- nvmf/common.sh@448 -- # prepare_net_devs 00:32:11.616 08:45:23 compress_isal -- nvmf/common.sh@410 -- # local -g is_hw=no 00:32:11.616 08:45:23 compress_isal -- nvmf/common.sh@412 -- # remove_spdk_ns 00:32:11.616 08:45:23 compress_isal -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:32:11.616 08:45:23 compress_isal -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:32:11.616 08:45:23 compress_isal -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:32:11.616 08:45:23 compress_isal -- nvmf/common.sh@414 -- # [[ virt != virt ]] 00:32:11.616 08:45:23 compress_isal -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:32:11.616 08:45:23 compress_isal -- nvmf/common.sh@423 -- # [[ virt == phy ]] 00:32:11.616 08:45:23 compress_isal -- nvmf/common.sh@426 -- # [[ virt == phy-fallback ]] 00:32:11.616 08:45:23 compress_isal -- nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:32:11.616 08:45:23 compress_isal -- nvmf/common.sh@432 -- # nvmf_veth_init 00:32:11.616 08:45:23 compress_isal -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:32:11.616 08:45:23 compress_isal -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:32:11.616 08:45:23 compress_isal -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:32:11.616 08:45:23 compress_isal -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:32:11.616 08:45:23 compress_isal -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:32:11.616 08:45:23 compress_isal -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:32:11.616 08:45:23 compress_isal -- 
nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:32:11.616 08:45:23 compress_isal -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:32:11.616 08:45:23 compress_isal -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:32:11.616 08:45:23 compress_isal -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:32:11.616 08:45:23 compress_isal -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:32:11.616 08:45:23 compress_isal -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:32:11.616 08:45:23 compress_isal -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:32:11.616 08:45:24 compress_isal -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:32:11.616 Cannot find device "nvmf_tgt_br" 00:32:11.616 08:45:24 compress_isal -- nvmf/common.sh@155 -- # true 00:32:11.616 08:45:24 compress_isal -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:32:11.616 Cannot find device "nvmf_tgt_br2" 00:32:11.616 08:45:24 compress_isal -- nvmf/common.sh@156 -- # true 00:32:11.616 08:45:24 compress_isal -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:32:11.616 08:45:24 compress_isal -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:32:11.616 Cannot find device "nvmf_tgt_br" 00:32:11.616 08:45:24 compress_isal -- nvmf/common.sh@158 -- # true 00:32:11.616 08:45:24 compress_isal -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:32:11.616 Cannot find device "nvmf_tgt_br2" 00:32:11.616 08:45:24 compress_isal -- nvmf/common.sh@159 -- # true 00:32:11.616 08:45:24 compress_isal -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:32:11.877 08:45:24 compress_isal -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:32:11.877 08:45:24 compress_isal -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:32:11.877 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 
00:32:11.877 08:45:24 compress_isal -- nvmf/common.sh@162 -- # true 00:32:11.877 08:45:24 compress_isal -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:32:11.877 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:32:11.877 08:45:24 compress_isal -- nvmf/common.sh@163 -- # true 00:32:11.877 08:45:24 compress_isal -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:32:11.877 08:45:24 compress_isal -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:32:11.877 08:45:24 compress_isal -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:32:11.877 08:45:24 compress_isal -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:32:11.877 08:45:24 compress_isal -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:32:11.877 08:45:24 compress_isal -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:32:11.877 08:45:24 compress_isal -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:32:11.877 08:45:24 compress_isal -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:32:11.877 08:45:24 compress_isal -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:32:11.877 08:45:24 compress_isal -- nvmf/common.sh@183 -- # ip link set nvmf_init_if up 00:32:11.877 08:45:24 compress_isal -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:32:12.136 08:45:24 compress_isal -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:32:12.136 08:45:24 compress_isal -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:32:12.136 08:45:24 compress_isal -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:32:12.136 08:45:24 compress_isal -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 
up 00:32:12.136 08:45:24 compress_isal -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:32:12.136 08:45:24 compress_isal -- nvmf/common.sh@192 -- # ip link add nvmf_br type bridge 00:32:12.136 08:45:24 compress_isal -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:32:12.136 08:45:24 compress_isal -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:32:12.136 08:45:24 compress_isal -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:32:12.136 08:45:24 compress_isal -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:32:12.136 08:45:24 compress_isal -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:32:12.136 08:45:24 compress_isal -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:32:12.136 08:45:24 compress_isal -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:32:12.136 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:32:12.136 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.078 ms 00:32:12.136 00:32:12.136 --- 10.0.0.2 ping statistics --- 00:32:12.136 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:32:12.136 rtt min/avg/max/mdev = 0.078/0.078/0.078/0.000 ms 00:32:12.136 08:45:24 compress_isal -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:32:12.136 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 00:32:12.136 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.072 ms 00:32:12.136 00:32:12.136 --- 10.0.0.3 ping statistics --- 00:32:12.136 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:32:12.136 rtt min/avg/max/mdev = 0.072/0.072/0.072/0.000 ms 00:32:12.136 08:45:24 compress_isal -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:32:12.136 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 
00:32:12.136 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.042 ms 00:32:12.136 00:32:12.136 --- 10.0.0.1 ping statistics --- 00:32:12.136 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:32:12.136 rtt min/avg/max/mdev = 0.042/0.042/0.042/0.000 ms 00:32:12.136 08:45:24 compress_isal -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:32:12.136 08:45:24 compress_isal -- nvmf/common.sh@433 -- # return 0 00:32:12.136 08:45:24 compress_isal -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:32:12.136 08:45:24 compress_isal -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:32:12.136 08:45:24 compress_isal -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:32:12.136 08:45:24 compress_isal -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:32:12.136 08:45:24 compress_isal -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:32:12.136 08:45:24 compress_isal -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:32:12.136 08:45:24 compress_isal -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:32:12.136 08:45:24 compress_isal -- compress/compress.sh@97 -- # nvmfappstart -m 0x7 00:32:12.136 08:45:24 compress_isal -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:32:12.136 08:45:24 compress_isal -- common/autotest_common.sh@722 -- # xtrace_disable 00:32:12.136 08:45:24 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:32:12.136 08:45:24 compress_isal -- nvmf/common.sh@481 -- # nvmfpid=1644276 00:32:12.136 08:45:24 compress_isal -- nvmf/common.sh@482 -- # waitforlisten 1644276 00:32:12.136 08:45:24 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 1644276 ']' 00:32:12.136 08:45:24 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:12.136 08:45:24 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:12.136 08:45:24 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain 
socket /var/tmp/spdk.sock...' 00:32:12.136 08:45:24 compress_isal -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x7 00:32:12.136 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:12.136 08:45:24 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:12.136 08:45:24 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:32:12.395 [2024-07-23 08:45:24.735538] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:32:12.395 [2024-07-23 08:45:24.735631] [ DPDK EAL parameters: nvmf -c 0x7 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:32:12.395 [2024-07-23 08:45:24.863465] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:32:12.654 [2024-07-23 08:45:25.071993] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:32:12.654 [2024-07-23 08:45:25.072038] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:32:12.654 [2024-07-23 08:45:25.072052] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:32:12.654 [2024-07-23 08:45:25.072060] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:32:12.654 [2024-07-23 08:45:25.072073] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
00:32:12.654 [2024-07-23 08:45:25.072140] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:12.654 [2024-07-23 08:45:25.072206] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:12.654 [2024-07-23 08:45:25.072215] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:32:13.221 08:45:25 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:13.221 08:45:25 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:32:13.221 08:45:25 compress_isal -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:32:13.221 08:45:25 compress_isal -- common/autotest_common.sh@728 -- # xtrace_disable 00:32:13.221 08:45:25 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:32:13.221 08:45:25 compress_isal -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:32:13.221 08:45:25 compress_isal -- compress/compress.sh@98 -- # trap 'nvmftestfini; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:32:13.221 08:45:25 compress_isal -- compress/compress.sh@101 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py nvmf_create_transport -t tcp -u 8192 00:32:13.221 [2024-07-23 08:45:25.695465] tcp.c: 677:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:32:13.221 08:45:25 compress_isal -- compress/compress.sh@102 -- # create_vols 00:32:13.221 08:45:25 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:32:13.221 08:45:25 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:32:16.508 08:45:28 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:32:16.509 08:45:28 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:32:16.509 08:45:28 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:16.509 08:45:28 compress_isal -- 
common/autotest_common.sh@899 -- # local i 00:32:16.509 08:45:28 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:16.509 08:45:28 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:16.509 08:45:28 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:16.509 08:45:28 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:32:16.768 [ 00:32:16.768 { 00:32:16.768 "name": "Nvme0n1", 00:32:16.768 "aliases": [ 00:32:16.768 "7a4c8b90-fcd6-4a47-8c58-9aa806b43745" 00:32:16.768 ], 00:32:16.768 "product_name": "NVMe disk", 00:32:16.768 "block_size": 512, 00:32:16.768 "num_blocks": 7814037168, 00:32:16.768 "uuid": "7a4c8b90-fcd6-4a47-8c58-9aa806b43745", 00:32:16.768 "assigned_rate_limits": { 00:32:16.768 "rw_ios_per_sec": 0, 00:32:16.768 "rw_mbytes_per_sec": 0, 00:32:16.768 "r_mbytes_per_sec": 0, 00:32:16.768 "w_mbytes_per_sec": 0 00:32:16.768 }, 00:32:16.768 "claimed": false, 00:32:16.768 "zoned": false, 00:32:16.768 "supported_io_types": { 00:32:16.768 "read": true, 00:32:16.768 "write": true, 00:32:16.768 "unmap": true, 00:32:16.768 "flush": true, 00:32:16.768 "reset": true, 00:32:16.768 "nvme_admin": true, 00:32:16.768 "nvme_io": true, 00:32:16.768 "nvme_io_md": false, 00:32:16.768 "write_zeroes": true, 00:32:16.768 "zcopy": false, 00:32:16.768 "get_zone_info": false, 00:32:16.768 "zone_management": false, 00:32:16.768 "zone_append": false, 00:32:16.768 "compare": false, 00:32:16.768 "compare_and_write": false, 00:32:16.768 "abort": true, 00:32:16.768 "seek_hole": false, 00:32:16.768 "seek_data": false, 00:32:16.768 "copy": false, 00:32:16.768 "nvme_iov_md": false 00:32:16.768 }, 00:32:16.768 "driver_specific": { 00:32:16.768 "nvme": [ 00:32:16.768 { 00:32:16.768 "pci_address": "0000:60:00.0", 00:32:16.768 "trid": { 00:32:16.768 "trtype": "PCIe", 
00:32:16.768 "traddr": "0000:60:00.0" 00:32:16.768 }, 00:32:16.768 "ctrlr_data": { 00:32:16.768 "cntlid": 0, 00:32:16.768 "vendor_id": "0x8086", 00:32:16.768 "model_number": "INTEL SSDPE2KX040T8", 00:32:16.768 "serial_number": "BTLJ81850BB64P0DGN", 00:32:16.768 "firmware_revision": "VDV1Y295", 00:32:16.768 "oacs": { 00:32:16.768 "security": 0, 00:32:16.768 "format": 1, 00:32:16.768 "firmware": 1, 00:32:16.768 "ns_manage": 1 00:32:16.768 }, 00:32:16.768 "multi_ctrlr": false, 00:32:16.768 "ana_reporting": false 00:32:16.768 }, 00:32:16.768 "vs": { 00:32:16.768 "nvme_version": "1.2" 00:32:16.768 }, 00:32:16.768 "ns_data": { 00:32:16.768 "id": 1, 00:32:16.768 "can_share": false 00:32:16.768 } 00:32:16.768 } 00:32:16.768 ], 00:32:16.768 "mp_policy": "active_passive" 00:32:16.768 } 00:32:16.768 } 00:32:16.768 ] 00:32:16.768 08:45:29 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:32:16.768 08:45:29 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:32:18.671 eea9f900-1b20-49b6-8e85-5b6508c5b530 00:32:18.671 08:45:31 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:32:18.930 e5a106ac-c126-428f-80a6-4b63302e83ae 00:32:18.930 08:45:31 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:32:18.930 08:45:31 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:32:18.930 08:45:31 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:18.930 08:45:31 compress_isal -- common/autotest_common.sh@899 -- # local i 00:32:18.930 08:45:31 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:18.930 08:45:31 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:18.930 08:45:31 compress_isal -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:18.930 08:45:31 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:32:19.189 [ 00:32:19.189 { 00:32:19.189 "name": "e5a106ac-c126-428f-80a6-4b63302e83ae", 00:32:19.189 "aliases": [ 00:32:19.189 "lvs0/lv0" 00:32:19.189 ], 00:32:19.189 "product_name": "Logical Volume", 00:32:19.189 "block_size": 512, 00:32:19.189 "num_blocks": 204800, 00:32:19.189 "uuid": "e5a106ac-c126-428f-80a6-4b63302e83ae", 00:32:19.189 "assigned_rate_limits": { 00:32:19.189 "rw_ios_per_sec": 0, 00:32:19.189 "rw_mbytes_per_sec": 0, 00:32:19.189 "r_mbytes_per_sec": 0, 00:32:19.189 "w_mbytes_per_sec": 0 00:32:19.189 }, 00:32:19.189 "claimed": false, 00:32:19.189 "zoned": false, 00:32:19.189 "supported_io_types": { 00:32:19.189 "read": true, 00:32:19.189 "write": true, 00:32:19.189 "unmap": true, 00:32:19.189 "flush": false, 00:32:19.189 "reset": true, 00:32:19.189 "nvme_admin": false, 00:32:19.189 "nvme_io": false, 00:32:19.189 "nvme_io_md": false, 00:32:19.189 "write_zeroes": true, 00:32:19.189 "zcopy": false, 00:32:19.189 "get_zone_info": false, 00:32:19.189 "zone_management": false, 00:32:19.189 "zone_append": false, 00:32:19.189 "compare": false, 00:32:19.189 "compare_and_write": false, 00:32:19.189 "abort": false, 00:32:19.189 "seek_hole": true, 00:32:19.189 "seek_data": true, 00:32:19.189 "copy": false, 00:32:19.189 "nvme_iov_md": false 00:32:19.189 }, 00:32:19.189 "driver_specific": { 00:32:19.189 "lvol": { 00:32:19.189 "lvol_store_uuid": "eea9f900-1b20-49b6-8e85-5b6508c5b530", 00:32:19.189 "base_bdev": "Nvme0n1", 00:32:19.189 "thin_provision": true, 00:32:19.189 "num_allocated_clusters": 0, 00:32:19.189 "snapshot": false, 00:32:19.189 "clone": false, 00:32:19.189 "esnap_clone": false 00:32:19.189 } 00:32:19.189 } 00:32:19.189 } 00:32:19.189 ] 00:32:19.189 08:45:31 compress_isal -- 
common/autotest_common.sh@905 -- # return 0 00:32:19.189 08:45:31 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:32:19.189 08:45:31 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:32:19.448 [2024-07-23 08:45:31.806991] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:32:19.448 COMP_lvs0/lv0 00:32:19.448 08:45:31 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:32:19.449 08:45:31 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:32:19.449 08:45:31 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:19.449 08:45:31 compress_isal -- common/autotest_common.sh@899 -- # local i 00:32:19.449 08:45:31 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:19.449 08:45:31 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:19.449 08:45:31 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:19.708 08:45:32 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:32:19.708 [ 00:32:19.708 { 00:32:19.708 "name": "COMP_lvs0/lv0", 00:32:19.708 "aliases": [ 00:32:19.708 "ed922644-b2d8-5891-8d63-dedf2e19a781" 00:32:19.708 ], 00:32:19.708 "product_name": "compress", 00:32:19.708 "block_size": 512, 00:32:19.708 "num_blocks": 200704, 00:32:19.708 "uuid": "ed922644-b2d8-5891-8d63-dedf2e19a781", 00:32:19.708 "assigned_rate_limits": { 00:32:19.708 "rw_ios_per_sec": 0, 00:32:19.708 "rw_mbytes_per_sec": 0, 00:32:19.708 "r_mbytes_per_sec": 0, 00:32:19.708 "w_mbytes_per_sec": 0 00:32:19.708 }, 00:32:19.708 "claimed": false, 00:32:19.708 "zoned": false, 00:32:19.708 "supported_io_types": { 
00:32:19.708 "read": true, 00:32:19.708 "write": true, 00:32:19.708 "unmap": false, 00:32:19.708 "flush": false, 00:32:19.708 "reset": false, 00:32:19.708 "nvme_admin": false, 00:32:19.708 "nvme_io": false, 00:32:19.708 "nvme_io_md": false, 00:32:19.708 "write_zeroes": true, 00:32:19.708 "zcopy": false, 00:32:19.708 "get_zone_info": false, 00:32:19.708 "zone_management": false, 00:32:19.708 "zone_append": false, 00:32:19.708 "compare": false, 00:32:19.708 "compare_and_write": false, 00:32:19.708 "abort": false, 00:32:19.708 "seek_hole": false, 00:32:19.708 "seek_data": false, 00:32:19.708 "copy": false, 00:32:19.708 "nvme_iov_md": false 00:32:19.708 }, 00:32:19.708 "driver_specific": { 00:32:19.708 "compress": { 00:32:19.708 "name": "COMP_lvs0/lv0", 00:32:19.708 "base_bdev_name": "e5a106ac-c126-428f-80a6-4b63302e83ae", 00:32:19.708 "pm_path": "/tmp/pmem/df54263c-6ccf-4d74-b308-1a90edf77a18" 00:32:19.708 } 00:32:19.708 } 00:32:19.708 } 00:32:19.708 ] 00:32:19.708 08:45:32 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:32:19.708 08:45:32 compress_isal -- compress/compress.sh@103 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py nvmf_create_subsystem nqn.2016-06.io.spdk:cnode0 -a -s SPDK0 00:32:19.967 08:45:32 compress_isal -- compress/compress.sh@104 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode0 COMP_lvs0/lv0 00:32:20.226 08:45:32 compress_isal -- compress/compress.sh@105 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode0 -t tcp -a 10.0.0.2 -s 4420 00:32:20.226 [2024-07-23 08:45:32.666471] tcp.c:1006:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:32:20.226 08:45:32 compress_isal -- compress/compress.sh@109 -- # perf_pid=1645624 00:32:20.226 08:45:32 compress_isal -- compress/compress.sh@112 -- # trap 'killprocess $perf_pid; compress_err_cleanup; exit 1' 
SIGINT SIGTERM EXIT 00:32:20.226 08:45:32 compress_isal -- compress/compress.sh@108 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_nvme_perf -r 'trtype:tcp adrfam:IPv4 traddr:10.0.0.2 trsvcid:4420' -o 4096 -q 64 -s 512 -w randrw -t 30 -c 0x18 -M 50 00:32:20.226 08:45:32 compress_isal -- compress/compress.sh@113 -- # wait 1645624 00:32:20.486 [2024-07-23 08:45:33.001893] subsystem.c:1572:spdk_nvmf_subsystem_listener_allowed: *WARNING*: Allowing connection to discovery subsystem on TCP/10.0.0.2/4420, even though this listener was not added to the discovery subsystem. This behavior is deprecated and will be removed in a future release. 00:32:52.646 Initializing NVMe Controllers 00:32:52.646 Attached to NVMe over Fabrics controller at 10.0.0.2:4420: nqn.2016-06.io.spdk:cnode0 00:32:52.646 Associating TCP (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode0) NSID 1 with lcore 3 00:32:52.646 Associating TCP (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode0) NSID 1 with lcore 4 00:32:52.646 Initialization complete. Launching workers. 
00:32:52.646 ======================================================== 00:32:52.646 Latency(us) 00:32:52.646 Device Information : IOPS MiB/s Average min max 00:32:52.646 TCP (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode0) NSID 1 from core 3: 5754.27 22.48 11123.29 1710.61 26844.55 00:32:52.646 TCP (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode0) NSID 1 from core 4: 3621.53 14.15 17674.52 3297.19 35029.65 00:32:52.646 ======================================================== 00:32:52.646 Total : 9375.80 36.62 13653.79 1710.61 35029.65 00:32:52.646 00:32:52.646 08:46:03 compress_isal -- compress/compress.sh@114 -- # destroy_vols 00:32:52.646 08:46:03 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:32:52.646 08:46:03 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:32:52.646 08:46:03 compress_isal -- compress/compress.sh@116 -- # trap - SIGINT SIGTERM EXIT 00:32:52.646 08:46:03 compress_isal -- compress/compress.sh@117 -- # nvmftestfini 00:32:52.646 08:46:03 compress_isal -- nvmf/common.sh@488 -- # nvmfcleanup 00:32:52.646 08:46:03 compress_isal -- nvmf/common.sh@117 -- # sync 00:32:52.646 08:46:03 compress_isal -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:32:52.646 08:46:03 compress_isal -- nvmf/common.sh@120 -- # set +e 00:32:52.646 08:46:03 compress_isal -- nvmf/common.sh@121 -- # for i in {1..20} 00:32:52.646 08:46:03 compress_isal -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:32:52.646 rmmod nvme_tcp 00:32:52.646 rmmod nvme_fabrics 00:32:52.646 rmmod nvme_keyring 00:32:52.646 08:46:03 compress_isal -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:32:52.646 08:46:03 compress_isal -- nvmf/common.sh@124 -- # set -e 00:32:52.646 08:46:03 compress_isal -- nvmf/common.sh@125 -- # return 0 00:32:52.646 08:46:03 compress_isal -- nvmf/common.sh@489 -- # '[' 
-n 1644276 ']' 00:32:52.646 08:46:03 compress_isal -- nvmf/common.sh@490 -- # killprocess 1644276 00:32:52.646 08:46:03 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 1644276 ']' 00:32:52.646 08:46:03 compress_isal -- common/autotest_common.sh@952 -- # kill -0 1644276 00:32:52.646 08:46:03 compress_isal -- common/autotest_common.sh@953 -- # uname 00:32:52.646 08:46:03 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:52.646 08:46:03 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1644276 00:32:52.646 08:46:03 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:52.646 08:46:03 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:52.646 08:46:03 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1644276' 00:32:52.646 killing process with pid 1644276 00:32:52.646 08:46:03 compress_isal -- common/autotest_common.sh@967 -- # kill 1644276 00:32:52.646 08:46:03 compress_isal -- common/autotest_common.sh@972 -- # wait 1644276 00:32:56.839 08:46:08 compress_isal -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:32:56.839 08:46:08 compress_isal -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:32:56.839 08:46:08 compress_isal -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:32:56.839 08:46:08 compress_isal -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:32:56.839 08:46:08 compress_isal -- nvmf/common.sh@278 -- # remove_spdk_ns 00:32:56.839 08:46:08 compress_isal -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:32:56.839 08:46:08 compress_isal -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:32:56.839 08:46:08 compress_isal -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:32:56.840 08:46:08 compress_isal -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:32:56.840 08:46:08 compress_isal -- compress/compress.sh@120 -- # rm -rf 
/tmp/pmem 00:32:56.840 00:32:56.840 real 2m27.300s 00:32:56.840 user 6m41.697s 00:32:56.840 sys 0m10.968s 00:32:56.840 08:46:08 compress_isal -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:56.840 08:46:08 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:32:56.840 ************************************ 00:32:56.840 END TEST compress_isal 00:32:56.840 ************************************ 00:32:56.840 08:46:08 -- common/autotest_common.sh@1142 -- # return 0 00:32:56.840 08:46:08 -- spdk/autotest.sh@352 -- # '[' 0 -eq 1 ']' 00:32:56.840 08:46:08 -- spdk/autotest.sh@356 -- # '[' 1 -eq 1 ']' 00:32:56.840 08:46:08 -- spdk/autotest.sh@357 -- # run_test blockdev_crypto_aesni /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:32:56.840 08:46:08 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:32:56.840 08:46:08 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:56.840 08:46:08 -- common/autotest_common.sh@10 -- # set +x 00:32:56.840 ************************************ 00:32:56.840 START TEST blockdev_crypto_aesni 00:32:56.840 ************************************ 00:32:56.840 08:46:08 blockdev_crypto_aesni -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:32:56.840 * Looking for test storage... 
00:32:56.840 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:32:56.840 08:46:08 blockdev_crypto_aesni -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:32:56.840 08:46:08 blockdev_crypto_aesni -- bdev/nbd_common.sh@6 -- # set -e 00:32:56.840 08:46:08 blockdev_crypto_aesni -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:32:56.840 08:46:08 blockdev_crypto_aesni -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:32:56.840 08:46:08 blockdev_crypto_aesni -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:32:56.840 08:46:08 blockdev_crypto_aesni -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:32:56.840 08:46:08 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:32:56.840 08:46:08 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:32:56.840 08:46:08 blockdev_crypto_aesni -- bdev/blockdev.sh@20 -- # : 00:32:56.840 08:46:08 blockdev_crypto_aesni -- bdev/blockdev.sh@669 -- # QOS_DEV_1=Malloc_0 00:32:56.840 08:46:08 blockdev_crypto_aesni -- bdev/blockdev.sh@670 -- # QOS_DEV_2=Null_1 00:32:56.840 08:46:08 blockdev_crypto_aesni -- bdev/blockdev.sh@671 -- # QOS_RUN_TIME=5 00:32:56.840 08:46:08 blockdev_crypto_aesni -- bdev/blockdev.sh@673 -- # uname -s 00:32:56.840 08:46:08 blockdev_crypto_aesni -- bdev/blockdev.sh@673 -- # '[' Linux = Linux ']' 00:32:56.840 08:46:08 blockdev_crypto_aesni -- bdev/blockdev.sh@675 -- # PRE_RESERVED_MEM=0 00:32:56.840 08:46:08 blockdev_crypto_aesni -- bdev/blockdev.sh@681 -- # test_type=crypto_aesni 00:32:56.840 08:46:08 blockdev_crypto_aesni -- bdev/blockdev.sh@682 -- # crypto_device= 00:32:56.840 08:46:08 blockdev_crypto_aesni -- bdev/blockdev.sh@683 -- # dek= 00:32:56.840 08:46:08 
blockdev_crypto_aesni -- bdev/blockdev.sh@684 -- # env_ctx= 00:32:56.840 08:46:08 blockdev_crypto_aesni -- bdev/blockdev.sh@685 -- # wait_for_rpc= 00:32:56.840 08:46:08 blockdev_crypto_aesni -- bdev/blockdev.sh@686 -- # '[' -n '' ']' 00:32:56.840 08:46:08 blockdev_crypto_aesni -- bdev/blockdev.sh@689 -- # [[ crypto_aesni == bdev ]] 00:32:56.840 08:46:08 blockdev_crypto_aesni -- bdev/blockdev.sh@689 -- # [[ crypto_aesni == crypto_* ]] 00:32:56.840 08:46:08 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # wait_for_rpc=--wait-for-rpc 00:32:56.840 08:46:08 blockdev_crypto_aesni -- bdev/blockdev.sh@692 -- # start_spdk_tgt 00:32:56.840 08:46:08 blockdev_crypto_aesni -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1652183 00:32:56.840 08:46:08 blockdev_crypto_aesni -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:32:56.840 08:46:08 blockdev_crypto_aesni -- bdev/blockdev.sh@49 -- # waitforlisten 1652183 00:32:56.840 08:46:08 blockdev_crypto_aesni -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:32:56.840 08:46:08 blockdev_crypto_aesni -- common/autotest_common.sh@829 -- # '[' -z 1652183 ']' 00:32:56.840 08:46:08 blockdev_crypto_aesni -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:56.840 08:46:08 blockdev_crypto_aesni -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:56.840 08:46:08 blockdev_crypto_aesni -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:56.840 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:32:56.840 08:46:08 blockdev_crypto_aesni -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:56.840 08:46:08 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:56.840 [2024-07-23 08:46:08.866483] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:32:56.840 [2024-07-23 08:46:08.866582] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1652183 ] 00:32:56.840 [2024-07-23 08:46:08.989594] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:56.840 [2024-07-23 08:46:09.215044] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:57.409 08:46:09 blockdev_crypto_aesni -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:57.409 08:46:09 blockdev_crypto_aesni -- common/autotest_common.sh@862 -- # return 0 00:32:57.409 08:46:09 blockdev_crypto_aesni -- bdev/blockdev.sh@693 -- # case "$test_type" in 00:32:57.409 08:46:09 blockdev_crypto_aesni -- bdev/blockdev.sh@704 -- # setup_crypto_aesni_conf 00:32:57.409 08:46:09 blockdev_crypto_aesni -- bdev/blockdev.sh@145 -- # rpc_cmd 00:32:57.409 08:46:09 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:57.409 08:46:09 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:57.409 [2024-07-23 08:46:09.636641] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:32:57.409 [2024-07-23 08:46:09.644681] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:57.409 [2024-07-23 08:46:09.652688] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:57.409 [2024-07-23 08:46:09.919894] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: 
*NOTICE*: Found crypto devices: 97 00:33:00.699 true 00:33:00.699 true 00:33:00.699 true 00:33:00.699 true 00:33:00.959 Malloc0 00:33:00.959 Malloc1 00:33:00.959 Malloc2 00:33:00.959 Malloc3 00:33:00.959 [2024-07-23 08:46:13.363933] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:33:00.959 crypto_ram 00:33:00.959 [2024-07-23 08:46:13.371938] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:33:00.959 crypto_ram2 00:33:00.959 [2024-07-23 08:46:13.379947] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:33:00.959 crypto_ram3 00:33:00.959 [2024-07-23 08:46:13.387975] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:33:00.959 crypto_ram4 00:33:00.959 08:46:13 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:00.959 08:46:13 blockdev_crypto_aesni -- bdev/blockdev.sh@736 -- # rpc_cmd bdev_wait_for_examine 00:33:00.959 08:46:13 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:00.959 08:46:13 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:33:00.959 08:46:13 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:00.959 08:46:13 blockdev_crypto_aesni -- bdev/blockdev.sh@739 -- # cat 00:33:00.959 08:46:13 blockdev_crypto_aesni -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n accel 00:33:00.959 08:46:13 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:00.959 08:46:13 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:33:00.959 08:46:13 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:00.959 08:46:13 blockdev_crypto_aesni -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n bdev 00:33:00.959 08:46:13 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:00.959 08:46:13 
blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:33:00.959 08:46:13 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:00.959 08:46:13 blockdev_crypto_aesni -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n iobuf 00:33:00.959 08:46:13 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:00.959 08:46:13 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:33:00.959 08:46:13 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:00.959 08:46:13 blockdev_crypto_aesni -- bdev/blockdev.sh@747 -- # mapfile -t bdevs 00:33:00.959 08:46:13 blockdev_crypto_aesni -- bdev/blockdev.sh@747 -- # rpc_cmd bdev_get_bdevs 00:33:00.959 08:46:13 blockdev_crypto_aesni -- bdev/blockdev.sh@747 -- # jq -r '.[] | select(.claimed == false)' 00:33:00.959 08:46:13 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:00.959 08:46:13 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:33:01.218 08:46:13 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:01.218 08:46:13 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # mapfile -t bdevs_name 00:33:01.218 08:46:13 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "7de9bad8-45a9-5247-8040-c82e239f95bb"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "7de9bad8-45a9-5247-8040-c82e239f95bb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' 
"zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "6293bd3c-ce9e-5abf-b0d8-117169017913"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "6293bd3c-ce9e-5abf-b0d8-117169017913",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "daee6235-120f-5395-b90e-b70f31bd65fa"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "daee6235-120f-5395-b90e-b70f31bd65fa",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' 
"read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "a9c5aa43-a972-5c38-b1c7-e65b7a271e98"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "a9c5aa43-a972-5c38-b1c7-e65b7a271e98",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:33:01.218 08:46:13 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # jq -r .name 00:33:01.218 08:46:13 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # 
bdev_list=("${bdevs_name[@]}") 00:33:01.218 08:46:13 blockdev_crypto_aesni -- bdev/blockdev.sh@751 -- # hello_world_bdev=crypto_ram 00:33:01.218 08:46:13 blockdev_crypto_aesni -- bdev/blockdev.sh@752 -- # trap - SIGINT SIGTERM EXIT 00:33:01.218 08:46:13 blockdev_crypto_aesni -- bdev/blockdev.sh@753 -- # killprocess 1652183 00:33:01.218 08:46:13 blockdev_crypto_aesni -- common/autotest_common.sh@948 -- # '[' -z 1652183 ']' 00:33:01.218 08:46:13 blockdev_crypto_aesni -- common/autotest_common.sh@952 -- # kill -0 1652183 00:33:01.218 08:46:13 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # uname 00:33:01.218 08:46:13 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:01.218 08:46:13 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1652183 00:33:01.218 08:46:13 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:33:01.218 08:46:13 blockdev_crypto_aesni -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:33:01.218 08:46:13 blockdev_crypto_aesni -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1652183' 00:33:01.218 killing process with pid 1652183 00:33:01.218 08:46:13 blockdev_crypto_aesni -- common/autotest_common.sh@967 -- # kill 1652183 00:33:01.218 08:46:13 blockdev_crypto_aesni -- common/autotest_common.sh@972 -- # wait 1652183 00:33:05.410 08:46:17 blockdev_crypto_aesni -- bdev/blockdev.sh@757 -- # trap cleanup SIGINT SIGTERM EXIT 00:33:05.410 08:46:17 blockdev_crypto_aesni -- bdev/blockdev.sh@759 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:33:05.410 08:46:17 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:33:05.410 08:46:17 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:05.410 08:46:17 
blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:33:05.410 ************************************ 00:33:05.410 START TEST bdev_hello_world 00:33:05.410 ************************************ 00:33:05.410 08:46:17 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:33:05.410 [2024-07-23 08:46:17.294036] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:33:05.410 [2024-07-23 08:46:17.294117] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1653706 ] 00:33:05.410 [2024-07-23 08:46:17.411386] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:05.410 [2024-07-23 08:46:17.617708] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:05.410 [2024-07-23 08:46:17.638933] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:33:05.410 [2024-07-23 08:46:17.646961] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:05.410 [2024-07-23 08:46:17.654971] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:05.669 [2024-07-23 08:46:17.968898] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:33:08.205 [2024-07-23 08:46:20.711195] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:33:08.206 [2024-07-23 08:46:20.711265] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:08.206 [2024-07-23 08:46:20.711278] vbdev_crypto.c: 
617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:08.206 [2024-07-23 08:46:20.719213] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:33:08.206 [2024-07-23 08:46:20.719248] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:08.206 [2024-07-23 08:46:20.719258] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:08.466 [2024-07-23 08:46:20.727228] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:33:08.466 [2024-07-23 08:46:20.727272] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:33:08.466 [2024-07-23 08:46:20.727283] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:08.466 [2024-07-23 08:46:20.735249] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:33:08.466 [2024-07-23 08:46:20.735278] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:33:08.466 [2024-07-23 08:46:20.735287] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:08.466 [2024-07-23 08:46:20.943406] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:33:08.466 [2024-07-23 08:46:20.943449] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:33:08.466 [2024-07-23 08:46:20.943466] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:33:08.466 [2024-07-23 08:46:20.945097] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:33:08.466 [2024-07-23 08:46:20.945210] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:33:08.466 [2024-07-23 08:46:20.945226] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:33:08.466 [2024-07-23 08:46:20.945279] hello_bdev.c: 
65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:33:08.466 00:33:08.466 [2024-07-23 08:46:20.945297] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:33:10.999 00:33:10.999 real 0m6.115s 00:33:10.999 user 0m5.591s 00:33:10.999 sys 0m0.466s 00:33:10.999 08:46:23 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:10.999 08:46:23 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:33:10.999 ************************************ 00:33:10.999 END TEST bdev_hello_world 00:33:10.999 ************************************ 00:33:10.999 08:46:23 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:33:10.999 08:46:23 blockdev_crypto_aesni -- bdev/blockdev.sh@760 -- # run_test bdev_bounds bdev_bounds '' 00:33:10.999 08:46:23 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:33:10.999 08:46:23 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:10.999 08:46:23 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:33:10.999 ************************************ 00:33:10.999 START TEST bdev_bounds 00:33:10.999 ************************************ 00:33:10.999 08:46:23 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:33:10.999 08:46:23 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=1654740 00:33:10.999 08:46:23 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:33:10.999 08:46:23 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@288 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:33:10.999 08:46:23 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 1654740' 00:33:10.999 Process 
bdevio pid: 1654740 00:33:10.999 08:46:23 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@292 -- # waitforlisten 1654740 00:33:10.999 08:46:23 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 1654740 ']' 00:33:10.999 08:46:23 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:10.999 08:46:23 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:10.999 08:46:23 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:10.999 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:10.999 08:46:23 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:10.999 08:46:23 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:33:10.999 [2024-07-23 08:46:23.486504] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:33:10.999 [2024-07-23 08:46:23.486598] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1654740 ] 00:33:11.258 [2024-07-23 08:46:23.616501] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:33:11.517 [2024-07-23 08:46:23.848747] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:11.517 [2024-07-23 08:46:23.848815] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:11.517 [2024-07-23 08:46:23.848820] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:33:11.517 [2024-07-23 08:46:23.870131] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:33:11.517 [2024-07-23 08:46:23.878152] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:11.517 [2024-07-23 08:46:23.886177] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:11.776 [2024-07-23 08:46:24.203126] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:33:15.103 [2024-07-23 08:46:26.921298] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:33:15.103 [2024-07-23 08:46:26.921362] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:15.103 [2024-07-23 08:46:26.921391] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:15.103 [2024-07-23 08:46:26.929317] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:33:15.103 [2024-07-23 08:46:26.929351] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:15.103 [2024-07-23 
08:46:26.929361] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:15.103 [2024-07-23 08:46:26.937341] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:33:15.103 [2024-07-23 08:46:26.937366] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:33:15.103 [2024-07-23 08:46:26.937374] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:15.103 [2024-07-23 08:46:26.945360] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:33:15.103 [2024-07-23 08:46:26.945389] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:33:15.103 [2024-07-23 08:46:26.945397] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:15.362 08:46:27 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:15.362 08:46:27 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:33:15.362 08:46:27 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:33:15.362 I/O targets: 00:33:15.362 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:33:15.362 crypto_ram2: 65536 blocks of 512 bytes (32 MiB) 00:33:15.362 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:33:15.362 crypto_ram4: 8192 blocks of 4096 bytes (32 MiB) 00:33:15.362 00:33:15.362 00:33:15.362 CUnit - A unit testing framework for C - Version 2.1-3 00:33:15.362 http://cunit.sourceforge.net/ 00:33:15.362 00:33:15.362 00:33:15.362 Suite: bdevio tests on: crypto_ram4 00:33:15.362 Test: blockdev write read block ...passed 00:33:15.362 Test: blockdev write zeroes read block ...passed 00:33:15.362 Test: blockdev write zeroes read no split ...passed 00:33:15.362 Test: blockdev 
write zeroes read split ...passed 00:33:15.362 Test: blockdev write zeroes read split partial ...passed 00:33:15.362 Test: blockdev reset ...passed 00:33:15.362 Test: blockdev write read 8 blocks ...passed 00:33:15.362 Test: blockdev write read size > 128k ...passed 00:33:15.362 Test: blockdev write read invalid size ...passed 00:33:15.362 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:33:15.362 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:33:15.362 Test: blockdev write read max offset ...passed 00:33:15.362 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:33:15.362 Test: blockdev writev readv 8 blocks ...passed 00:33:15.362 Test: blockdev writev readv 30 x 1block ...passed 00:33:15.362 Test: blockdev writev readv block ...passed 00:33:15.362 Test: blockdev writev readv size > 128k ...passed 00:33:15.362 Test: blockdev writev readv size > 128k in two iovs ...passed 00:33:15.362 Test: blockdev comparev and writev ...passed 00:33:15.362 Test: blockdev nvme passthru rw ...passed 00:33:15.362 Test: blockdev nvme passthru vendor specific ...passed 00:33:15.362 Test: blockdev nvme admin passthru ...passed 00:33:15.362 Test: blockdev copy ...passed 00:33:15.362 Suite: bdevio tests on: crypto_ram3 00:33:15.362 Test: blockdev write read block ...passed 00:33:15.362 Test: blockdev write zeroes read block ...passed 00:33:15.362 Test: blockdev write zeroes read no split ...passed 00:33:15.362 Test: blockdev write zeroes read split ...passed 00:33:15.362 Test: blockdev write zeroes read split partial ...passed 00:33:15.362 Test: blockdev reset ...passed 00:33:15.362 Test: blockdev write read 8 blocks ...passed 00:33:15.622 Test: blockdev write read size > 128k ...passed 00:33:15.622 Test: blockdev write read invalid size ...passed 00:33:15.622 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:33:15.622 Test: blockdev write read offset + nbytes > size of blockdev 
...passed 00:33:15.622 Test: blockdev write read max offset ...passed 00:33:15.622 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:33:15.622 Test: blockdev writev readv 8 blocks ...passed 00:33:15.622 Test: blockdev writev readv 30 x 1block ...passed 00:33:15.622 Test: blockdev writev readv block ...passed 00:33:15.622 Test: blockdev writev readv size > 128k ...passed 00:33:15.622 Test: blockdev writev readv size > 128k in two iovs ...passed 00:33:15.622 Test: blockdev comparev and writev ...passed 00:33:15.622 Test: blockdev nvme passthru rw ...passed 00:33:15.622 Test: blockdev nvme passthru vendor specific ...passed 00:33:15.622 Test: blockdev nvme admin passthru ...passed 00:33:15.622 Test: blockdev copy ...passed 00:33:15.622 Suite: bdevio tests on: crypto_ram2 00:33:15.622 Test: blockdev write read block ...passed 00:33:15.622 Test: blockdev write zeroes read block ...passed 00:33:15.622 Test: blockdev write zeroes read no split ...passed 00:33:15.622 Test: blockdev write zeroes read split ...passed 00:33:15.622 Test: blockdev write zeroes read split partial ...passed 00:33:15.622 Test: blockdev reset ...passed 00:33:15.622 Test: blockdev write read 8 blocks ...passed 00:33:15.622 Test: blockdev write read size > 128k ...passed 00:33:15.622 Test: blockdev write read invalid size ...passed 00:33:15.622 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:33:15.622 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:33:15.622 Test: blockdev write read max offset ...passed 00:33:15.622 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:33:15.622 Test: blockdev writev readv 8 blocks ...passed 00:33:15.622 Test: blockdev writev readv 30 x 1block ...passed 00:33:15.622 Test: blockdev writev readv block ...passed 00:33:15.622 Test: blockdev writev readv size > 128k ...passed 00:33:15.622 Test: blockdev writev readv size > 128k in two iovs ...passed 00:33:15.622 Test: 
blockdev comparev and writev ...passed 00:33:15.622 Test: blockdev nvme passthru rw ...passed 00:33:15.622 Test: blockdev nvme passthru vendor specific ...passed 00:33:15.622 Test: blockdev nvme admin passthru ...passed 00:33:15.622 Test: blockdev copy ...passed 00:33:15.622 Suite: bdevio tests on: crypto_ram 00:33:15.622 Test: blockdev write read block ...passed 00:33:15.622 Test: blockdev write zeroes read block ...passed 00:33:15.622 Test: blockdev write zeroes read no split ...passed 00:33:15.882 Test: blockdev write zeroes read split ...passed 00:33:15.882 Test: blockdev write zeroes read split partial ...passed 00:33:15.882 Test: blockdev reset ...passed 00:33:15.882 Test: blockdev write read 8 blocks ...passed 00:33:15.882 Test: blockdev write read size > 128k ...passed 00:33:15.882 Test: blockdev write read invalid size ...passed 00:33:15.882 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:33:15.882 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:33:15.882 Test: blockdev write read max offset ...passed 00:33:15.882 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:33:15.882 Test: blockdev writev readv 8 blocks ...passed 00:33:15.882 Test: blockdev writev readv 30 x 1block ...passed 00:33:15.882 Test: blockdev writev readv block ...passed 00:33:15.882 Test: blockdev writev readv size > 128k ...passed 00:33:15.882 Test: blockdev writev readv size > 128k in two iovs ...passed 00:33:15.882 Test: blockdev comparev and writev ...passed 00:33:15.882 Test: blockdev nvme passthru rw ...passed 00:33:15.882 Test: blockdev nvme passthru vendor specific ...passed 00:33:15.882 Test: blockdev nvme admin passthru ...passed 00:33:15.882 Test: blockdev copy ...passed 00:33:15.882 00:33:15.882 Run Summary: Type Total Ran Passed Failed Inactive 00:33:15.882 suites 4 4 n/a 0 0 00:33:15.882 tests 92 92 92 0 0 00:33:15.882 asserts 520 520 520 0 n/a 00:33:15.882 00:33:15.882 Elapsed time = 1.427 
seconds 00:33:15.882 0 00:33:15.882 08:46:28 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 1654740 00:33:15.882 08:46:28 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 1654740 ']' 00:33:15.882 08:46:28 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 1654740 00:33:15.882 08:46:28 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:33:15.882 08:46:28 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:15.882 08:46:28 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1654740 00:33:15.882 08:46:28 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:33:15.882 08:46:28 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:33:15.882 08:46:28 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1654740' 00:33:15.882 killing process with pid 1654740 00:33:15.882 08:46:28 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@967 -- # kill 1654740 00:33:15.882 08:46:28 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@972 -- # wait 1654740 00:33:18.419 08:46:30 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT 00:33:18.419 00:33:18.419 real 0m7.278s 00:33:18.419 user 0m20.009s 00:33:18.419 sys 0m0.654s 00:33:18.419 08:46:30 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:18.419 08:46:30 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:33:18.419 ************************************ 00:33:18.419 END TEST bdev_bounds 00:33:18.419 ************************************ 00:33:18.419 08:46:30 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:33:18.419 08:46:30 
blockdev_crypto_aesni -- bdev/blockdev.sh@761 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:33:18.419 08:46:30 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:33:18.419 08:46:30 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:18.419 08:46:30 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:33:18.419 ************************************ 00:33:18.419 START TEST bdev_nbd 00:33:18.419 ************************************ 00:33:18.419 08:46:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:33:18.419 08:46:30 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@299 -- # uname -s 00:33:18.419 08:46:30 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux ]] 00:33:18.419 08:46:30 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:18.419 08:46:30 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@302 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:33:18.419 08:46:30 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:33:18.419 08:46:30 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all 00:33:18.419 08:46:30 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=4 00:33:18.419 08:46:30 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]] 00:33:18.419 08:46:30 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' 
'/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:33:18.419 08:46:30 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all 00:33:18.419 08:46:30 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=4 00:33:18.419 08:46:30 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:18.419 08:46:30 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list 00:33:18.419 08:46:30 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:33:18.419 08:46:30 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list 00:33:18.419 08:46:30 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=1656169 00:33:18.419 08:46:30 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:33:18.419 08:46:30 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:33:18.419 08:46:30 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 1656169 /var/tmp/spdk-nbd.sock 00:33:18.419 08:46:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 1656169 ']' 00:33:18.419 08:46:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:33:18.419 08:46:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:18.419 08:46:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
00:33:18.419 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:33:18.419 08:46:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:18.419 08:46:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:33:18.419 [2024-07-23 08:46:30.836237] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:33:18.419 [2024-07-23 08:46:30.836326] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:33:18.679 [2024-07-23 08:46:30.959910] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:18.679 [2024-07-23 08:46:31.166311] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:18.679 [2024-07-23 08:46:31.187524] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:33:18.679 [2024-07-23 08:46:31.195556] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:18.938 [2024-07-23 08:46:31.203566] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:19.197 [2024-07-23 08:46:31.499702] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:33:21.731 [2024-07-23 08:46:34.223593] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:33:21.731 [2024-07-23 08:46:34.223677] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:21.731 [2024-07-23 08:46:34.223692] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:21.731 [2024-07-23 08:46:34.231617] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: 
Found key "test_dek_aesni_cbc_2" 00:33:21.731 [2024-07-23 08:46:34.231646] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:21.731 [2024-07-23 08:46:34.231656] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:21.731 [2024-07-23 08:46:34.239629] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:33:21.731 [2024-07-23 08:46:34.239668] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:33:21.731 [2024-07-23 08:46:34.239676] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:21.731 [2024-07-23 08:46:34.247656] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:33:21.731 [2024-07-23 08:46:34.247680] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:33:21.731 [2024-07-23 08:46:34.247689] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:22.668 08:46:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:22.668 08:46:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:33:22.668 08:46:34 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:33:22.668 08:46:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:22.668 08:46:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:33:22.668 08:46:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:33:22.668 08:46:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx 
/var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:33:22.668 08:46:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:22.668 08:46:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:33:22.668 08:46:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:33:22.668 08:46:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:33:22.668 08:46:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:33:22.668 08:46:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:33:22.668 08:46:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:33:22.668 08:46:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:33:22.668 08:46:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:33:22.668 08:46:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:33:22.668 08:46:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:33:22.668 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:33:22.668 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:22.927 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:22.927 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:22.927 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:33:22.927 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:22.927 08:46:35 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:22.927 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:22.927 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:22.927 1+0 records in 00:33:22.927 1+0 records out 00:33:22.927 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000278867 s, 14.7 MB/s 00:33:22.927 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:22.927 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:22.927 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:22.927 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:22.927 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:22.927 08:46:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:22.927 08:46:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:33:22.927 08:46:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:33:22.928 08:46:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:33:22.928 08:46:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:33:22.928 08:46:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:33:22.928 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:33:22.928 08:46:35 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:22.928 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:22.928 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:22.928 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:33:22.928 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:22.928 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:22.928 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:22.928 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:22.928 1+0 records in 00:33:22.928 1+0 records out 00:33:22.928 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000249618 s, 16.4 MB/s 00:33:22.928 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:22.928 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:22.928 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:22.928 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:22.928 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:22.928 08:46:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:22.928 08:46:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:33:22.928 08:46:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:33:23.187 08:46:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:33:23.187 08:46:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:33:23.187 08:46:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:33:23.187 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:33:23.187 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:23.187 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:23.187 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:23.187 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:33:23.187 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:23.187 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:23.187 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:23.187 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:23.187 1+0 records in 00:33:23.187 1+0 records out 00:33:23.187 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000259708 s, 15.8 MB/s 00:33:23.187 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:23.187 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:23.187 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:23.187 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:23.187 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:23.187 08:46:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:23.187 08:46:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:33:23.187 08:46:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 00:33:23.447 08:46:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:33:23.447 08:46:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:33:23.447 08:46:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:33:23.447 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:33:23.447 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:23.447 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:23.447 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:23.447 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:33:23.447 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:23.447 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:23.447 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:23.447 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 
count=1 iflag=direct 00:33:23.447 1+0 records in 00:33:23.447 1+0 records out 00:33:23.447 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000270729 s, 15.1 MB/s 00:33:23.447 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:23.447 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:23.447 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:23.447 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:23.447 08:46:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:23.447 08:46:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:23.447 08:46:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:33:23.447 08:46:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:23.709 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:33:23.709 { 00:33:23.709 "nbd_device": "/dev/nbd0", 00:33:23.709 "bdev_name": "crypto_ram" 00:33:23.709 }, 00:33:23.709 { 00:33:23.709 "nbd_device": "/dev/nbd1", 00:33:23.709 "bdev_name": "crypto_ram2" 00:33:23.709 }, 00:33:23.709 { 00:33:23.709 "nbd_device": "/dev/nbd2", 00:33:23.709 "bdev_name": "crypto_ram3" 00:33:23.709 }, 00:33:23.709 { 00:33:23.709 "nbd_device": "/dev/nbd3", 00:33:23.709 "bdev_name": "crypto_ram4" 00:33:23.709 } 00:33:23.709 ]' 00:33:23.709 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:33:23.709 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:33:23.709 { 
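The `waitfornbd` calls above follow one pattern: after `nbd_start_disk`, poll `/proc/partitions` (up to 20 tries) for the device name, then prove the device answers I/O with a single direct 4 KiB `dd` read whose size is checked to be non-zero. A minimal sketch of that readiness check — the partitions-file argument and the `mktemp` scratch file are simplifications added here for illustration; the real helper always reads `/proc/partitions` and uses a fixed `nbdtest` path in the workspace:

```shell
#!/bin/sh
# Readiness check run after each nbd_start_disk RPC (sketch).
waitfornbd() {
    nbd_name=$1
    partitions=${2:-/proc/partitions}   # test hook; real helper hardcodes /proc/partitions
    i=1
    while [ "$i" -le 20 ]; do
        # Stop polling as soon as the kernel lists the device.
        if grep -q -w "$nbd_name" "$partitions"; then
            break
        fi
        sleep 0.1
        i=$((i + 1))
    done
    grep -q -w "$nbd_name" "$partitions" || return 1
    # One direct read proves the block device actually serves I/O.
    scratch=$(mktemp)
    if ! dd if="/dev/$nbd_name" of="$scratch" bs=4096 count=1 iflag=direct 2>/dev/null; then
        rm -f "$scratch"
        return 1
    fi
    size=$(stat -c %s "$scratch")
    rm -f "$scratch"
    [ "$size" -ne 0 ]
}
```

The two-phase check matters because the device node can appear in `/proc/partitions` slightly before the nbd connection is fully serviceable; the direct read bypasses the page cache and exercises the real transport.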
00:33:23.709 "nbd_device": "/dev/nbd0", 00:33:23.709 "bdev_name": "crypto_ram" 00:33:23.709 }, 00:33:23.709 { 00:33:23.709 "nbd_device": "/dev/nbd1", 00:33:23.709 "bdev_name": "crypto_ram2" 00:33:23.709 }, 00:33:23.709 { 00:33:23.709 "nbd_device": "/dev/nbd2", 00:33:23.709 "bdev_name": "crypto_ram3" 00:33:23.709 }, 00:33:23.709 { 00:33:23.709 "nbd_device": "/dev/nbd3", 00:33:23.709 "bdev_name": "crypto_ram4" 00:33:23.709 } 00:33:23.709 ]' 00:33:23.709 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:33:23.709 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:33:23.709 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:23.709 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:33:23.709 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:23.709 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:23.709 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:23.709 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:23.968 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:23.968 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:23.968 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:23.968 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:23.968 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:23.968 08:46:36 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:23.968 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:23.968 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:23.968 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:23.968 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:33:23.968 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:33:23.968 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:33:23.968 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:33:23.968 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:23.968 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:23.968 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:33:23.968 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:23.968 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:23.968 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:23.968 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:33:24.228 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:33:24.228 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:33:24.228 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 
00:33:24.228 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:24.228 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:24.228 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:33:24.228 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:24.228 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:24.228 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:24.228 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:33:24.487 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:33:24.487 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:33:24.487 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:33:24.487 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:24.487 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:24.487 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:33:24.487 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:24.487 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:24.487 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:24.487 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:24.487 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
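Teardown mirrors setup: each `nbd_stop_disk` RPC above is followed by `waitfornbd_exit`, which polls until the name *disappears* from `/proc/partitions`, confirming the unmap completed. A sketch under the same assumption (partitions path parameterized only so the loop can be exercised without a live device):

```shell
#!/bin/sh
# Wait (up to 20 polls) for an nbd device to vanish after nbd_stop_disk (sketch).
waitfornbd_exit() {
    nbd_name=$1
    partitions=${2:-/proc/partitions}   # test hook; real helper reads /proc/partitions
    i=1
    while [ "$i" -le 20 ]; do
        # Success means the device is no longer listed.
        if ! grep -q -w "$nbd_name" "$partitions"; then
            return 0
        fi
        sleep 0.1
        i=$((i + 1))
    done
    return 1    # still present after the timeout
}
```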
/var/tmp/spdk-nbd.sock nbd_get_disks 00:33:24.487 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:33:24.487 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:33:24.487 08:46:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:24.746 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:33:24.746 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:33:24.746 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:33:24.746 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:33:24.746 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:33:24.746 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:33:24.746 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:33:24.746 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:33:24.746 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:33:24.746 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:33:24.746 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:24.746 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:33:24.746 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:33:24.746 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:24.746 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 
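The `nbd_get_count` sequence above (`echo '[]' | jq ... | grep -c /dev/nbd` yielding `count=0`) verifies all devices were unmapped. The wrinkle is that `grep -c` exits non-zero on zero matches even though it prints `0`, which is why the trace shows `true` firing. A self-contained sketch taking the JSON as an argument rather than calling the live RPC:

```shell
#!/bin/sh
# Count nbd devices in a nbd_get_disks reply (JSON passed in, not fetched via RPC).
nbd_get_count() {
    nbd_disks_json=$1
    names=$(echo "$nbd_disks_json" | jq -r '.[] | .nbd_device')
    # grep -c prints the count but exits 1 when it is zero; swallow that
    # so an empty device list reads as a clean "0".
    echo "$names" | grep -c /dev/nbd || true
}
```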
-- # local nbd_list 00:33:24.747 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:33:24.747 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:24.747 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:33:24.747 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:33:24.747 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:24.747 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:33:24.747 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:33:24.747 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:33:24.747 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:33:24.747 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:33:24.747 /dev/nbd0 00:33:24.747 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:33:24.747 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:33:24.747 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:33:24.747 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:24.747 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:24.747 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:24.747 
08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:33:24.747 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:24.747 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:24.747 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:24.747 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:24.747 1+0 records in 00:33:24.747 1+0 records out 00:33:24.747 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00025831 s, 15.9 MB/s 00:33:24.747 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:24.747 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:24.747 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:24.747 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:24.747 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:24.747 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:24.747 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:33:24.747 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd1 00:33:25.005 /dev/nbd1 00:33:25.005 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:33:25.005 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # 
waitfornbd nbd1 00:33:25.005 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:33:25.005 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:25.005 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:25.005 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:25.005 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:33:25.005 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:25.005 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:25.005 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:25.005 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:25.005 1+0 records in 00:33:25.005 1+0 records out 00:33:25.005 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000290232 s, 14.1 MB/s 00:33:25.005 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:25.005 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:25.005 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:25.005 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:25.005 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:25.005 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:25.005 08:46:37 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:33:25.005 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd10 00:33:25.264 /dev/nbd10 00:33:25.264 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:33:25.264 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:33:25.264 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:33:25.264 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:25.264 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:25.264 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:25.264 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:33:25.264 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:25.264 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:25.264 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:25.264 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:25.264 1+0 records in 00:33:25.264 1+0 records out 00:33:25.264 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000212982 s, 19.2 MB/s 00:33:25.264 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:25.264 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:25.264 08:46:37 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:25.264 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:25.264 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:25.264 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:25.264 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:33:25.264 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 /dev/nbd11 00:33:25.523 /dev/nbd11 00:33:25.523 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:33:25.523 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:33:25.523 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:33:25.523 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:25.523 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:25.523 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:25.523 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:33:25.523 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:25.523 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:25.523 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:25.523 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 
00:33:25.523 1+0 records in 00:33:25.523 1+0 records out 00:33:25.523 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000280209 s, 14.6 MB/s 00:33:25.524 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:25.524 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:25.524 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:25.524 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:25.524 08:46:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:25.524 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:25.524 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:33:25.524 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:25.524 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:25.524 08:46:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:25.782 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:33:25.782 { 00:33:25.782 "nbd_device": "/dev/nbd0", 00:33:25.782 "bdev_name": "crypto_ram" 00:33:25.782 }, 00:33:25.782 { 00:33:25.782 "nbd_device": "/dev/nbd1", 00:33:25.782 "bdev_name": "crypto_ram2" 00:33:25.782 }, 00:33:25.782 { 00:33:25.782 "nbd_device": "/dev/nbd10", 00:33:25.782 "bdev_name": "crypto_ram3" 00:33:25.782 }, 00:33:25.782 { 00:33:25.782 "nbd_device": "/dev/nbd11", 00:33:25.782 "bdev_name": "crypto_ram4" 00:33:25.782 } 00:33:25.782 ]' 00:33:25.782 08:46:38 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:33:25.782 { 00:33:25.782 "nbd_device": "/dev/nbd0", 00:33:25.782 "bdev_name": "crypto_ram" 00:33:25.782 }, 00:33:25.782 { 00:33:25.782 "nbd_device": "/dev/nbd1", 00:33:25.782 "bdev_name": "crypto_ram2" 00:33:25.782 }, 00:33:25.782 { 00:33:25.782 "nbd_device": "/dev/nbd10", 00:33:25.782 "bdev_name": "crypto_ram3" 00:33:25.782 }, 00:33:25.782 { 00:33:25.782 "nbd_device": "/dev/nbd11", 00:33:25.782 "bdev_name": "crypto_ram4" 00:33:25.782 } 00:33:25.782 ]' 00:33:25.782 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:25.782 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:33:25.782 /dev/nbd1 00:33:25.782 /dev/nbd10 00:33:25.782 /dev/nbd11' 00:33:25.782 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:33:25.782 /dev/nbd1 00:33:25.782 /dev/nbd10 00:33:25.782 /dev/nbd11' 00:33:25.782 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:33:25.782 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:33:25.782 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:33:25.782 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:33:25.782 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:33:25.782 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:33:25.782 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:25.782 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:33:25.782 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:33:25.782 08:46:38 
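The `nbd_get_disks` reply shown above is a JSON array pairing each `/dev/nbdX` with its bdev, which the harness flattens via `jq -r '.[] | .nbd_device'` into `nbd_disks_name`. A small illustration using a literal two-entry sample in place of the live RPC reply:

```shell
#!/bin/sh
# Literal sample of an nbd_get_disks reply (not a live RPC call).
json='[
  { "nbd_device": "/dev/nbd0", "bdev_name": "crypto_ram"  },
  { "nbd_device": "/dev/nbd1", "bdev_name": "crypto_ram2" }
]'
# Device list, the way nbd_common.sh builds nbd_disks_name:
echo "$json" | jq -r '.[] | .nbd_device'
# Device -> bdev pairs in one pass, handy when debugging a mapping:
echo "$json" | jq -r '.[] | "\(.nbd_device) -> \(.bdev_name)"'
```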
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:25.782 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:33:25.782 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:33:25.782 256+0 records in 00:33:25.782 256+0 records out 00:33:25.782 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0104584 s, 100 MB/s 00:33:25.782 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:25.782 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:33:25.782 256+0 records in 00:33:25.782 256+0 records out 00:33:25.782 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0323497 s, 32.4 MB/s 00:33:25.782 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:25.782 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:33:25.782 256+0 records in 00:33:25.782 256+0 records out 00:33:25.782 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.036854 s, 28.5 MB/s 00:33:25.782 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:25.782 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:33:25.782 256+0 records in 00:33:25.782 256+0 records out 00:33:25.782 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0302249 s, 34.7 MB/s 00:33:25.782 08:46:38 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:25.782 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:33:26.041 256+0 records in 00:33:26.041 256+0 records out 00:33:26.041 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0294409 s, 35.6 MB/s 00:33:26.041 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:33:26.041 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:26.041 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:33:26.041 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:33:26.041 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:26.041 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:33:26.041 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:33:26.041 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:26.041 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:33:26.041 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:26.041 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:33:26.041 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:26.041 08:46:38 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:33:26.041 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:26.041 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:33:26.041 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:26.041 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:33:26.041 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:26.041 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:26.041 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:26.041 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:26.041 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:26.041 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:26.041 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:26.041 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:26.041 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:26.041 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:26.041 08:46:38 blockdev_crypto_aesni.bdev_nbd -- 
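The `nbd_dd_data_verify` flow above has two phases: write 1 MiB of `/dev/urandom` data to every device (`oflag=direct`), then byte-compare each device against the pattern file with `cmp -b -n 1M`. The same flow, demonstrated on plain temporary files standing in for `/dev/nbd*` (illustrative paths, no direct-I/O flags, since real nbd devices are not assumed here):

```shell
#!/bin/sh
# Write/verify pattern from nbd_dd_data_verify, on stand-in files.
pattern=$(mktemp)
dd if=/dev/urandom of="$pattern" bs=4096 count=256 2>/dev/null   # 1 MiB random pattern

for dev in /tmp/fake_nbd0 /tmp/fake_nbd1; do
    dd if="$pattern" of="$dev" bs=4096 count=256 2>/dev/null     # write phase
done

for dev in /tmp/fake_nbd0 /tmp/fake_nbd1; do
    # Verify phase: compare the first 1 MiB byte-for-byte, as the harness does.
    cmp -b -n 1M "$pattern" "$dev" || { echo "mismatch on $dev"; exit 1; }
done
echo "verify OK"

rm -f "$pattern" /tmp/fake_nbd0 /tmp/fake_nbd1
```

On the real devices a mismatch makes `cmp` print the first differing byte and fail the test, which is why the log above shows only the `dd` transfer statistics when everything round-trips correctly.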
bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:26.041 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:26.041 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:26.041 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:26.041 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:26.041 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:33:26.299 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:33:26.299 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:33:26.299 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:33:26.299 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:26.299 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:26.299 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:33:26.299 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:26.299 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:26.299 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:26.299 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:33:26.563 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:33:26.563 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:33:26.563 08:46:38 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:33:26.563 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:26.563 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:26.563 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:33:26.563 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:26.563 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:26.563 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:26.563 08:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:33:26.822 08:46:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:33:26.822 08:46:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:33:26.822 08:46:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:33:26.822 08:46:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:26.822 08:46:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:26.822 08:46:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:33:26.822 08:46:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:26.822 08:46:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:26.822 08:46:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:26.822 08:46:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:26.822 08:46:39 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:26.822 08:46:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:33:26.822 08:46:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:33:26.822 08:46:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:26.822 08:46:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:33:26.822 08:46:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:33:26.822 08:46:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:33:26.822 08:46:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:33:26.822 08:46:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:33:26.822 08:46:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:33:26.822 08:46:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:33:26.822 08:46:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:33:26.822 08:46:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:33:26.822 08:46:39 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:33:26.822 08:46:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:26.822 08:46:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:26.822 08:46:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:33:26.822 08:46:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:33:26.822 08:46:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@135 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:33:27.080 malloc_lvol_verify 00:33:27.080 08:46:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:33:27.339 3b21db2f-cb0b-4730-9292-fa3a6da5afe4 00:33:27.339 08:46:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:33:27.597 2720a688-df63-4375-8aab-0f53c80660a4 00:33:27.597 08:46:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:33:27.597 /dev/nbd0 00:33:27.597 08:46:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:33:27.597 mke2fs 1.46.5 (30-Dec-2021) 00:33:27.597 Discarding device blocks: 0/4096 done 00:33:27.597 Creating filesystem with 4096 1k blocks and 1024 inodes 00:33:27.597 00:33:27.597 Allocating group tables: 0/1 done 00:33:27.597 Writing inode tables: 0/1 done 00:33:27.597 Creating journal (1024 blocks): done 00:33:27.597 Writing superblocks and filesystem accounting information: 0/1 done 00:33:27.597 00:33:27.597 08:46:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:33:27.597 08:46:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:33:27.597 08:46:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:27.597 08:46:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:33:27.597 08:46:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 
00:33:27.597 08:46:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:27.597 08:46:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:27.597 08:46:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:27.855 08:46:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:27.855 08:46:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:27.855 08:46:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:27.855 08:46:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:27.855 08:46:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:27.855 08:46:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:27.855 08:46:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:27.855 08:46:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:27.855 08:46:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:33:27.855 08:46:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:33:27.855 08:46:40 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 1656169 00:33:27.855 08:46:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 1656169 ']' 00:33:27.855 08:46:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 1656169 00:33:27.855 08:46:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:33:27.855 08:46:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:27.855 08:46:40 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1656169 00:33:27.855 08:46:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:33:27.855 08:46:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:33:27.855 08:46:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1656169' 00:33:27.855 killing process with pid 1656169 00:33:27.855 08:46:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@967 -- # kill 1656169 00:33:27.855 08:46:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@972 -- # wait 1656169 00:33:30.501 08:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - SIGINT SIGTERM EXIT 00:33:30.501 00:33:30.501 real 0m11.948s 00:33:30.501 user 0m14.510s 00:33:30.501 sys 0m2.710s 00:33:30.501 08:46:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:30.501 08:46:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:33:30.501 ************************************ 00:33:30.501 END TEST bdev_nbd 00:33:30.501 ************************************ 00:33:30.501 08:46:42 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:33:30.501 08:46:42 blockdev_crypto_aesni -- bdev/blockdev.sh@762 -- # [[ y == y ]] 00:33:30.501 08:46:42 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # '[' crypto_aesni = nvme ']' 00:33:30.501 08:46:42 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # '[' crypto_aesni = gpt ']' 00:33:30.501 08:46:42 blockdev_crypto_aesni -- bdev/blockdev.sh@767 -- # run_test bdev_fio fio_test_suite '' 00:33:30.501 08:46:42 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:33:30.501 08:46:42 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:30.501 08:46:42 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 
00:33:30.501 ************************************ 00:33:30.501 START TEST bdev_fio 00:33:30.501 ************************************ 00:33:30.501 08:46:42 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:33:30.501 08:46:42 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@330 -- # local env_context 00:33:30.501 08:46:42 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@334 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:33:30.501 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:30.501 08:46:42 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@335 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:33:30.501 08:46:42 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@338 -- # echo '' 00:33:30.501 08:46:42 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@338 -- # sed s/--env-context=// 00:33:30.501 08:46:42 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@338 -- # env_context= 00:33:30.501 08:46:42 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:33:30.501 08:46:42 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:30.501 08:46:42 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:33:30.501 08:46:42 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:33:30.501 08:46:42 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:33:30.501 08:46:42 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:33:30.501 08:46:42 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:33:30.501 08:46:42 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:33:30.501 08:46:42 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram]' 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram2]' 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram2 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 
00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram3]' 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram3 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram4]' 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram4 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@346 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@348 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:30.502 ************************************ 00:33:30.502 START TEST bdev_fio_rw_verify 00:33:30.502 ************************************ 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 
--verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 
00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib=/usr/lib64/libasan.so.8 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n /usr/lib64/libasan.so.8 ]] 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1347 -- # break 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD='/usr/lib64/libasan.so.8 /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:33:30.502 08:46:42 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:30.761 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:30.761 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:30.761 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:30.762 job_crypto_ram4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:30.762 fio-3.35 00:33:30.762 Starting 4 threads 00:33:45.641 00:33:45.641 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1659325: Tue Jul 23 08:46:56 2024 00:33:45.641 read: IOPS=24.2k, BW=94.5MiB/s (99.1MB/s)(945MiB/10001msec) 00:33:45.641 slat (usec): min=13, 
max=439, avg=55.67, stdev=29.02 00:33:45.641 clat (usec): min=11, max=1516, avg=307.77, stdev=188.42 00:33:45.641 lat (usec): min=43, max=1585, avg=363.45, stdev=205.00 00:33:45.641 clat percentiles (usec): 00:33:45.641 | 50.000th=[ 265], 99.000th=[ 873], 99.900th=[ 1004], 99.990th=[ 1090], 00:33:45.641 | 99.999th=[ 1352] 00:33:45.641 write: IOPS=26.4k, BW=103MiB/s (108MB/s)(1010MiB/9780msec); 0 zone resets 00:33:45.641 slat (usec): min=15, max=521, avg=65.41, stdev=28.42 00:33:45.641 clat (usec): min=27, max=3099, avg=367.39, stdev=225.95 00:33:45.641 lat (usec): min=68, max=3351, avg=432.80, stdev=241.34 00:33:45.641 clat percentiles (usec): 00:33:45.642 | 50.000th=[ 326], 99.000th=[ 1106], 99.900th=[ 1303], 99.990th=[ 1647], 00:33:45.642 | 99.999th=[ 2900] 00:33:45.642 bw ( KiB/s): min=89896, max=139600, per=98.11%, avg=103738.11, stdev=2723.51, samples=76 00:33:45.642 iops : min=22474, max=34900, avg=25934.53, stdev=680.88, samples=76 00:33:45.642 lat (usec) : 20=0.01%, 50=0.02%, 100=7.69%, 250=32.83%, 500=41.09% 00:33:45.642 lat (usec) : 750=13.12%, 1000=4.10% 00:33:45.642 lat (msec) : 2=1.15%, 4=0.01% 00:33:45.642 cpu : usr=99.36%, sys=0.18%, ctx=129, majf=0, minf=25368 00:33:45.642 IO depths : 1=10.1%, 2=25.6%, 4=51.2%, 8=13.1%, 16=0.0%, 32=0.0%, >=64=0.0% 00:33:45.642 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:45.642 complete : 0=0.0%, 4=88.7%, 8=11.3%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:45.642 issued rwts: total=241875,258537,0,0 short=0,0,0,0 dropped=0,0,0,0 00:33:45.642 latency : target=0, window=0, percentile=100.00%, depth=8 00:33:45.642 00:33:45.642 Run status group 0 (all jobs): 00:33:45.642 READ: bw=94.5MiB/s (99.1MB/s), 94.5MiB/s-94.5MiB/s (99.1MB/s-99.1MB/s), io=945MiB (991MB), run=10001-10001msec 00:33:45.642 WRITE: bw=103MiB/s (108MB/s), 103MiB/s-103MiB/s (108MB/s-108MB/s), io=1010MiB (1059MB), run=9780-9780msec 00:33:47.018 ----------------------------------------------------- 00:33:47.018 Suppressions 
used: 00:33:47.018 count bytes template 00:33:47.018 4 47 /usr/src/fio/parse.c 00:33:47.018 1313 126048 /usr/src/fio/iolog.c 00:33:47.018 1 8 libtcmalloc_minimal.so 00:33:47.018 1 904 libcrypto.so 00:33:47.018 ----------------------------------------------------- 00:33:47.018 00:33:47.018 00:33:47.018 real 0m16.425s 00:33:47.018 user 0m51.074s 00:33:47.018 sys 0m0.741s 00:33:47.018 08:46:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:47.018 08:46:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:33:47.018 ************************************ 00:33:47.018 END TEST bdev_fio_rw_verify 00:33:47.018 ************************************ 00:33:47.018 08:46:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:33:47.018 08:46:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@349 -- # rm -f 00:33:47.018 08:46:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:47.018 08:46:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@353 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:33:47.018 08:46:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:47.019 08:46:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:33:47.019 08:46:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:33:47.019 08:46:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:33:47.019 08:46:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:33:47.019 08:46:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' 
-e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:33:47.019 08:46:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:33:47.019 08:46:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:33:47.019 08:46:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:47.019 08:46:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:33:47.019 08:46:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:33:47.019 08:46:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:33:47.019 08:46:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:33:47.019 08:46:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:33:47.019 08:46:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "7de9bad8-45a9-5247-8040-c82e239f95bb"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "7de9bad8-45a9-5247-8040-c82e239f95bb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' 
"memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "6293bd3c-ce9e-5abf-b0d8-117169017913"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "6293bd3c-ce9e-5abf-b0d8-117169017913",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "daee6235-120f-5395-b90e-b70f31bd65fa"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "daee6235-120f-5395-b90e-b70f31bd65fa",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' 
"zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "a9c5aa43-a972-5c38-b1c7-e65b7a271e98"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "a9c5aa43-a972-5c38-b1c7-e65b7a271e98",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:33:47.019 08:46:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # [[ -n crypto_ram 00:33:47.019 crypto_ram2 00:33:47.019 crypto_ram3 00:33:47.019 crypto_ram4 ]] 00:33:47.019 08:46:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:33:47.019 08:46:59 
blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "7de9bad8-45a9-5247-8040-c82e239f95bb"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "7de9bad8-45a9-5247-8040-c82e239f95bb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "6293bd3c-ce9e-5abf-b0d8-117169017913"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "6293bd3c-ce9e-5abf-b0d8-117169017913",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": 
false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "daee6235-120f-5395-b90e-b70f31bd65fa"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "daee6235-120f-5395-b90e-b70f31bd65fa",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "a9c5aa43-a972-5c38-b1c7-e65b7a271e98"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "a9c5aa43-a972-5c38-b1c7-e65b7a271e98",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": 
false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:33:47.019 08:46:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:47.019 08:46:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram]' 00:33:47.019 08:46:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram 00:33:47.019 08:46:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:47.019 08:46:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram2]' 00:33:47.019 08:46:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram2 00:33:47.019 08:46:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:47.019 08:46:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram3]' 00:33:47.019 08:46:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram3 00:33:47.019 08:46:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:47.019 08:46:59 
blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram4]' 00:33:47.019 08:46:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram4 00:33:47.019 08:46:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@366 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:47.020 08:46:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:33:47.020 08:46:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:47.020 08:46:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:47.020 ************************************ 00:33:47.020 START TEST bdev_fio_trim 00:33:47.020 ************************************ 00:33:47.020 08:46:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:47.020 08:46:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 
--aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:47.020 08:46:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:33:47.020 08:46:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:33:47.020 08:46:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:33:47.020 08:46:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:47.020 08:46:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:33:47.020 08:46:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:33:47.020 08:46:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:47.020 08:46:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:47.020 08:46:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:33:47.020 08:46:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:47.020 08:46:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib=/usr/lib64/libasan.so.8 00:33:47.020 08:46:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n /usr/lib64/libasan.so.8 ]] 00:33:47.020 08:46:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1347 -- # break 00:33:47.020 08:46:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD='/usr/lib64/libasan.so.8 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:33:47.020 08:46:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:47.594 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:47.594 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:47.594 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:47.594 job_crypto_ram4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:47.594 fio-3.35 00:33:47.594 Starting 4 threads 00:34:02.474 00:34:02.474 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1662448: Tue Jul 23 08:47:13 2024 00:34:02.474 write: IOPS=48.1k, BW=188MiB/s (197MB/s)(1880MiB/10001msec); 0 zone resets 00:34:02.474 slat (usec): min=12, max=478, avg=46.49, stdev=26.14 00:34:02.474 clat (usec): min=39, max=1657, avg=210.75, stdev=147.69 00:34:02.474 lat (usec): min=52, max=1713, avg=257.24, stdev=167.06 00:34:02.474 clat percentiles (usec): 00:34:02.474 | 50.000th=[ 167], 99.000th=[ 775], 99.900th=[ 906], 99.990th=[ 988], 00:34:02.474 | 99.999th=[ 1319] 00:34:02.474 bw ( KiB/s): min=146872, max=251384, per=98.70%, avg=189946.11, stdev=11580.14, samples=76 00:34:02.474 iops : min=36718, max=62846, avg=47486.53, stdev=2895.04, samples=76 00:34:02.474 trim: IOPS=48.1k, BW=188MiB/s (197MB/s)(1880MiB/10001msec); 0 zone resets 00:34:02.474 slat (usec): min=4, max=313, avg=13.70, stdev= 6.84 
00:34:02.474 clat (usec): min=43, max=1210, avg=198.16, stdev=97.70 00:34:02.474 lat (usec): min=54, max=1297, avg=211.86, stdev=101.09 00:34:02.474 clat percentiles (usec): 00:34:02.474 | 50.000th=[ 180], 99.000th=[ 529], 99.900th=[ 619], 99.990th=[ 734], 00:34:02.474 | 99.999th=[ 1004] 00:34:02.474 bw ( KiB/s): min=146872, max=251384, per=98.70%, avg=189947.37, stdev=11580.61, samples=76 00:34:02.474 iops : min=36718, max=62846, avg=47486.84, stdev=2895.15, samples=76 00:34:02.474 lat (usec) : 50=0.70%, 100=14.97%, 250=58.96%, 500=21.89%, 750=2.85% 00:34:02.474 lat (usec) : 1000=0.62% 00:34:02.474 lat (msec) : 2=0.01% 00:34:02.474 cpu : usr=99.55%, sys=0.06%, ctx=62, majf=0, minf=7678 00:34:02.474 IO depths : 1=8.0%, 2=26.3%, 4=52.6%, 8=13.1%, 16=0.0%, 32=0.0%, >=64=0.0% 00:34:02.474 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:02.474 complete : 0=0.0%, 4=88.4%, 8=11.6%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:02.474 issued rwts: total=0,481160,481161,0 short=0,0,0,0 dropped=0,0,0,0 00:34:02.474 latency : target=0, window=0, percentile=100.00%, depth=8 00:34:02.474 00:34:02.474 Run status group 0 (all jobs): 00:34:02.474 WRITE: bw=188MiB/s (197MB/s), 188MiB/s-188MiB/s (197MB/s-197MB/s), io=1880MiB (1971MB), run=10001-10001msec 00:34:02.474 TRIM: bw=188MiB/s (197MB/s), 188MiB/s-188MiB/s (197MB/s-197MB/s), io=1880MiB (1971MB), run=10001-10001msec 00:34:03.406 ----------------------------------------------------- 00:34:03.406 Suppressions used: 00:34:03.406 count bytes template 00:34:03.406 4 47 /usr/src/fio/parse.c 00:34:03.406 1 8 libtcmalloc_minimal.so 00:34:03.406 1 904 libcrypto.so 00:34:03.406 ----------------------------------------------------- 00:34:03.406 00:34:03.406 00:34:03.406 real 0m16.446s 00:34:03.406 user 0m51.074s 00:34:03.406 sys 0m0.683s 00:34:03.406 08:47:15 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:03.406 08:47:15 
blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:34:03.406 ************************************ 00:34:03.406 END TEST bdev_fio_trim 00:34:03.406 ************************************ 00:34:03.664 08:47:15 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:34:03.664 08:47:15 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@367 -- # rm -f 00:34:03.664 08:47:15 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:03.664 08:47:15 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@369 -- # popd 00:34:03.664 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:34:03.664 08:47:15 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:34:03.664 00:34:03.664 real 0m33.188s 00:34:03.664 user 1m42.332s 00:34:03.664 sys 0m1.573s 00:34:03.665 08:47:15 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:03.665 08:47:15 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:34:03.665 ************************************ 00:34:03.665 END TEST bdev_fio 00:34:03.665 ************************************ 00:34:03.665 08:47:15 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:34:03.665 08:47:15 blockdev_crypto_aesni -- bdev/blockdev.sh@774 -- # trap cleanup SIGINT SIGTERM EXIT 00:34:03.665 08:47:15 blockdev_crypto_aesni -- bdev/blockdev.sh@776 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:34:03.665 08:47:15 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:34:03.665 08:47:15 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:03.665 08:47:15 blockdev_crypto_aesni -- 
common/autotest_common.sh@10 -- # set +x 00:34:03.665 ************************************ 00:34:03.665 START TEST bdev_verify 00:34:03.665 ************************************ 00:34:03.665 08:47:16 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:34:03.665 [2024-07-23 08:47:16.085019] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:34:03.665 [2024-07-23 08:47:16.085102] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1664731 ] 00:34:03.923 [2024-07-23 08:47:16.210846] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:34:03.923 [2024-07-23 08:47:16.432072] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:03.923 [2024-07-23 08:47:16.432082] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:04.181 [2024-07-23 08:47:16.453322] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:34:04.181 [2024-07-23 08:47:16.461346] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:04.181 [2024-07-23 08:47:16.469362] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:04.439 [2024-07-23 08:47:16.780390] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:34:07.011 [2024-07-23 08:47:19.504588] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:34:07.011 [2024-07-23 08:47:19.504653] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently 
unable to find bdev with name: Malloc0 00:34:07.011 [2024-07-23 08:47:19.504668] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:07.011 [2024-07-23 08:47:19.512608] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:34:07.011 [2024-07-23 08:47:19.512643] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:07.011 [2024-07-23 08:47:19.512654] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:07.011 [2024-07-23 08:47:19.520631] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:34:07.011 [2024-07-23 08:47:19.520660] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:34:07.011 [2024-07-23 08:47:19.520670] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:07.011 [2024-07-23 08:47:19.528658] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:34:07.011 [2024-07-23 08:47:19.528685] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:34:07.011 [2024-07-23 08:47:19.528693] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:07.268 Running I/O for 5 seconds... 
00:34:12.528 00:34:12.528 Latency(us) 00:34:12.528 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:12.528 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:34:12.528 Verification LBA range: start 0x0 length 0x1000 00:34:12.528 crypto_ram : 5.06 632.05 2.47 0.00 0.00 202131.37 4712.35 151793.86 00:34:12.528 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:34:12.528 Verification LBA range: start 0x1000 length 0x1000 00:34:12.528 crypto_ram : 5.06 632.25 2.47 0.00 0.00 202047.00 5991.86 152792.50 00:34:12.528 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:34:12.528 Verification LBA range: start 0x0 length 0x1000 00:34:12.528 crypto_ram2 : 5.06 631.96 2.47 0.00 0.00 201599.13 4930.80 138811.49 00:34:12.528 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:34:12.528 Verification LBA range: start 0x1000 length 0x1000 00:34:12.528 crypto_ram2 : 5.06 632.16 2.47 0.00 0.00 201506.64 6491.18 138811.49 00:34:12.528 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:34:12.528 Verification LBA range: start 0x0 length 0x1000 00:34:12.528 crypto_ram3 : 5.05 4927.61 19.25 0.00 0.00 25731.28 3167.57 23343.30 00:34:12.528 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:34:12.528 Verification LBA range: start 0x1000 length 0x1000 00:34:12.528 crypto_ram3 : 5.05 4951.56 19.34 0.00 0.00 25606.84 2761.87 23842.62 00:34:12.528 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:34:12.528 Verification LBA range: start 0x0 length 0x1000 00:34:12.528 crypto_ram4 : 5.05 4937.91 19.29 0.00 0.00 25641.49 2902.31 20971.52 00:34:12.528 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:34:12.528 Verification LBA range: start 0x1000 length 0x1000 00:34:12.528 crypto_ram4 : 5.06 4962.35 19.38 0.00 0.00 25516.52 2387.38 
20971.52 00:34:12.528 =================================================================================================================== 00:34:12.528 Total : 22307.84 87.14 0.00 0.00 45625.97 2387.38 152792.50 00:34:15.056 00:34:15.056 real 0m11.232s 00:34:15.056 user 0m21.061s 00:34:15.056 sys 0m0.472s 00:34:15.056 08:47:27 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:15.056 08:47:27 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:34:15.056 ************************************ 00:34:15.056 END TEST bdev_verify 00:34:15.056 ************************************ 00:34:15.056 08:47:27 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:34:15.056 08:47:27 blockdev_crypto_aesni -- bdev/blockdev.sh@777 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:34:15.056 08:47:27 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:34:15.056 08:47:27 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:15.056 08:47:27 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:34:15.056 ************************************ 00:34:15.056 START TEST bdev_verify_big_io 00:34:15.056 ************************************ 00:34:15.056 08:47:27 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:34:15.056 [2024-07-23 08:47:27.388615] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:34:15.056 [2024-07-23 08:47:27.388695] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1666766 ] 00:34:15.056 [2024-07-23 08:47:27.510794] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:34:15.314 [2024-07-23 08:47:27.723826] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:15.314 [2024-07-23 08:47:27.723836] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:15.314 [2024-07-23 08:47:27.745096] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:34:15.314 [2024-07-23 08:47:27.753122] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:15.314 [2024-07-23 08:47:27.761144] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:15.571 [2024-07-23 08:47:28.076582] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:34:18.856 [2024-07-23 08:47:30.799110] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:34:18.856 [2024-07-23 08:47:30.799178] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:18.856 [2024-07-23 08:47:30.799190] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:18.856 [2024-07-23 08:47:30.807127] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:34:18.856 [2024-07-23 08:47:30.807155] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:18.856 [2024-07-23 08:47:30.807166] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 
00:34:18.856 [2024-07-23 08:47:30.815148] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:34:18.856 [2024-07-23 08:47:30.815178] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:34:18.856 [2024-07-23 08:47:30.815188] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:18.856 [2024-07-23 08:47:30.823169] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:34:18.856 [2024-07-23 08:47:30.823195] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:34:18.856 [2024-07-23 08:47:30.823204] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:18.856 Running I/O for 5 seconds... 00:34:20.759 [2024-07-23 08:47:33.212225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.759 [2024-07-23 08:47:33.212325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.759 [2024-07-23 08:47:33.212629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.759 [2024-07-23 08:47:33.212667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.759 [2024-07-23 08:47:33.222716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.759 [2024-07-23 08:47:33.222776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.759 [2024-07-23 08:47:33.222809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.759 [2024-07-23 08:47:33.222840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:21.022 [2024-07-23 08:47:33.341461] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.022 [2024-07-23 08:47:33.341557] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.022 [2024-07-23 08:47:33.342655] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.022 [2024-07-23 08:47:33.495723] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.022 [2024-07-23 08:47:33.495793] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.023 [2024-07-23 08:47:33.495847] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.023 [2024-07-23 08:47:33.496137] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.023 [2024-07-23 08:47:33.497283] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.023 [2024-07-23 08:47:33.497351] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.023 [2024-07-23 08:47:33.497412] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.023 [2024-07-23 08:47:33.497460] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.023 [2024-07-23 08:47:33.497867] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:34:21.023 [2024-07-23 08:47:33.497920] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.023 [2024-07-23 08:47:33.497958] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.023 [2024-07-23 08:47:33.497999] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.023 [2024-07-23 08:47:33.499162] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.023 [2024-07-23 08:47:33.499215] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.023 [2024-07-23 08:47:33.499257] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.024 [2024-07-23 08:47:33.499302] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.024 [2024-07-23 08:47:33.499753] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.024 [2024-07-23 08:47:33.499805] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.024 [2024-07-23 08:47:33.499843] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.024 [2024-07-23 08:47:33.499879] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.024 [2024-07-23 08:47:33.501041] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.024 [2024-07-23 08:47:33.501092] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:34:21.024 [2024-07-23 08:47:33.501132] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.024 [2024-07-23 08:47:33.501169] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.024 [2024-07-23 08:47:33.501594] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.025 [2024-07-23 08:47:33.501652] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.025 [2024-07-23 08:47:33.501696] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.025 [2024-07-23 08:47:33.501743] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.025 [2024-07-23 08:47:33.502887] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.025 [2024-07-23 08:47:33.502945] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.025 [2024-07-23 08:47:33.502982] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.025 [2024-07-23 08:47:33.503018] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.025 [2024-07-23 08:47:33.503370] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.025 [2024-07-23 08:47:33.503429] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.025 [2024-07-23 08:47:33.503468] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:34:21.025 [2024-07-23 08:47:33.503511] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.025 [2024-07-23 08:47:33.504555] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.025 [2024-07-23 08:47:33.504633] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.025 [2024-07-23 08:47:33.504673] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.025 [2024-07-23 08:47:33.504709] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.025 [2024-07-23 08:47:33.505079] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.026 [2024-07-23 08:47:33.505130] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.026 [2024-07-23 08:47:33.505168] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.026 [2024-07-23 08:47:33.505205] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.026 [2024-07-23 08:47:33.506269] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.026 [2024-07-23 08:47:33.506320] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.026 [2024-07-23 08:47:33.506362] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.026 [2024-07-23 08:47:33.506415] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:34:21.026 [2024-07-23 08:47:33.506847] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.026 [2024-07-23 08:47:33.506897] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.026 [2024-07-23 08:47:33.506934] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.026 [2024-07-23 08:47:33.506971] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.026 [2024-07-23 08:47:33.508097] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.026 [2024-07-23 08:47:33.508157] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.026 [2024-07-23 08:47:33.508199] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.026 [2024-07-23 08:47:33.508247] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.027 [2024-07-23 08:47:33.508588] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.027 [2024-07-23 08:47:33.508646] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.027 [2024-07-23 08:47:33.508685] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.027 [2024-07-23 08:47:33.508722] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.027 [2024-07-23 08:47:33.509902] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:34:21.027 [2024-07-23 08:47:33.509952] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.027 [2024-07-23 08:47:33.509989] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.027 [2024-07-23 08:47:33.510031] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.027 [2024-07-23 08:47:33.510372] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.027 [2024-07-23 08:47:33.510423] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.027 [2024-07-23 08:47:33.510460] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.027 [2024-07-23 08:47:33.510495] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.027 [2024-07-23 08:47:33.511708] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.027 [2024-07-23 08:47:33.512847] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.027 [2024-07-23 08:47:33.512907] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.027 [2024-07-23 08:47:33.514027] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.027 [2024-07-23 08:47:33.515291] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:34:21.027 [2024-07-23 08:47:33.515349] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:34:21.027 [2024-07-23 08:47:33.516269] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.028 [2024-07-23 08:47:33.516311] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.028 [2024-07-23 08:47:33.517389] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.028 [2024-07-23 08:47:33.518428] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.028 [2024-07-23 08:47:33.518475] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.028 [2024-07-23 08:47:33.518783] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.028 [2024-07-23 08:47:33.520262] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.028 [2024-07-23 08:47:33.520322] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.028 [2024-07-23 08:47:33.520935] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.028 [2024-07-23 08:47:33.520977] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.028 [2024-07-23 08:47:33.521971] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.028 [2024-07-23 08:47:33.522958] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.028 [2024-07-23 08:47:33.523003] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.028 [2024-07-23 08:47:33.523951] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.028 [2024-07-23 08:47:33.525211] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.028 [2024-07-23 08:47:33.525266] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.029 [2024-07-23 08:47:33.526421] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.029 [2024-07-23 08:47:33.526471] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.029 [2024-07-23 08:47:33.527563] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.029 [2024-07-23 08:47:33.528423] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.029 [2024-07-23 08:47:33.528470] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.029 [2024-07-23 08:47:33.529130] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.029 [2024-07-23 08:47:33.530543] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.029 [2024-07-23 08:47:33.530596] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.029 [2024-07-23 08:47:33.531052] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.029 [2024-07-23 08:47:33.531101] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.029 [2024-07-23 08:47:33.532347] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.029 [2024-07-23 08:47:33.533575] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.029 [2024-07-23 08:47:33.533634] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.029 [2024-07-23 08:47:33.533975] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.030 [2024-07-23 08:47:33.535333] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.030 [2024-07-23 08:47:33.535387] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.030 [2024-07-23 08:47:33.536592] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.030 [2024-07-23 08:47:33.536646] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.030 [2024-07-23 08:47:33.538078] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.030 [2024-07-23 08:47:33.539162] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.030 [2024-07-23 08:47:33.539215] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.291 [2024-07-23 08:47:33.540282] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.291 [2024-07-23 08:47:33.541744] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.291 [2024-07-23 08:47:33.541799] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.291 [2024-07-23 08:47:33.542595] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.291 [2024-07-23 08:47:33.542645] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.291 [2024-07-23 08:47:33.543734] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.291 [2024-07-23 08:47:33.544051] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.291 [2024-07-23 08:47:33.544095] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.291 [2024-07-23 08:47:33.545069] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.291 [2024-07-23 08:47:33.545668] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.291 [2024-07-23 08:47:33.545724] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.291 [2024-07-23 08:47:33.546779] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.291 [2024-07-23 08:47:33.546825] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.291 [2024-07-23 08:47:33.547859] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.291 [2024-07-23 08:47:33.549003] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.291 [2024-07-23 08:47:33.549047] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.291 [2024-07-23 08:47:33.549334] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.291 [2024-07-23 08:47:33.550878] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.550940] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.551516] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.551559] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.552497] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.552817] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.552864] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.553836] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.555361] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.555416] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.556079] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.556121] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.557210] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.558028] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.558072] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.558917] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.559964] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.560018] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.560980] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.561023] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.564453] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.565378] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.565423] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.565720] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.566320] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.566373] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.567680] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.567721] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.568679] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.569298] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.569345] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.570491] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.572132] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.572188] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.572576] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.572623] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.573616] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.574579] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.574630] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.575783] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.577477] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.577530] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.578710] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.578756] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.580701] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.581855] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.581906] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.583169] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.584131] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.584189] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.585346] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.585389] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.588385] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.589332] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.589377] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.590337] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.591128] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.591183] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.592225] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.592268] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.595411] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.595466] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.596447] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.596488] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.597762] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.597814] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.598777] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.598820] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.602674] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.602734] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.603585] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.603631] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.605095] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.605153] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.605442] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.605481] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.609683] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.609747] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.610699] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.610746] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.611529] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.611581] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.612367] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.612408] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.616155] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.616209] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.616838] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.616880] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.618496] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.618552] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.619725] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.619768] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.292 [2024-07-23 08:47:33.623896] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.293 [2024-07-23 08:47:33.623956] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.293 [2024-07-23 08:47:33.625196] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.293 [2024-07-23 08:47:33.625245] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.293 [2024-07-23 08:47:33.626724] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.293 [2024-07-23 08:47:33.626777] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.293 [2024-07-23 08:47:33.627686] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.293 [2024-07-23 08:47:33.627729] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.293 [2024-07-23 08:47:33.630656] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.293 [2024-07-23 08:47:33.630709] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.293 [2024-07-23 08:47:33.631384] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.293 [2024-07-23 08:47:33.631424] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.293 [2024-07-23 08:47:33.632683] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.293 [2024-07-23 08:47:33.632736] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.293 [2024-07-23 08:47:33.632775] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.293 [2024-07-23 08:47:33.633737] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.293 [2024-07-23 08:47:33.637549] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.293 [2024-07-23 08:47:33.637603] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.293 [2024-07-23 08:47:33.637898] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.293 [2024-07-23 08:47:33.637937] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.293 [2024-07-23 08:47:33.638385] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.293 [2024-07-23 08:47:33.639682] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.293 [2024-07-23 08:47:33.639732] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.293 [2024-07-23 08:47:33.640982] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.293 [2024-07-23 08:47:33.645263] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.293 [2024-07-23 08:47:33.645316] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.293 [2024-07-23 08:47:33.645926] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.293 [2024-07-23 08:47:33.645974] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.293 [2024-07-23 08:47:33.646396] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.293 [2024-07-23 08:47:33.646743] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.293 [2024-07-23 08:47:33.647738] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.293 [2024-07-23 08:47:33.647780] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.293 [2024-07-23 08:47:33.651587] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.551 [2024-07-23 08:47:33.870309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:21.551 [2024-07-23 08:47:33.870432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:21.551 [2024-07-23 08:47:33.870473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:21.551 [2024-07-23 08:47:33.870495] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.809 [2024-07-23 08:47:34.289124] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.809 [2024-07-23 08:47:34.289220] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:21.809 [2024-07-23 08:47:34.290544] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:22.378 [2024-07-23 08:47:34.848190] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:22.378 [2024-07-23 08:47:34.848275] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:22.378 [2024-07-23 08:47:34.848313] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:22.378 [2024-07-23 08:47:34.848333] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:23.315 [2024-07-23 08:47:35.675003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:23.315 [2024-07-23 08:47:35.676276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:23.315 [2024-07-23 08:47:35.676322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:23.315 [2024-07-23 08:47:35.676356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:23.574 [2024-07-23 08:47:35.940434] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:23.574 [2024-07-23 08:47:35.940530] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:23.574 [2024-07-23 08:47:35.940573] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:23.574 [2024-07-23 08:47:35.941637] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:34:24.512
00:34:24.512 Latency(us)
00:34:24.512 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:34:24.512 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:34:24.512 Verification LBA range: start 0x0 length 0x100
00:34:24.512 crypto_ram : 5.54 52.91 3.31 0.00 0.00 2290916.70 14417.92 1997287.62
00:34:24.512 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:34:24.512 Verification LBA range: start 0x100 length 0x100
00:34:24.512 crypto_ram : 5.55 53.90 3.37 0.00 0.00 2240205.04 3744.91 1957341.87
00:34:24.512 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:34:24.512 Verification LBA range: start 0x0 length 0x100
00:34:24.512 crypto_ram2 : 5.57 57.50 3.59 0.00 0.00 2079656.37 4805.97 1997287.62
00:34:24.512 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:34:24.512 Verification LBA range: start 0x100 length 0x100
00:34:24.512 crypto_ram2 : 5.58 59.83 3.74 0.00 0.00 2005525.59 8238.81 1957341.87
00:34:24.512 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:34:24.512 Verification LBA range: start 0x0 length 0x100
00:34:24.512 crypto_ram3 : 5.43 388.21 24.26 0.00 0.00 302835.36 11172.33 547256.81
00:34:24.512 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:34:24.512 Verification LBA range: start 0x100 length 0x100
00:34:24.512 crypto_ram3 : 5.44 394.33 24.65 0.00 0.00 298527.17 49932.19 431414.13
00:34:24.512 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:34:24.512 Verification LBA range: start 0x0 length 0x100
00:34:24.512 crypto_ram4 : 5.50 404.90 25.31 0.00 0.00 284092.27 2122.12 419430.40
00:34:24.512 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:34:24.512 Verification LBA range: start 0x100 length 0x100
00:34:24.512 crypto_ram4 : 5.50 409.08 25.57 0.00 0.00 281588.12 8800.55 423424.98
00:34:24.512 ===================================================================================================================
00:34:24.512 Total : 1820.66 113.79 0.00 0.00 523276.78 2122.12 1997287.62
00:34:27.082
00:34:27.082 real 0m11.907s
00:34:27.082 user 0m22.411s
00:34:27.082 sys 0m0.506s
00:34:27.082 08:47:39 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:34:27.082 08:47:39 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:34:27.082 ************************************
00:34:27.082 END TEST bdev_verify_big_io
00:34:27.082 ************************************
00:34:27.082 08:47:39 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:34:27.082 08:47:39 blockdev_crypto_aesni -- bdev/blockdev.sh@778 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:34:27.082 08:47:39 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:34:27.082 08:47:39 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:34:27.082 08:47:39 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:34:27.082 ************************************
00:34:27.082 START TEST bdev_write_zeroes
00:34:27.082 ************************************
00:34:27.082 08:47:39 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:34:27.082 [2024-07-23 08:47:39.362596] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization...
00:34:27.082 [2024-07-23 08:47:39.362688] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1668805 ]
00:34:27.082 [2024-07-23 08:47:39.488315] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:34:27.341 [2024-07-23 08:47:39.709791] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:34:27.341 [2024-07-23 08:47:39.731019] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb
00:34:27.341 [2024-07-23 08:47:39.739040] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:34:27.341 [2024-07-23 08:47:39.747060] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:34:27.599 [2024-07-23 08:47:40.044411] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97
00:34:30.887 [2024-07-23 08:47:42.778275] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1"
00:34:30.887 [2024-07-23 08:47:42.778333] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:34:30.887 [2024-07-23 08:47:42.778345] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:30.887 [2024-07-23 08:47:42.786295] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2"
00:34:30.887 [2024-07-23 08:47:42.786327] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:34:30.887 [2024-07-23 08:47:42.786337] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:30.887 [2024-07-23 08:47:42.794322] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3"
00:34:30.887 [2024-07-23 08:47:42.794350] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:34:30.887 [2024-07-23 08:47:42.794359] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:30.887 [2024-07-23 08:47:42.802327] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4"
00:34:30.887 [2024-07-23 08:47:42.802356] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:34:30.887 [2024-07-23 08:47:42.802364] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:30.887 Running I/O for 1 seconds...
00:34:31.822 00:34:31.822 Latency(us) 00:34:31.822 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:31.822 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:34:31.822 crypto_ram : 1.02 2514.87 9.82 0.00 0.00 50568.62 4431.48 60417.95 00:34:31.822 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:34:31.822 crypto_ram2 : 1.02 2520.90 9.85 0.00 0.00 50227.24 4244.24 60417.95 00:34:31.822 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:34:31.822 crypto_ram3 : 1.02 19513.33 76.22 0.00 0.00 6478.89 1919.27 8426.06 00:34:31.822 Job: crypto_ram4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:34:31.822 crypto_ram4 : 1.02 19552.89 76.38 0.00 0.00 6450.68 1911.47 7895.53 00:34:31.822 =================================================================================================================== 00:34:31.822 Total : 44101.99 172.27 0.00 0.00 11499.15 1911.47 60417.95 00:34:34.383 00:34:34.383 real 0m7.073s 00:34:34.383 user 0m6.546s 00:34:34.383 sys 0m0.469s 00:34:34.383 08:47:46 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:34.383 08:47:46 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:34:34.383 ************************************ 00:34:34.383 END TEST bdev_write_zeroes 00:34:34.383 ************************************ 00:34:34.383 08:47:46 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:34:34.383 08:47:46 blockdev_crypto_aesni -- bdev/blockdev.sh@781 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:34:34.383 08:47:46 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:34:34.383 08:47:46 
blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:34.383 08:47:46 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:34:34.383 ************************************ 00:34:34.383 START TEST bdev_json_nonenclosed 00:34:34.383 ************************************ 00:34:34.383 08:47:46 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:34:34.383 [2024-07-23 08:47:46.509262] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:34:34.383 [2024-07-23 08:47:46.509346] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1670089 ] 00:34:34.383 [2024-07-23 08:47:46.638576] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:34.383 [2024-07-23 08:47:46.851704] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:34.383 [2024-07-23 08:47:46.851779] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
00:34:34.383 [2024-07-23 08:47:46.851795] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:34:34.383 [2024-07-23 08:47:46.851805] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:34:34.951 00:34:34.951 real 0m0.840s 00:34:34.951 user 0m0.660s 00:34:34.951 sys 0m0.175s 00:34:34.951 08:47:47 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:34:34.951 08:47:47 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:34.951 08:47:47 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:34:34.951 ************************************ 00:34:34.951 END TEST bdev_json_nonenclosed 00:34:34.951 ************************************ 00:34:34.951 08:47:47 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234 00:34:34.951 08:47:47 blockdev_crypto_aesni -- bdev/blockdev.sh@781 -- # true 00:34:34.951 08:47:47 blockdev_crypto_aesni -- bdev/blockdev.sh@784 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:34:34.951 08:47:47 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:34:34.951 08:47:47 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:34.951 08:47:47 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:34:34.951 ************************************ 00:34:34.951 START TEST bdev_json_nonarray 00:34:34.951 ************************************ 00:34:34.951 08:47:47 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 
00:34:34.951 [2024-07-23 08:47:47.417037] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:34:34.951 [2024-07-23 08:47:47.417116] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1670338 ] 00:34:35.211 [2024-07-23 08:47:47.536796] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:35.469 [2024-07-23 08:47:47.760999] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:35.469 [2024-07-23 08:47:47.761084] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:34:35.470 [2024-07-23 08:47:47.761102] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:34:35.470 [2024-07-23 08:47:47.761112] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:34:35.728 00:34:35.728 real 0m0.861s 00:34:35.728 user 0m0.700s 00:34:35.728 sys 0m0.156s 00:34:35.728 08:47:48 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:34:35.728 08:47:48 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:35.728 08:47:48 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:34:35.728 ************************************ 00:34:35.728 END TEST bdev_json_nonarray 00:34:35.728 ************************************ 00:34:35.728 08:47:48 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234 00:34:35.728 08:47:48 blockdev_crypto_aesni -- bdev/blockdev.sh@784 -- # true 00:34:35.728 08:47:48 blockdev_crypto_aesni -- bdev/blockdev.sh@786 -- # [[ crypto_aesni == bdev ]] 00:34:35.728 08:47:48 blockdev_crypto_aesni -- bdev/blockdev.sh@793 -- # [[ crypto_aesni == gpt ]] 00:34:35.728 08:47:48 blockdev_crypto_aesni 
-- bdev/blockdev.sh@797 -- # [[ crypto_aesni == crypto_sw ]] 00:34:35.728 08:47:48 blockdev_crypto_aesni -- bdev/blockdev.sh@809 -- # trap - SIGINT SIGTERM EXIT 00:34:35.728 08:47:48 blockdev_crypto_aesni -- bdev/blockdev.sh@810 -- # cleanup 00:34:35.728 08:47:48 blockdev_crypto_aesni -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:34:35.728 08:47:48 blockdev_crypto_aesni -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:34:35.728 08:47:48 blockdev_crypto_aesni -- bdev/blockdev.sh@26 -- # [[ crypto_aesni == rbd ]] 00:34:35.728 08:47:48 blockdev_crypto_aesni -- bdev/blockdev.sh@30 -- # [[ crypto_aesni == daos ]] 00:34:35.728 08:47:48 blockdev_crypto_aesni -- bdev/blockdev.sh@34 -- # [[ crypto_aesni = \g\p\t ]] 00:34:35.728 08:47:48 blockdev_crypto_aesni -- bdev/blockdev.sh@40 -- # [[ crypto_aesni == xnvme ]] 00:34:35.987 00:34:35.987 real 1m39.570s 00:34:35.987 user 3m22.262s 00:34:35.987 sys 0m8.385s 00:34:35.987 08:47:48 blockdev_crypto_aesni -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:35.987 08:47:48 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:34:35.987 ************************************ 00:34:35.987 END TEST blockdev_crypto_aesni 00:34:35.987 ************************************ 00:34:35.987 08:47:48 -- common/autotest_common.sh@1142 -- # return 0 00:34:35.987 08:47:48 -- spdk/autotest.sh@358 -- # run_test blockdev_crypto_sw /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:34:35.987 08:47:48 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:34:35.987 08:47:48 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:35.987 08:47:48 -- common/autotest_common.sh@10 -- # set +x 00:34:35.987 ************************************ 00:34:35.987 START TEST blockdev_crypto_sw 00:34:35.987 ************************************ 00:34:35.987 08:47:48 blockdev_crypto_sw -- 
common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:34:35.987 * Looking for test storage... 00:34:35.987 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:34:35.987 08:47:48 blockdev_crypto_sw -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:34:35.987 08:47:48 blockdev_crypto_sw -- bdev/nbd_common.sh@6 -- # set -e 00:34:35.987 08:47:48 blockdev_crypto_sw -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:34:35.987 08:47:48 blockdev_crypto_sw -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:34:35.987 08:47:48 blockdev_crypto_sw -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:34:35.987 08:47:48 blockdev_crypto_sw -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:34:35.987 08:47:48 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:34:35.987 08:47:48 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:34:35.987 08:47:48 blockdev_crypto_sw -- bdev/blockdev.sh@20 -- # : 00:34:35.987 08:47:48 blockdev_crypto_sw -- bdev/blockdev.sh@669 -- # QOS_DEV_1=Malloc_0 00:34:35.987 08:47:48 blockdev_crypto_sw -- bdev/blockdev.sh@670 -- # QOS_DEV_2=Null_1 00:34:35.987 08:47:48 blockdev_crypto_sw -- bdev/blockdev.sh@671 -- # QOS_RUN_TIME=5 00:34:35.987 08:47:48 blockdev_crypto_sw -- bdev/blockdev.sh@673 -- # uname -s 00:34:35.987 08:47:48 blockdev_crypto_sw -- bdev/blockdev.sh@673 -- # '[' Linux = Linux ']' 00:34:35.987 08:47:48 blockdev_crypto_sw -- bdev/blockdev.sh@675 -- # PRE_RESERVED_MEM=0 00:34:35.987 08:47:48 blockdev_crypto_sw -- bdev/blockdev.sh@681 -- # test_type=crypto_sw 00:34:35.987 08:47:48 blockdev_crypto_sw -- bdev/blockdev.sh@682 -- # crypto_device= 
00:34:35.987 08:47:48 blockdev_crypto_sw -- bdev/blockdev.sh@683 -- # dek= 00:34:35.987 08:47:48 blockdev_crypto_sw -- bdev/blockdev.sh@684 -- # env_ctx= 00:34:35.987 08:47:48 blockdev_crypto_sw -- bdev/blockdev.sh@685 -- # wait_for_rpc= 00:34:35.987 08:47:48 blockdev_crypto_sw -- bdev/blockdev.sh@686 -- # '[' -n '' ']' 00:34:35.987 08:47:48 blockdev_crypto_sw -- bdev/blockdev.sh@689 -- # [[ crypto_sw == bdev ]] 00:34:35.987 08:47:48 blockdev_crypto_sw -- bdev/blockdev.sh@689 -- # [[ crypto_sw == crypto_* ]] 00:34:35.987 08:47:48 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # wait_for_rpc=--wait-for-rpc 00:34:35.987 08:47:48 blockdev_crypto_sw -- bdev/blockdev.sh@692 -- # start_spdk_tgt 00:34:35.987 08:47:48 blockdev_crypto_sw -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1670639 00:34:35.987 08:47:48 blockdev_crypto_sw -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:34:35.987 08:47:48 blockdev_crypto_sw -- bdev/blockdev.sh@49 -- # waitforlisten 1670639 00:34:35.987 08:47:48 blockdev_crypto_sw -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:34:35.987 08:47:48 blockdev_crypto_sw -- common/autotest_common.sh@829 -- # '[' -z 1670639 ']' 00:34:35.987 08:47:48 blockdev_crypto_sw -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:35.987 08:47:48 blockdev_crypto_sw -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:35.987 08:47:48 blockdev_crypto_sw -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:35.987 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:34:35.987 08:47:48 blockdev_crypto_sw -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:35.987 08:47:48 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:36.246 [2024-07-23 08:47:48.514059] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:34:36.246 [2024-07-23 08:47:48.514160] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1670639 ] 00:34:36.246 [2024-07-23 08:47:48.639279] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:36.505 [2024-07-23 08:47:48.869554] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:36.763 08:47:49 blockdev_crypto_sw -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:36.763 08:47:49 blockdev_crypto_sw -- common/autotest_common.sh@862 -- # return 0 00:34:36.764 08:47:49 blockdev_crypto_sw -- bdev/blockdev.sh@693 -- # case "$test_type" in 00:34:36.764 08:47:49 blockdev_crypto_sw -- bdev/blockdev.sh@710 -- # setup_crypto_sw_conf 00:34:36.764 08:47:49 blockdev_crypto_sw -- bdev/blockdev.sh@192 -- # rpc_cmd 00:34:36.764 08:47:49 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:36.764 08:47:49 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:38.139 Malloc0 00:34:38.139 Malloc1 00:34:38.139 true 00:34:38.139 true 00:34:38.139 true 00:34:38.139 [2024-07-23 08:47:50.352894] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:34:38.139 crypto_ram 00:34:38.139 [2024-07-23 08:47:50.360896] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:34:38.139 crypto_ram2 00:34:38.139 [2024-07-23 08:47:50.368930] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:34:38.139 crypto_ram3 00:34:38.139 [ 00:34:38.139 { 
00:34:38.139 "name": "Malloc1", 00:34:38.139 "aliases": [ 00:34:38.139 "7ab27dfd-add0-4e6e-88a7-642981015e4a" 00:34:38.139 ], 00:34:38.139 "product_name": "Malloc disk", 00:34:38.139 "block_size": 4096, 00:34:38.139 "num_blocks": 4096, 00:34:38.139 "uuid": "7ab27dfd-add0-4e6e-88a7-642981015e4a", 00:34:38.139 "assigned_rate_limits": { 00:34:38.139 "rw_ios_per_sec": 0, 00:34:38.139 "rw_mbytes_per_sec": 0, 00:34:38.139 "r_mbytes_per_sec": 0, 00:34:38.139 "w_mbytes_per_sec": 0 00:34:38.139 }, 00:34:38.139 "claimed": true, 00:34:38.139 "claim_type": "exclusive_write", 00:34:38.139 "zoned": false, 00:34:38.139 "supported_io_types": { 00:34:38.139 "read": true, 00:34:38.139 "write": true, 00:34:38.139 "unmap": true, 00:34:38.139 "flush": true, 00:34:38.139 "reset": true, 00:34:38.139 "nvme_admin": false, 00:34:38.139 "nvme_io": false, 00:34:38.139 "nvme_io_md": false, 00:34:38.139 "write_zeroes": true, 00:34:38.139 "zcopy": true, 00:34:38.139 "get_zone_info": false, 00:34:38.139 "zone_management": false, 00:34:38.139 "zone_append": false, 00:34:38.139 "compare": false, 00:34:38.139 "compare_and_write": false, 00:34:38.139 "abort": true, 00:34:38.139 "seek_hole": false, 00:34:38.139 "seek_data": false, 00:34:38.139 "copy": true, 00:34:38.139 "nvme_iov_md": false 00:34:38.139 }, 00:34:38.139 "memory_domains": [ 00:34:38.139 { 00:34:38.139 "dma_device_id": "system", 00:34:38.139 "dma_device_type": 1 00:34:38.139 }, 00:34:38.139 { 00:34:38.139 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:38.139 "dma_device_type": 2 00:34:38.139 } 00:34:38.139 ], 00:34:38.139 "driver_specific": {} 00:34:38.139 } 00:34:38.139 ] 00:34:38.139 08:47:50 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:38.139 08:47:50 blockdev_crypto_sw -- bdev/blockdev.sh@736 -- # rpc_cmd bdev_wait_for_examine 00:34:38.139 08:47:50 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:38.139 08:47:50 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 
00:34:38.139 08:47:50 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:38.139 08:47:50 blockdev_crypto_sw -- bdev/blockdev.sh@739 -- # cat 00:34:38.139 08:47:50 blockdev_crypto_sw -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n accel 00:34:38.139 08:47:50 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:38.139 08:47:50 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:38.139 08:47:50 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:38.139 08:47:50 blockdev_crypto_sw -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n bdev 00:34:38.139 08:47:50 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:38.139 08:47:50 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:38.139 08:47:50 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:38.139 08:47:50 blockdev_crypto_sw -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n iobuf 00:34:38.139 08:47:50 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:38.139 08:47:50 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:38.139 08:47:50 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:38.139 08:47:50 blockdev_crypto_sw -- bdev/blockdev.sh@747 -- # mapfile -t bdevs 00:34:38.139 08:47:50 blockdev_crypto_sw -- bdev/blockdev.sh@747 -- # rpc_cmd bdev_get_bdevs 00:34:38.139 08:47:50 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:38.139 08:47:50 blockdev_crypto_sw -- bdev/blockdev.sh@747 -- # jq -r '.[] | select(.claimed == false)' 00:34:38.139 08:47:50 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:38.139 08:47:50 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:38.139 08:47:50 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # mapfile -t bdevs_name 00:34:38.139 08:47:50 blockdev_crypto_sw 
-- bdev/blockdev.sh@748 -- # jq -r .name 00:34:38.139 08:47:50 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "d31b73b2-faf9-5d2d-a0a0-cbb925367012"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "d31b73b2-faf9-5d2d-a0a0-cbb925367012",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "dc2b370a-d0fe-5131-aa98-57bc44b53054"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "dc2b370a-d0fe-5131-aa98-57bc44b53054",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": 
false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:34:38.139 08:47:50 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # bdev_list=("${bdevs_name[@]}") 00:34:38.139 08:47:50 blockdev_crypto_sw -- bdev/blockdev.sh@751 -- # hello_world_bdev=crypto_ram 00:34:38.139 08:47:50 blockdev_crypto_sw -- bdev/blockdev.sh@752 -- # trap - SIGINT SIGTERM EXIT 00:34:38.139 08:47:50 blockdev_crypto_sw -- bdev/blockdev.sh@753 -- # killprocess 1670639 00:34:38.139 08:47:50 blockdev_crypto_sw -- common/autotest_common.sh@948 -- # '[' -z 1670639 ']' 00:34:38.139 08:47:50 blockdev_crypto_sw -- common/autotest_common.sh@952 -- # kill -0 1670639 00:34:38.139 08:47:50 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # uname 00:34:38.139 08:47:50 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:38.139 08:47:50 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1670639 00:34:38.139 08:47:50 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:38.139 08:47:50 blockdev_crypto_sw -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:38.139 08:47:50 blockdev_crypto_sw -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1670639' 00:34:38.139 killing process with pid 1670639 00:34:38.139 08:47:50 blockdev_crypto_sw -- common/autotest_common.sh@967 -- # kill 1670639 00:34:38.139 08:47:50 blockdev_crypto_sw -- common/autotest_common.sh@972 -- # wait 1670639 00:34:40.670 08:47:53 blockdev_crypto_sw -- bdev/blockdev.sh@757 -- # trap cleanup SIGINT SIGTERM EXIT 00:34:40.670 08:47:53 blockdev_crypto_sw -- bdev/blockdev.sh@759 -- # run_test 
bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:34:40.670 08:47:53 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:34:40.670 08:47:53 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:40.670 08:47:53 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:40.670 ************************************ 00:34:40.670 START TEST bdev_hello_world 00:34:40.670 ************************************ 00:34:40.671 08:47:53 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:34:40.928 [2024-07-23 08:47:53.250587] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:34:40.928 [2024-07-23 08:47:53.250800] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1671422 ] 00:34:40.928 [2024-07-23 08:47:53.372268] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:41.186 [2024-07-23 08:47:53.585482] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:41.754 [2024-07-23 08:47:54.053705] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:34:41.754 [2024-07-23 08:47:54.053775] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:41.754 [2024-07-23 08:47:54.053791] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:41.754 [2024-07-23 08:47:54.061718] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key 
"test_dek_sw2" 00:34:41.754 [2024-07-23 08:47:54.061751] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:41.754 [2024-07-23 08:47:54.061761] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:41.754 [2024-07-23 08:47:54.069727] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:34:41.754 [2024-07-23 08:47:54.069752] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:34:41.754 [2024-07-23 08:47:54.069761] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:41.754 [2024-07-23 08:47:54.141191] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:34:41.754 [2024-07-23 08:47:54.141220] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:34:41.754 [2024-07-23 08:47:54.141236] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:34:41.754 [2024-07-23 08:47:54.142676] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:34:41.754 [2024-07-23 08:47:54.142761] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:34:41.754 [2024-07-23 08:47:54.142777] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:34:41.754 [2024-07-23 08:47:54.142805] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:34:41.754
00:34:41.754 [2024-07-23 08:47:54.142824] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app
00:34:43.158
00:34:43.158 real 0m2.302s
00:34:43.158 user 0m2.010s
00:34:43.158 sys 0m0.257s
00:34:43.158 08:47:55 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable
00:34:43.158 08:47:55 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x
00:34:43.158 ************************************
00:34:43.158 END TEST bdev_hello_world
00:34:43.158 ************************************
00:34:43.158 08:47:55 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0
00:34:43.158 08:47:55 blockdev_crypto_sw -- bdev/blockdev.sh@760 -- # run_test bdev_bounds bdev_bounds ''
00:34:43.158 08:47:55 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']'
00:34:43.158 08:47:55 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable
00:34:43.158 08:47:55 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x
00:34:43.158 ************************************
00:34:43.158 START TEST bdev_bounds
00:34:43.158 ************************************
00:34:43.158 08:47:55 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds ''
00:34:43.158 08:47:55 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=1671924
00:34:43.158 08:47:55 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT
00:34:43.158 08:47:55 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 1671924'
00:34:43.158 Process bdevio pid: 1671924
00:34:43.158 08:47:55 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@292 -- # waitforlisten 1671924
00:34:43.158 08:47:55 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 1671924 ']'
00:34:43.158 08:47:55 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:34:43.158 08:47:55 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100
00:34:43.158 08:47:55 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:34:43.158 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:34:43.158 08:47:55 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable
00:34:43.158 08:47:55 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x
00:34:43.158 08:47:55 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@288 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json ''
00:34:43.158 [2024-07-23 08:47:55.620348] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization...
[2024-07-23 08:47:55.620437] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1671924 ]
00:34:43.417 [2024-07-23 08:47:55.744547] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3
00:34:43.676 [2024-07-23 08:47:55.965736] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:34:43.676 [2024-07-23 08:47:55.965803] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:34:43.676 [2024-07-23 08:47:55.965810] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2
00:34:43.935 [2024-07-23 08:47:56.421131] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw"
00:34:43.935 [2024-07-23 08:47:56.421193] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:34:43.935 [2024-07-23 08:47:56.421210] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:43.935 [2024-07-23 08:47:56.429156] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2"
00:34:43.935 [2024-07-23 08:47:56.429186] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:34:43.935 [2024-07-23 08:47:56.429195] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:43.935 [2024-07-23 08:47:56.437160] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3"
00:34:43.935 [2024-07-23 08:47:56.437189] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2
00:34:43.935 [2024-07-23 08:47:56.437199] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:44.194 08:47:56 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:34:44.194 08:47:56 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@862 -- # return 0
00:34:44.194 08:47:56 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests
00:34:44.194 I/O targets:
00:34:44.194 crypto_ram: 32768 blocks of 512 bytes (16 MiB)
00:34:44.194 crypto_ram3: 4096 blocks of 4096 bytes (16 MiB)
00:34:44.194
00:34:44.194
00:34:44.194 CUnit - A unit testing framework for C - Version 2.1-3
00:34:44.194 http://cunit.sourceforge.net/
00:34:44.194
00:34:44.194
00:34:44.194 Suite: bdevio tests on: crypto_ram3
00:34:44.194 Test: blockdev write read block ...passed
00:34:44.194 Test: blockdev write zeroes read block ...passed
00:34:44.194 Test: blockdev write zeroes read no split ...passed
00:34:44.194 Test: blockdev write zeroes read split ...passed
00:34:44.194 Test: blockdev write zeroes read split partial ...passed
00:34:44.194 Test: blockdev reset ...passed
00:34:44.194 Test: blockdev write read 8 blocks ...passed
00:34:44.194 Test: blockdev write read size > 128k ...passed
00:34:44.194 Test: blockdev write read invalid size ...passed
00:34:44.194 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:34:44.194 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:34:44.194 Test: blockdev write read max offset ...passed
00:34:44.194 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:34:44.194 Test: blockdev writev readv 8 blocks ...passed
00:34:44.194 Test: blockdev writev readv 30 x 1block ...passed
00:34:44.194 Test: blockdev writev readv block ...passed
00:34:44.194 Test: blockdev writev readv size > 128k ...passed
00:34:44.194 Test: blockdev writev readv size > 128k in two iovs ...passed
00:34:44.194 Test: blockdev comparev and writev ...passed
00:34:44.194 Test: blockdev nvme passthru rw ...passed
00:34:44.194 Test: blockdev nvme passthru vendor specific ...passed
00:34:44.194 Test: blockdev nvme admin passthru ...passed
00:34:44.194 Test: blockdev copy ...passed
00:34:44.194 Suite: bdevio tests on: crypto_ram
00:34:44.194 Test: blockdev write read block ...passed
00:34:44.194 Test: blockdev write zeroes read block ...passed
00:34:44.194 Test: blockdev write zeroes read no split ...passed
00:34:44.453 Test: blockdev write zeroes read split ...passed
00:34:44.453 Test: blockdev write zeroes read split partial ...passed
00:34:44.453 Test: blockdev reset ...passed
00:34:44.453 Test: blockdev write read 8 blocks ...passed
00:34:44.453 Test: blockdev write read size > 128k ...passed
00:34:44.453 Test: blockdev write read invalid size ...passed
00:34:44.453 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:34:44.453 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:34:44.453 Test: blockdev write read max offset ...passed
00:34:44.453 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:34:44.453 Test: blockdev writev readv 8 blocks ...passed
00:34:44.453 Test: blockdev writev readv 30 x 1block ...passed
00:34:44.453 Test: blockdev writev readv block ...passed
00:34:44.453 Test: blockdev writev readv size > 128k ...passed
00:34:44.453 Test: blockdev writev readv size > 128k in two iovs ...passed
00:34:44.453 Test: blockdev comparev and writev ...passed
00:34:44.453 Test: blockdev nvme passthru rw ...passed
00:34:44.453 Test: blockdev nvme passthru vendor specific ...passed
00:34:44.453 Test: blockdev nvme admin passthru ...passed
00:34:44.453 Test: blockdev copy ...passed
00:34:44.453
00:34:44.453 Run Summary: Type Total Ran Passed Failed Inactive
00:34:44.453 suites 2 2 n/a 0 0
00:34:44.453 tests 46 46 46 0 0
00:34:44.453 asserts 260 260 260 0 n/a
00:34:44.453
00:34:44.453 Elapsed time = 0.429 seconds
00:34:44.453 0
00:34:44.453 08:47:56 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 1671924
00:34:44.453 08:47:56 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 1671924 ']'
00:34:44.453 08:47:56 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 1671924
00:34:44.453 08:47:56 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # uname
00:34:44.454 08:47:56 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:34:44.454 08:47:56 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1671924
00:34:44.454 08:47:56 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:34:44.454 08:47:56 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:34:44.454 08:47:56 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1671924'
00:34:44.454 killing process with pid 1671924
00:34:44.454 08:47:56 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@967 -- # kill 1671924
00:34:44.454 08:47:56 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@972 -- # wait 1671924
00:34:45.831 08:47:58 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT
00:34:45.831
00:34:45.831 real 0m2.657s
00:34:45.831 user 0m6.293s
00:34:45.831 sys 0m0.382s
00:34:45.831 08:47:58 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable
00:34:45.831 08:47:58 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x
00:34:45.831 ************************************
00:34:45.831 END TEST bdev_bounds
00:34:45.831 ************************************
00:34:45.831 08:47:58 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0
00:34:45.831 08:47:58 blockdev_crypto_sw -- bdev/blockdev.sh@761 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' ''
00:34:45.831 08:47:58 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']'
00:34:45.831 08:47:58 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable
00:34:45.831 08:47:58 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x
00:34:45.831 ************************************
00:34:45.831 START TEST bdev_nbd
00:34:45.831 ************************************
00:34:45.831 08:47:58 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' ''
00:34:45.831 08:47:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@299 -- # uname -s
00:34:45.831 08:47:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux ]]
00:34:45.831 08:47:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:34:45.831 08:47:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@302 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json
00:34:45.831 08:47:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('crypto_ram' 'crypto_ram3')
00:34:45.831 08:47:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all
00:34:45.831 08:47:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=2
00:34:45.831 08:47:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]]
00:34:45.831 08:47:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9')
00:34:45.831 08:47:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all
00:34:45.831 08:47:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=2
00:34:45.831 08:47:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:34:45.831 08:47:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list
00:34:45.831 08:47:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('crypto_ram' 'crypto_ram3')
00:34:45.831 08:47:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list
00:34:45.831 08:47:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=1672460
00:34:45.831 08:47:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT
00:34:45.831 08:47:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json ''
00:34:45.831 08:47:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 1672460 /var/tmp/spdk-nbd.sock
00:34:45.831 08:47:58 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 1672460 ']'
00:34:45.831 08:47:58 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock
00:34:45.831 08:47:58 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100
00:34:45.831 08:47:58 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...'
00:34:45.831 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...
00:34:45.831 08:47:58 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable
00:34:45.831 08:47:58 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x
00:34:46.089 [2024-07-23 08:47:58.358713] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization...
[2024-07-23 08:47:58.358800] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ]
00:34:46.089 [2024-07-23 08:47:58.484002] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:34:46.348 [2024-07-23 08:47:58.715097] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:34:46.916 [2024-07-23 08:47:59.179783] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw"
00:34:46.916 [2024-07-23 08:47:59.179852] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:34:46.916 [2024-07-23 08:47:59.179865] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:46.916 [2024-07-23 08:47:59.187802] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2"
00:34:46.916 [2024-07-23 08:47:59.187835] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:34:46.916 [2024-07-23 08:47:59.187846] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:46.916 [2024-07-23 08:47:59.195819] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3"
00:34:46.916 [2024-07-23 08:47:59.195846] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2
00:34:46.916 [2024-07-23 08:47:59.195856] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:46.916 08:47:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:34:46.916 08:47:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@862 -- # return 0
00:34:46.916 08:47:59 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3'
00:34:46.916 08:47:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:34:46.916 08:47:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram3')
00:34:46.916 08:47:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list
00:34:46.916 08:47:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3'
00:34:46.916 08:47:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:34:46.916 08:47:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram3')
00:34:46.916 08:47:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list
00:34:46.916 08:47:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i
00:34:46.916 08:47:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device
00:34:46.916 08:47:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 ))
00:34:46.916 08:47:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 ))
00:34:46.916 08:47:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram
00:34:47.174 08:47:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0
00:34:47.174 08:47:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0
00:34:47.174 08:47:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0
00:34:47.174 08:47:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0
00:34:47.174 08:47:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:34:47.174 08:47:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:34:47.174 08:47:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:34:47.174 08:47:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions
00:34:47.174 08:47:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:34:47.174 08:47:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:34:47.174 08:47:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:34:47.174 08:47:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:34:47.174 1+0 records in
00:34:47.174 1+0 records out
00:34:47.174 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000251062 s, 16.3 MB/s
00:34:47.174 08:47:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:34:47.174 08:47:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:34:47.174 08:47:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:34:47.174 08:47:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:34:47.174 08:47:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:34:47.174 08:47:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ ))
00:34:47.174 08:47:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 ))
00:34:47.174 08:47:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3
00:34:47.432 08:47:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1
00:34:47.432 08:47:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1
00:34:47.432 08:47:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1
00:34:47.432 08:47:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1
00:34:47.432 08:47:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:34:47.432 08:47:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:34:47.432 08:47:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:34:47.432 08:47:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions
00:34:47.432 08:47:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:34:47.432 08:47:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:34:47.432 08:47:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:34:47.432 08:47:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:34:47.432 1+0 records in
00:34:47.432 1+0 records out
00:34:47.432 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000277299 s, 14.8 MB/s
00:34:47.432 08:47:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:34:47.432 08:47:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:34:47.432 08:47:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:34:47.432 08:47:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:34:47.432 08:47:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:34:47.432 08:47:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ ))
00:34:47.432 08:47:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 ))
00:34:47.432 08:47:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:34:47.691 08:47:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[
00:34:47.691 {
00:34:47.691 "nbd_device": "/dev/nbd0",
00:34:47.691 "bdev_name": "crypto_ram"
00:34:47.691 },
00:34:47.691 {
00:34:47.691 "nbd_device": "/dev/nbd1",
00:34:47.691 "bdev_name": "crypto_ram3"
00:34:47.691 }
00:34:47.691 ]'
00:34:47.691 08:47:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device'))
00:34:47.691 08:47:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[
00:34:47.691 {
00:34:47.691 "nbd_device": "/dev/nbd0",
00:34:47.691 "bdev_name": "crypto_ram"
00:34:47.691 },
00:34:47.691 {
00:34:47.691 "nbd_device": "/dev/nbd1",
00:34:47.691 "bdev_name": "crypto_ram3"
00:34:47.691 }
00:34:47.691 ]'
00:34:47.691 08:47:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device'
00:34:47.691 08:47:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1'
00:34:47.691 08:47:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:34:47.691 08:47:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:34:47.691 08:47:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list
00:34:47.691 08:47:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i
00:34:47.691 08:47:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:34:47.691 08:47:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
00:34:47.691 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:34:47.691 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:34:47.691 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:34:47.691 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:34:47.691 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:34:47.691 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:34:47.691 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:34:47.691 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:34:47.691 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:34:47.691 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1
00:34:47.949 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1
00:34:47.949 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1
00:34:47.949 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1
00:34:47.949 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:34:47.949 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:34:47.949 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions
00:34:47.949 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:34:47.949 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:34:47.949 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:34:47.949 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:34:47.949 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:34:48.207 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]'
00:34:48.207 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]'
00:34:48.207 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:34:48.207 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name=
00:34:48.207 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo ''
00:34:48.207 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:34:48.207 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true
00:34:48.207 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0
00:34:48.207 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0
00:34:48.207 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0
00:34:48.207 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']'
00:34:48.207 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0
00:34:48.207 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1'
00:34:48.207 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:34:48.207 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram3')
00:34:48.207 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list
00:34:48.207 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:34:48.207 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list
00:34:48.207 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1'
00:34:48.207 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:34:48.207 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram3')
00:34:48.207 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list
00:34:48.207 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:34:48.207 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list
00:34:48.207 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i
00:34:48.207 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:34:48.207 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:34:48.207 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0
/dev/nbd0
00:34:48.466 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0
00:34:48.466 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0
00:34:48.466 08:48:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0
00:34:48.466 08:48:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:34:48.466 08:48:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:34:48.466 08:48:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:34:48.466 08:48:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions
00:34:48.466 08:48:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:34:48.466 08:48:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:34:48.466 08:48:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:34:48.466 08:48:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:34:48.466 1+0 records in
00:34:48.466 1+0 records out
00:34:48.466 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000211424 s, 19.4 MB/s
00:34:48.466 08:48:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:34:48.466 08:48:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:34:48.466 08:48:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:34:48.466 08:48:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:34:48.466 08:48:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:34:48.466 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:34:48.466 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:34:48.466 08:48:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd1
/dev/nbd1
00:34:48.725 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1
00:34:48.725 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1
00:34:48.725 08:48:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1
00:34:48.725 08:48:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:34:48.725 08:48:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:34:48.725 08:48:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:34:48.725 08:48:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions
00:34:48.725 08:48:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:34:48.725 08:48:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:34:48.725 08:48:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:34:48.725 08:48:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:34:48.725 1+0 records in
00:34:48.725 1+0 records out
00:34:48.725 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00020243 s, 20.2 MB/s
00:34:48.725 08:48:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:34:48.725 08:48:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:34:48.725 08:48:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:34:48.725 08:48:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:34:48.725 08:48:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:34:48.725 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:34:48.725 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:34:48.725 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:34:48.725 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:34:48.725 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:34:48.725 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[
00:34:48.725 {
00:34:48.725 "nbd_device": "/dev/nbd0",
00:34:48.725 "bdev_name": "crypto_ram"
00:34:48.725 },
00:34:48.725 {
00:34:48.725 "nbd_device": "/dev/nbd1",
00:34:48.725 "bdev_name": "crypto_ram3"
00:34:48.725 }
00:34:48.725 ]'
00:34:48.725 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[
00:34:48.725 {
00:34:48.725 "nbd_device": "/dev/nbd0",
00:34:48.725 "bdev_name": "crypto_ram"
00:34:48.725 },
00:34:48.725 {
00:34:48.725 "nbd_device": "/dev/nbd1",
00:34:48.725 "bdev_name": "crypto_ram3"
00:34:48.725 }
00:34:48.725 ]'
08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:34:48.983 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0
00:34:48.983 /dev/nbd1'
00:34:48.983 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0
00:34:48.983 /dev/nbd1'
00:34:48.983 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:34:48.983 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=2
00:34:48.983 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 2
00:34:48.983 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=2
00:34:48.983 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']'
00:34:48.983 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write
00:34:48.983 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:34:48.983 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list
00:34:48.983 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write
00:34:48.983 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest
00:34:48.983 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']'
00:34:48.983 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256
00:34:48.983 256+0 records in
00:34:48.983 256+0 records out
00:34:48.983 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0103621 s, 101 MB/s
00:34:48.984 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:34:48.984 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct
00:34:48.984 256+0 records in
00:34:48.984 256+0 records out
00:34:48.984 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0166771 s, 62.9 MB/s
00:34:48.984 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:34:48.984 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct
00:34:48.984 256+0 records in
00:34:48.984 256+0 records out
00:34:48.984 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0251146 s, 41.8 MB/s
00:34:48.984 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify
00:34:48.984 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:34:48.984 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list
00:34:48.984 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify
00:34:48.984 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest
00:34:48.984 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']'
00:34:48.984 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']'
00:34:48.984 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:34:48.984 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0
00:34:48.984 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:34:48.984 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1
00:34:48.984 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest
00:34:48.984 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1'
00:34:48.984 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:34:48.984 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:34:48.984 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list
00:34:48.984 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i
00:34:48.984 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:34:48.984 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
00:34:49.242 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:34:49.242 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:34:49.242 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:34:49.242 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:34:49.242 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:34:49.242 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:34:49.242 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:34:49.242 08:48:01
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:49.242 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:49.242 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:34:49.242 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:34:49.242 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:34:49.242 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:34:49.242 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:49.242 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:49.242 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:34:49.242 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:49.242 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:49.242 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:34:49.242 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:49.242 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:34:49.500 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:34:49.500 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:34:49.500 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:34:49.500 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:34:49.500 08:48:01 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:34:49.500 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:34:49.500 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:34:49.500 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:34:49.500 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:34:49.500 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:34:49.500 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:34:49.500 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:34:49.500 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:34:49.500 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:49.500 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:34:49.500 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:34:49.500 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:34:49.501 08:48:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:34:49.760 malloc_lvol_verify 00:34:49.760 08:48:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:34:50.018 bfd46c04-90f9-4aae-b6d5-3bd768618d88 00:34:50.018 08:48:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 
00:34:50.018 17e0aec4-79e2-4785-92ba-47e93fee2a0e 00:34:50.018 08:48:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:34:50.276 /dev/nbd0 00:34:50.276 08:48:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:34:50.276 mke2fs 1.46.5 (30-Dec-2021) 00:34:50.276 Discarding device blocks: 0/4096 done 00:34:50.276 Creating filesystem with 4096 1k blocks and 1024 inodes 00:34:50.276 00:34:50.276 Allocating group tables: 0/1 done 00:34:50.276 Writing inode tables: 0/1 done 00:34:50.276 Creating journal (1024 blocks): done 00:34:50.276 Writing superblocks and filesystem accounting information: 0/1 done 00:34:50.276 00:34:50.276 08:48:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:34:50.276 08:48:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:34:50.276 08:48:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:50.276 08:48:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:34:50.276 08:48:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:34:50.276 08:48:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:34:50.276 08:48:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:50.276 08:48:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:34:50.535 08:48:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:34:50.535 08:48:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:34:50.535 08:48:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local 
nbd_name=nbd0 00:34:50.535 08:48:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:50.535 08:48:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:50.535 08:48:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:34:50.535 08:48:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:50.535 08:48:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:50.535 08:48:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:34:50.535 08:48:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:34:50.535 08:48:02 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 1672460 00:34:50.535 08:48:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 1672460 ']' 00:34:50.535 08:48:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 1672460 00:34:50.535 08:48:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:34:50.535 08:48:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:50.535 08:48:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1672460 00:34:50.535 08:48:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:50.535 08:48:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:50.535 08:48:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1672460' 00:34:50.535 killing process with pid 1672460 00:34:50.535 08:48:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@967 -- # kill 1672460 00:34:50.535 08:48:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@972 -- # wait 1672460 00:34:51.910 08:48:04 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - 
SIGINT SIGTERM EXIT 00:34:51.910 00:34:51.910 real 0m6.060s 00:34:51.910 user 0m7.998s 00:34:51.910 sys 0m1.570s 00:34:51.910 08:48:04 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:51.910 08:48:04 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:34:51.910 ************************************ 00:34:51.910 END TEST bdev_nbd 00:34:51.910 ************************************ 00:34:51.910 08:48:04 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:34:51.910 08:48:04 blockdev_crypto_sw -- bdev/blockdev.sh@762 -- # [[ y == y ]] 00:34:51.910 08:48:04 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # '[' crypto_sw = nvme ']' 00:34:51.910 08:48:04 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # '[' crypto_sw = gpt ']' 00:34:51.910 08:48:04 blockdev_crypto_sw -- bdev/blockdev.sh@767 -- # run_test bdev_fio fio_test_suite '' 00:34:51.910 08:48:04 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:34:51.910 08:48:04 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:51.910 08:48:04 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:51.910 ************************************ 00:34:51.910 START TEST bdev_fio 00:34:51.910 ************************************ 00:34:51.910 08:48:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:34:51.910 08:48:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@330 -- # local env_context 00:34:51.910 08:48:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@334 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:34:51.910 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:34:51.910 08:48:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@335 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:34:51.910 08:48:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@338 -- # 
echo '' 00:34:51.910 08:48:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@338 -- # sed s/--env-context=// 00:34:51.910 08:48:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@338 -- # env_context= 00:34:51.910 08:48:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:34:51.910 08:48:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:51.910 08:48:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:34:51.910 08:48:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:34:51.910 08:48:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:34:51.910 08:48:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:34:51.910 08:48:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:34:51.910 08:48:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:34:51.910 08:48:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:34:51.911 08:48:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:51.911 08:48:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:34:51.911 08:48:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:34:51.911 08:48:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:34:51.911 08:48:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:34:51.911 08:48:04 blockdev_crypto_sw.bdev_fio -- 
common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:34:52.169 08:48:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:34:52.169 08:48:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:34:52.169 08:48:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:34:52.169 08:48:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram]' 00:34:52.169 08:48:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram 00:34:52.169 08:48:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:34:52.169 08:48:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram3]' 00:34:52.169 08:48:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram3 00:34:52.169 08:48:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@346 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:34:52.169 08:48:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@348 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:52.169 08:48:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:34:52.169 08:48:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:52.169 08:48:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:34:52.169 
************************************ 00:34:52.169 START TEST bdev_fio_rw_verify 00:34:52.169 ************************************ 00:34:52.169 08:48:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:52.169 08:48:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:52.169 08:48:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:34:52.169 08:48:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:34:52.169 08:48:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:34:52.169 08:48:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:52.169 08:48:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:34:52.169 08:48:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:34:52.169 08:48:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # 
for sanitizer in "${sanitizers[@]}" 00:34:52.169 08:48:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:52.169 08:48:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:52.169 08:48:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:34:52.169 08:48:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib=/usr/lib64/libasan.so.8 00:34:52.169 08:48:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n /usr/lib64/libasan.so.8 ]] 00:34:52.169 08:48:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1347 -- # break 00:34:52.169 08:48:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD='/usr/lib64/libasan.so.8 /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:34:52.169 08:48:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:52.428 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:52.428 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:52.428 fio-3.35 00:34:52.428 Starting 2 threads 00:35:04.626 00:35:04.626 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=1673965: Tue Jul 23 08:48:15 2024 00:35:04.626 read: IOPS=29.4k, BW=115MiB/s 
(120MB/s)(1149MiB/10000msec) 00:35:04.626 slat (usec): min=10, max=137, avg=15.48, stdev= 3.20 00:35:04.626 clat (usec): min=5, max=568, avg=109.38, stdev=44.60 00:35:04.626 lat (usec): min=19, max=615, avg=124.86, stdev=45.85 00:35:04.626 clat percentiles (usec): 00:35:04.626 | 50.000th=[ 108], 99.000th=[ 215], 99.900th=[ 237], 99.990th=[ 273], 00:35:04.626 | 99.999th=[ 433] 00:35:04.626 write: IOPS=35.3k, BW=138MiB/s (144MB/s)(1306MiB/9475msec); 0 zone resets 00:35:04.626 slat (usec): min=10, max=231, avg=25.35, stdev= 4.22 00:35:04.626 clat (usec): min=18, max=890, avg=145.89, stdev=68.06 00:35:04.626 lat (usec): min=38, max=1052, avg=171.24, stdev=69.56 00:35:04.626 clat percentiles (usec): 00:35:04.626 | 50.000th=[ 141], 99.000th=[ 297], 99.900th=[ 343], 99.990th=[ 519], 00:35:04.626 | 99.999th=[ 635] 00:35:04.626 bw ( KiB/s): min=125352, max=140600, per=94.79%, avg=133750.79, stdev=2220.82, samples=38 00:35:04.626 iops : min=31338, max=35150, avg=33437.63, stdev=555.20, samples=38 00:35:04.626 lat (usec) : 10=0.01%, 20=0.01%, 50=7.87%, 100=28.41%, 250=59.22% 00:35:04.626 lat (usec) : 500=4.49%, 750=0.01%, 1000=0.01% 00:35:04.626 cpu : usr=99.33%, sys=0.31%, ctx=34, majf=0, minf=24987 00:35:04.626 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:35:04.626 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:35:04.626 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:35:04.626 issued rwts: total=294035,334238,0,0 short=0,0,0,0 dropped=0,0,0,0 00:35:04.627 latency : target=0, window=0, percentile=100.00%, depth=8 00:35:04.627 00:35:04.627 Run status group 0 (all jobs): 00:35:04.627 READ: bw=115MiB/s (120MB/s), 115MiB/s-115MiB/s (120MB/s-120MB/s), io=1149MiB (1204MB), run=10000-10000msec 00:35:04.627 WRITE: bw=138MiB/s (144MB/s), 138MiB/s-138MiB/s (144MB/s-144MB/s), io=1306MiB (1369MB), run=9475-9475msec 00:35:04.627 ----------------------------------------------------- 00:35:04.627 
Suppressions used: 00:35:04.627 count bytes template 00:35:04.627 2 23 /usr/src/fio/parse.c 00:35:04.627 674 64704 /usr/src/fio/iolog.c 00:35:04.627 1 8 libtcmalloc_minimal.so 00:35:04.627 1 904 libcrypto.so 00:35:04.627 ----------------------------------------------------- 00:35:04.627 00:35:04.886 00:35:04.886 real 0m12.691s 00:35:04.886 user 0m27.684s 00:35:04.886 sys 0m0.538s 00:35:04.886 08:48:17 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:04.886 08:48:17 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:35:04.886 ************************************ 00:35:04.886 END TEST bdev_fio_rw_verify 00:35:04.886 ************************************ 00:35:04.886 08:48:17 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:35:04.886 08:48:17 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@349 -- # rm -f 00:35:04.886 08:48:17 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:35:04.886 08:48:17 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@353 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:35:04.886 08:48:17 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:35:04.886 08:48:17 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:35:04.886 08:48:17 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:35:04.886 08:48:17 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:35:04.886 08:48:17 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:35:04.886 08:48:17 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:35:04.886 08:48:17 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:35:04.886 08:48:17 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:35:04.886 08:48:17 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:35:04.886 08:48:17 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:35:04.886 08:48:17 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:35:04.886 08:48:17 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:35:04.887 08:48:17 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:35:04.887 08:48:17 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:35:04.887 08:48:17 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "d31b73b2-faf9-5d2d-a0a0-cbb925367012"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "d31b73b2-faf9-5d2d-a0a0-cbb925367012",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "dc2b370a-d0fe-5131-aa98-57bc44b53054"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "dc2b370a-d0fe-5131-aa98-57bc44b53054",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:35:04.887 08:48:17 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # [[ -n crypto_ram 00:35:04.887 crypto_ram3 ]] 00:35:04.887 08:48:17 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:35:04.887 08:48:17 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "d31b73b2-faf9-5d2d-a0a0-cbb925367012"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "d31b73b2-faf9-5d2d-a0a0-cbb925367012",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' 
"r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "dc2b370a-d0fe-5131-aa98-57bc44b53054"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "dc2b370a-d0fe-5131-aa98-57bc44b53054",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:35:04.887 08:48:17 blockdev_crypto_sw.bdev_fio -- 
bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:35:04.887 08:48:17 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram]' 00:35:04.887 08:48:17 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram 00:35:04.887 08:48:17 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:35:04.887 08:48:17 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram3]' 00:35:04.887 08:48:17 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram3 00:35:04.887 08:48:17 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@366 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:04.887 08:48:17 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:35:04.887 08:48:17 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:04.887 08:48:17 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:35:04.887 ************************************ 00:35:04.887 START TEST bdev_fio_trim 00:35:04.887 ************************************ 00:35:04.887 08:48:17 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 
--aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:04.887 08:48:17 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:04.887 08:48:17 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:35:04.887 08:48:17 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:35:04.887 08:48:17 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:35:04.887 08:48:17 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:35:04.887 08:48:17 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:35:04.887 08:48:17 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:35:04.887 08:48:17 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:35:04.887 08:48:17 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:35:04.887 08:48:17 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:35:04.887 08:48:17 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:35:04.887 08:48:17 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1345 -- # asan_lib=/usr/lib64/libasan.so.8 00:35:04.887 08:48:17 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n /usr/lib64/libasan.so.8 ]] 00:35:04.887 08:48:17 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1347 -- # break 00:35:04.887 08:48:17 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD='/usr/lib64/libasan.so.8 /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:35:04.887 08:48:17 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:05.555 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:35:05.555 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:35:05.555 fio-3.35 00:35:05.555 Starting 2 threads 00:35:17.754 00:35:17.754 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=1676356: Tue Jul 23 08:48:28 2024 00:35:17.754 write: IOPS=50.9k, BW=199MiB/s (208MB/s)(1987MiB/10001msec); 0 zone resets 00:35:17.754 slat (usec): min=10, max=169, avg=17.20, stdev= 3.53 00:35:17.754 clat (usec): min=28, max=722, avg=128.20, stdev=72.53 00:35:17.754 lat (usec): min=38, max=789, avg=145.40, stdev=75.14 00:35:17.754 clat percentiles (usec): 00:35:17.754 | 50.000th=[ 103], 99.000th=[ 269], 99.900th=[ 289], 99.990th=[ 314], 00:35:17.754 | 99.999th=[ 644] 00:35:17.754 bw ( KiB/s): min=193296, max=206856, per=100.00%, avg=203500.63, stdev=1582.02, samples=38 00:35:17.754 iops : min=48324, max=51714, avg=50875.16, 
stdev=395.51, samples=38 00:35:17.754 trim: IOPS=50.9k, BW=199MiB/s (208MB/s)(1987MiB/10001msec); 0 zone resets 00:35:17.754 slat (usec): min=4, max=144, avg= 8.22, stdev= 1.95 00:35:17.754 clat (usec): min=31, max=361, avg=84.96, stdev=26.66 00:35:17.754 lat (usec): min=37, max=375, avg=93.18, stdev=26.83 00:35:17.754 clat percentiles (usec): 00:35:17.754 | 50.000th=[ 85], 99.000th=[ 141], 99.900th=[ 153], 99.990th=[ 188], 00:35:17.754 | 99.999th=[ 262] 00:35:17.754 bw ( KiB/s): min=193320, max=206856, per=100.00%, avg=203501.89, stdev=1579.95, samples=38 00:35:17.754 iops : min=48330, max=51714, avg=50875.47, stdev=394.99, samples=38 00:35:17.754 lat (usec) : 50=12.70%, 100=45.82%, 250=39.38%, 500=2.09%, 750=0.01% 00:35:17.754 cpu : usr=99.64%, sys=0.01%, ctx=33, majf=0, minf=2110 00:35:17.754 IO depths : 1=7.4%, 2=17.4%, 4=60.1%, 8=15.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:35:17.754 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:35:17.754 complete : 0=0.0%, 4=86.9%, 8=13.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:35:17.754 issued rwts: total=0,508726,508728,0 short=0,0,0,0 dropped=0,0,0,0 00:35:17.754 latency : target=0, window=0, percentile=100.00%, depth=8 00:35:17.754 00:35:17.754 Run status group 0 (all jobs): 00:35:17.754 WRITE: bw=199MiB/s (208MB/s), 199MiB/s-199MiB/s (208MB/s-208MB/s), io=1987MiB (2084MB), run=10001-10001msec 00:35:17.754 TRIM: bw=199MiB/s (208MB/s), 199MiB/s-199MiB/s (208MB/s-208MB/s), io=1987MiB (2084MB), run=10001-10001msec 00:35:17.754 ----------------------------------------------------- 00:35:17.754 Suppressions used: 00:35:17.754 count bytes template 00:35:17.754 2 23 /usr/src/fio/parse.c 00:35:17.754 1 8 libtcmalloc_minimal.so 00:35:17.754 1 904 libcrypto.so 00:35:17.754 ----------------------------------------------------- 00:35:17.754 00:35:17.754 00:35:17.754 real 0m12.658s 00:35:17.754 user 0m27.373s 00:35:17.754 sys 0m0.495s 00:35:17.754 08:48:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:35:17.754 08:48:30 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:35:17.754 ************************************ 00:35:17.754 END TEST bdev_fio_trim 00:35:17.754 ************************************ 00:35:17.754 08:48:30 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:35:17.754 08:48:30 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@367 -- # rm -f 00:35:17.754 08:48:30 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:35:17.754 08:48:30 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@369 -- # popd 00:35:17.754 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:35:17.754 08:48:30 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:35:17.754 00:35:17.754 real 0m25.660s 00:35:17.754 user 0m55.217s 00:35:17.754 sys 0m1.201s 00:35:17.754 08:48:30 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:17.754 08:48:30 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:35:17.754 ************************************ 00:35:17.754 END TEST bdev_fio 00:35:17.754 ************************************ 00:35:17.754 08:48:30 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:35:17.754 08:48:30 blockdev_crypto_sw -- bdev/blockdev.sh@774 -- # trap cleanup SIGINT SIGTERM EXIT 00:35:17.755 08:48:30 blockdev_crypto_sw -- bdev/blockdev.sh@776 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:35:17.755 08:48:30 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:35:17.755 08:48:30 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:17.755 08:48:30 
blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:35:17.755 ************************************ 00:35:17.755 START TEST bdev_verify 00:35:17.755 ************************************ 00:35:17.755 08:48:30 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:35:17.755 [2024-07-23 08:48:30.211867] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:35:17.755 [2024-07-23 08:48:30.211961] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1678400 ] 00:35:18.013 [2024-07-23 08:48:30.334318] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:35:18.271 [2024-07-23 08:48:30.562757] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:18.271 [2024-07-23 08:48:30.562766] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:35:18.838 [2024-07-23 08:48:31.075993] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:35:18.838 [2024-07-23 08:48:31.076056] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:35:18.838 [2024-07-23 08:48:31.076069] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:18.838 [2024-07-23 08:48:31.084018] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:35:18.838 [2024-07-23 08:48:31.084050] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:35:18.838 [2024-07-23 08:48:31.084060] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev 
arrival 00:35:18.838 [2024-07-23 08:48:31.092023] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:35:18.838 [2024-07-23 08:48:31.092050] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:35:18.838 [2024-07-23 08:48:31.092058] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:18.838 Running I/O for 5 seconds... 00:35:24.107 00:35:24.107 Latency(us) 00:35:24.107 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:24.107 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:35:24.107 Verification LBA range: start 0x0 length 0x800 00:35:24.107 crypto_ram : 5.01 8176.51 31.94 0.00 0.00 15600.75 1388.74 19348.72 00:35:24.107 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:35:24.107 Verification LBA range: start 0x800 length 0x800 00:35:24.107 crypto_ram : 5.01 8200.41 32.03 0.00 0.00 15556.97 1154.68 19348.72 00:35:24.107 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:35:24.107 Verification LBA range: start 0x0 length 0x800 00:35:24.107 crypto_ram3 : 5.02 4105.05 16.04 0.00 0.00 31042.14 1497.97 23218.47 00:35:24.107 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:35:24.107 Verification LBA range: start 0x800 length 0x800 00:35:24.108 crypto_ram3 : 5.02 4108.59 16.05 0.00 0.00 31011.61 1451.15 23218.47 00:35:24.108 =================================================================================================================== 00:35:24.108 Total : 24590.56 96.06 0.00 0.00 20744.22 1154.68 23218.47 00:35:25.483 00:35:25.483 real 0m7.470s 00:35:25.483 user 0m13.653s 00:35:25.483 sys 0m0.273s 00:35:25.483 08:48:37 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:25.483 08:48:37 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@10 -- # set 
+x 00:35:25.483 ************************************ 00:35:25.483 END TEST bdev_verify 00:35:25.483 ************************************ 00:35:25.483 08:48:37 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:35:25.483 08:48:37 blockdev_crypto_sw -- bdev/blockdev.sh@777 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:35:25.483 08:48:37 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:35:25.483 08:48:37 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:25.483 08:48:37 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:35:25.483 ************************************ 00:35:25.483 START TEST bdev_verify_big_io 00:35:25.483 ************************************ 00:35:25.483 08:48:37 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:35:25.483 [2024-07-23 08:48:37.753945] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:35:25.483 [2024-07-23 08:48:37.754040] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1679812 ] 00:35:25.483 [2024-07-23 08:48:37.894018] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:35:25.742 [2024-07-23 08:48:38.104685] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:25.742 [2024-07-23 08:48:38.104695] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:35:26.309 [2024-07-23 08:48:38.537931] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:35:26.309 [2024-07-23 08:48:38.538000] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:35:26.309 [2024-07-23 08:48:38.538013] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:26.309 [2024-07-23 08:48:38.545952] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:35:26.309 [2024-07-23 08:48:38.545980] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:35:26.309 [2024-07-23 08:48:38.545989] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:26.309 [2024-07-23 08:48:38.553957] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:35:26.309 [2024-07-23 08:48:38.553982] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:35:26.309 [2024-07-23 08:48:38.553991] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:26.309 Running I/O for 5 seconds... 
00:35:31.576 00:35:31.576 Latency(us) 00:35:31.576 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:31.576 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:35:31.576 Verification LBA range: start 0x0 length 0x80 00:35:31.576 crypto_ram : 5.09 754.02 47.13 0.00 0.00 166832.84 4993.22 235679.94 00:35:31.576 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:35:31.576 Verification LBA range: start 0x80 length 0x80 00:35:31.576 crypto_ram : 5.05 735.04 45.94 0.00 0.00 171070.19 5336.50 236678.58 00:35:31.576 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:35:31.576 Verification LBA range: start 0x0 length 0x80 00:35:31.576 crypto_ram3 : 5.17 396.05 24.75 0.00 0.00 309717.30 4618.73 239674.51 00:35:31.576 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:35:31.576 Verification LBA range: start 0x80 length 0x80 00:35:31.576 crypto_ram3 : 5.15 397.85 24.87 0.00 0.00 308339.69 4525.10 239674.51 00:35:31.576 =================================================================================================================== 00:35:31.576 Total : 2282.96 142.69 0.00 0.00 218186.07 4525.10 239674.51 00:35:32.953 00:35:32.953 real 0m7.632s 00:35:32.953 user 0m14.056s 00:35:32.953 sys 0m0.281s 00:35:32.953 08:48:45 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:32.953 08:48:45 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:35:32.953 ************************************ 00:35:32.953 END TEST bdev_verify_big_io 00:35:32.953 ************************************ 00:35:32.953 08:48:45 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:35:32.953 08:48:45 blockdev_crypto_sw -- bdev/blockdev.sh@778 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:35:32.953 08:48:45 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:35:32.953 08:48:45 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:32.953 08:48:45 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:35:32.953 ************************************ 00:35:32.953 START TEST bdev_write_zeroes 00:35:32.953 ************************************ 00:35:32.953 08:48:45 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:35:32.953 [2024-07-23 08:48:45.453831] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:35:32.953 [2024-07-23 08:48:45.453914] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1681179 ] 00:35:33.212 [2024-07-23 08:48:45.572006] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:33.471 [2024-07-23 08:48:45.782943] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:33.730 [2024-07-23 08:48:46.240673] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:35:33.730 [2024-07-23 08:48:46.240741] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:35:33.730 [2024-07-23 08:48:46.240755] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:33.730 [2024-07-23 08:48:46.248690] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:35:33.730 [2024-07-23 08:48:46.248722] 
bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:35:33.730 [2024-07-23 08:48:46.248731] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:33.996 [2024-07-23 08:48:46.256700] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:35:33.996 [2024-07-23 08:48:46.256728] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:35:33.996 [2024-07-23 08:48:46.256737] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:33.996 Running I/O for 1 seconds... 00:35:34.937 00:35:34.937 Latency(us) 00:35:34.937 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:34.937 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:35:34.937 crypto_ram : 1.01 37898.12 148.04 0.00 0.00 3371.36 924.53 4805.97 00:35:34.937 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:35:34.937 crypto_ram3 : 1.01 18921.57 73.91 0.00 0.00 6728.16 4275.44 7177.75 00:35:34.937 =================================================================================================================== 00:35:34.937 Total : 56819.69 221.95 0.00 0.00 4490.29 924.53 7177.75 00:35:36.314 00:35:36.314 real 0m3.326s 00:35:36.314 user 0m3.006s 00:35:36.314 sys 0m0.278s 00:35:36.314 08:48:48 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:36.314 08:48:48 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:35:36.314 ************************************ 00:35:36.314 END TEST bdev_write_zeroes 00:35:36.314 ************************************ 00:35:36.314 08:48:48 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:35:36.314 08:48:48 blockdev_crypto_sw -- bdev/blockdev.sh@781 -- # run_test bdev_json_nonenclosed 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:35:36.314 08:48:48 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:35:36.314 08:48:48 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:36.314 08:48:48 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:35:36.314 ************************************ 00:35:36.314 START TEST bdev_json_nonenclosed 00:35:36.314 ************************************ 00:35:36.314 08:48:48 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:35:36.573 [2024-07-23 08:48:48.852281] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:35:36.573 [2024-07-23 08:48:48.852394] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1681769 ] 00:35:36.573 [2024-07-23 08:48:48.978954] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:36.831 [2024-07-23 08:48:49.189306] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:36.831 [2024-07-23 08:48:49.189383] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
00:35:36.831 [2024-07-23 08:48:49.189400] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:35:36.831 [2024-07-23 08:48:49.189410] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:35:37.398 00:35:37.398 real 0m0.841s 00:35:37.398 user 0m0.662s 00:35:37.398 sys 0m0.175s 00:35:37.398 08:48:49 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:35:37.398 08:48:49 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:37.398 08:48:49 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:35:37.398 ************************************ 00:35:37.398 END TEST bdev_json_nonenclosed 00:35:37.398 ************************************ 00:35:37.398 08:48:49 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234 00:35:37.398 08:48:49 blockdev_crypto_sw -- bdev/blockdev.sh@781 -- # true 00:35:37.398 08:48:49 blockdev_crypto_sw -- bdev/blockdev.sh@784 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:35:37.398 08:48:49 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:35:37.398 08:48:49 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:37.398 08:48:49 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:35:37.398 ************************************ 00:35:37.398 START TEST bdev_json_nonarray 00:35:37.398 ************************************ 00:35:37.398 08:48:49 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:35:37.398 [2024-07-23 
08:48:49.741550] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:35:37.399 [2024-07-23 08:48:49.741637] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1681980 ] 00:35:37.399 [2024-07-23 08:48:49.862168] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:37.657 [2024-07-23 08:48:50.078476] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:37.657 [2024-07-23 08:48:50.078559] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:35:37.657 [2024-07-23 08:48:50.078576] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:35:37.657 [2024-07-23 08:48:50.078586] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:35:38.224 00:35:38.224 real 0m0.840s 00:35:38.224 user 0m0.677s 00:35:38.224 sys 0m0.159s 00:35:38.224 08:48:50 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:35:38.224 08:48:50 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:38.224 08:48:50 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:35:38.224 ************************************ 00:35:38.224 END TEST bdev_json_nonarray 00:35:38.224 ************************************ 00:35:38.224 08:48:50 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234 00:35:38.224 08:48:50 blockdev_crypto_sw -- bdev/blockdev.sh@784 -- # true 00:35:38.224 08:48:50 blockdev_crypto_sw -- bdev/blockdev.sh@786 -- # [[ crypto_sw == bdev ]] 00:35:38.224 08:48:50 blockdev_crypto_sw -- bdev/blockdev.sh@793 -- # [[ crypto_sw == gpt ]] 00:35:38.224 08:48:50 blockdev_crypto_sw -- bdev/blockdev.sh@797 -- # [[ crypto_sw == crypto_sw 
]] 00:35:38.224 08:48:50 blockdev_crypto_sw -- bdev/blockdev.sh@798 -- # run_test bdev_crypto_enomem bdev_crypto_enomem 00:35:38.224 08:48:50 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:35:38.224 08:48:50 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:38.224 08:48:50 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:35:38.224 ************************************ 00:35:38.224 START TEST bdev_crypto_enomem 00:35:38.224 ************************************ 00:35:38.224 08:48:50 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1123 -- # bdev_crypto_enomem 00:35:38.224 08:48:50 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@634 -- # local base_dev=base0 00:35:38.224 08:48:50 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@635 -- # local test_dev=crypt0 00:35:38.224 08:48:50 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@636 -- # local err_dev=EE_base0 00:35:38.224 08:48:50 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@637 -- # local qd=32 00:35:38.224 08:48:50 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@640 -- # ERR_PID=1682228 00:35:38.224 08:48:50 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@641 -- # trap 'cleanup; killprocess $ERR_PID; exit 1' SIGINT SIGTERM EXIT 00:35:38.224 08:48:50 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 32 -o 4096 -w randwrite -t 5 -f '' 00:35:38.224 08:48:50 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@642 -- # waitforlisten 1682228 00:35:38.224 08:48:50 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@829 -- # '[' -z 1682228 ']' 00:35:38.224 08:48:50 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:38.224 08:48:50 blockdev_crypto_sw.bdev_crypto_enomem -- 
common/autotest_common.sh@834 -- # local max_retries=100 00:35:38.224 08:48:50 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:38.224 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:38.224 08:48:50 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:38.224 08:48:50 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:35:38.224 [2024-07-23 08:48:50.671863] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:35:38.224 [2024-07-23 08:48:50.671955] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1682228 ] 00:35:38.482 [2024-07-23 08:48:50.794954] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:38.741 [2024-07-23 08:48:51.022326] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:35:38.999 08:48:51 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:38.999 08:48:51 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@862 -- # return 0 00:35:38.999 08:48:51 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@644 -- # rpc_cmd 00:35:38.999 08:48:51 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:38.999 08:48:51 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:35:38.999 true 00:35:38.999 base0 00:35:38.999 true 00:35:38.999 [2024-07-23 08:48:51.488534] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:35:38.999 crypt0 00:35:38.999 08:48:51 blockdev_crypto_sw.bdev_crypto_enomem -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:39.000 08:48:51 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@651 -- # waitforbdev crypt0 00:35:39.000 08:48:51 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@897 -- # local bdev_name=crypt0 00:35:39.000 08:48:51 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:35:39.000 08:48:51 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@899 -- # local i 00:35:39.000 08:48:51 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:35:39.000 08:48:51 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:35:39.000 08:48:51 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:35:39.000 08:48:51 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:39.000 08:48:51 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:35:39.000 08:48:51 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:39.000 08:48:51 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b crypt0 -t 2000 00:35:39.000 08:48:51 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:39.000 08:48:51 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:35:39.000 [ 00:35:39.000 { 00:35:39.000 "name": "crypt0", 00:35:39.000 "aliases": [ 00:35:39.000 "3f4aafa9-42fd-530f-bd76-2aaee5e0715d" 00:35:39.000 ], 00:35:39.000 "product_name": "crypto", 00:35:39.000 "block_size": 512, 00:35:39.000 "num_blocks": 2097152, 00:35:39.000 "uuid": "3f4aafa9-42fd-530f-bd76-2aaee5e0715d", 00:35:39.000 "assigned_rate_limits": { 00:35:39.000 "rw_ios_per_sec": 0, 00:35:39.000 "rw_mbytes_per_sec": 0, 00:35:39.000 
"r_mbytes_per_sec": 0, 00:35:39.000 "w_mbytes_per_sec": 0 00:35:39.000 }, 00:35:39.000 "claimed": false, 00:35:39.000 "zoned": false, 00:35:39.000 "supported_io_types": { 00:35:39.000 "read": true, 00:35:39.000 "write": true, 00:35:39.000 "unmap": false, 00:35:39.000 "flush": false, 00:35:39.000 "reset": true, 00:35:39.000 "nvme_admin": false, 00:35:39.000 "nvme_io": false, 00:35:39.000 "nvme_io_md": false, 00:35:39.000 "write_zeroes": true, 00:35:39.000 "zcopy": false, 00:35:39.000 "get_zone_info": false, 00:35:39.000 "zone_management": false, 00:35:39.000 "zone_append": false, 00:35:39.000 "compare": false, 00:35:39.000 "compare_and_write": false, 00:35:39.000 "abort": false, 00:35:39.000 "seek_hole": false, 00:35:39.000 "seek_data": false, 00:35:39.000 "copy": false, 00:35:39.000 "nvme_iov_md": false 00:35:39.000 }, 00:35:39.000 "memory_domains": [ 00:35:39.000 { 00:35:39.000 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:39.258 "dma_device_type": 2 00:35:39.258 } 00:35:39.258 ], 00:35:39.258 "driver_specific": { 00:35:39.258 "crypto": { 00:35:39.258 "base_bdev_name": "EE_base0", 00:35:39.258 "name": "crypt0", 00:35:39.258 "key_name": "test_dek_sw" 00:35:39.258 } 00:35:39.258 } 00:35:39.258 } 00:35:39.258 ] 00:35:39.258 08:48:51 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:39.258 08:48:51 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@905 -- # return 0 00:35:39.258 08:48:51 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@654 -- # rpcpid=1682264 00:35:39.258 08:48:51 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@656 -- # sleep 1 00:35:39.258 08:48:51 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:35:39.258 Running I/O for 5 seconds... 
00:35:40.194 08:48:52 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@657 -- # rpc_cmd bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem 00:35:40.194 08:48:52 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:40.194 08:48:52 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:35:40.194 08:48:52 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:40.194 08:48:52 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@659 -- # wait 1682264 00:35:44.381 00:35:44.381 Latency(us) 00:35:44.381 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:44.381 Job: crypt0 (Core Mask 0x2, workload: randwrite, depth: 32, IO size: 4096) 00:35:44.381 crypt0 : 5.00 49433.83 193.10 0.00 0.00 644.57 286.72 924.53 00:35:44.381 =================================================================================================================== 00:35:44.381 Total : 49433.83 193.10 0.00 0.00 644.57 286.72 924.53 00:35:44.381 0 00:35:44.381 08:48:56 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@661 -- # rpc_cmd bdev_crypto_delete crypt0 00:35:44.381 08:48:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:44.381 08:48:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:35:44.382 08:48:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:44.382 08:48:56 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@663 -- # killprocess 1682228 00:35:44.382 08:48:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@948 -- # '[' -z 1682228 ']' 00:35:44.382 08:48:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@952 -- # kill -0 1682228 00:35:44.382 08:48:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # uname 00:35:44.382 08:48:56 
blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:44.382 08:48:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1682228 00:35:44.382 08:48:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:35:44.382 08:48:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:35:44.382 08:48:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1682228' 00:35:44.382 killing process with pid 1682228 00:35:44.382 08:48:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@967 -- # kill 1682228 00:35:44.382 Received shutdown signal, test time was about 5.000000 seconds 00:35:44.382 00:35:44.382 Latency(us) 00:35:44.382 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:44.382 =================================================================================================================== 00:35:44.382 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:35:44.382 08:48:56 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@972 -- # wait 1682228 00:35:45.761 08:48:57 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@664 -- # trap - SIGINT SIGTERM EXIT 00:35:45.761 00:35:45.761 real 0m7.388s 00:35:45.761 user 0m7.431s 00:35:45.761 sys 0m0.379s 00:35:45.761 08:48:57 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:45.761 08:48:57 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:35:45.761 ************************************ 00:35:45.761 END TEST bdev_crypto_enomem 00:35:45.761 ************************************ 00:35:45.761 08:48:58 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:35:45.761 08:48:58 blockdev_crypto_sw -- bdev/blockdev.sh@809 -- # trap - 
SIGINT SIGTERM EXIT 00:35:45.761 08:48:58 blockdev_crypto_sw -- bdev/blockdev.sh@810 -- # cleanup 00:35:45.761 08:48:58 blockdev_crypto_sw -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:35:45.761 08:48:58 blockdev_crypto_sw -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:35:45.761 08:48:58 blockdev_crypto_sw -- bdev/blockdev.sh@26 -- # [[ crypto_sw == rbd ]] 00:35:45.761 08:48:58 blockdev_crypto_sw -- bdev/blockdev.sh@30 -- # [[ crypto_sw == daos ]] 00:35:45.761 08:48:58 blockdev_crypto_sw -- bdev/blockdev.sh@34 -- # [[ crypto_sw = \g\p\t ]] 00:35:45.761 08:48:58 blockdev_crypto_sw -- bdev/blockdev.sh@40 -- # [[ crypto_sw == xnvme ]] 00:35:45.761 00:35:45.761 real 1m9.704s 00:35:45.761 user 1m55.955s 00:35:45.761 sys 0m6.010s 00:35:45.761 08:48:58 blockdev_crypto_sw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:45.761 08:48:58 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:35:45.761 ************************************ 00:35:45.761 END TEST blockdev_crypto_sw 00:35:45.761 ************************************ 00:35:45.761 08:48:58 -- common/autotest_common.sh@1142 -- # return 0 00:35:45.761 08:48:58 -- spdk/autotest.sh@359 -- # run_test blockdev_crypto_qat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:35:45.761 08:48:58 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:35:45.761 08:48:58 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:45.761 08:48:58 -- common/autotest_common.sh@10 -- # set +x 00:35:45.761 ************************************ 00:35:45.761 START TEST blockdev_crypto_qat 00:35:45.761 ************************************ 00:35:45.761 08:48:58 blockdev_crypto_qat -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:35:45.761 * Looking for test storage... 
00:35:45.761 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:35:45.761 08:48:58 blockdev_crypto_qat -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:35:45.761 08:48:58 blockdev_crypto_qat -- bdev/nbd_common.sh@6 -- # set -e 00:35:45.761 08:48:58 blockdev_crypto_qat -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:35:45.761 08:48:58 blockdev_crypto_qat -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:35:45.761 08:48:58 blockdev_crypto_qat -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:35:45.761 08:48:58 blockdev_crypto_qat -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:35:45.761 08:48:58 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:35:45.761 08:48:58 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:35:45.761 08:48:58 blockdev_crypto_qat -- bdev/blockdev.sh@20 -- # : 00:35:45.761 08:48:58 blockdev_crypto_qat -- bdev/blockdev.sh@669 -- # QOS_DEV_1=Malloc_0 00:35:45.761 08:48:58 blockdev_crypto_qat -- bdev/blockdev.sh@670 -- # QOS_DEV_2=Null_1 00:35:45.761 08:48:58 blockdev_crypto_qat -- bdev/blockdev.sh@671 -- # QOS_RUN_TIME=5 00:35:45.761 08:48:58 blockdev_crypto_qat -- bdev/blockdev.sh@673 -- # uname -s 00:35:45.761 08:48:58 blockdev_crypto_qat -- bdev/blockdev.sh@673 -- # '[' Linux = Linux ']' 00:35:45.761 08:48:58 blockdev_crypto_qat -- bdev/blockdev.sh@675 -- # PRE_RESERVED_MEM=0 00:35:45.761 08:48:58 blockdev_crypto_qat -- bdev/blockdev.sh@681 -- # test_type=crypto_qat 00:35:45.762 08:48:58 blockdev_crypto_qat -- bdev/blockdev.sh@682 -- # crypto_device= 00:35:45.762 08:48:58 blockdev_crypto_qat -- bdev/blockdev.sh@683 -- # dek= 00:35:45.762 08:48:58 blockdev_crypto_qat -- bdev/blockdev.sh@684 -- # 
env_ctx= 00:35:45.762 08:48:58 blockdev_crypto_qat -- bdev/blockdev.sh@685 -- # wait_for_rpc= 00:35:45.762 08:48:58 blockdev_crypto_qat -- bdev/blockdev.sh@686 -- # '[' -n '' ']' 00:35:45.762 08:48:58 blockdev_crypto_qat -- bdev/blockdev.sh@689 -- # [[ crypto_qat == bdev ]] 00:35:45.762 08:48:58 blockdev_crypto_qat -- bdev/blockdev.sh@689 -- # [[ crypto_qat == crypto_* ]] 00:35:45.762 08:48:58 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # wait_for_rpc=--wait-for-rpc 00:35:45.762 08:48:58 blockdev_crypto_qat -- bdev/blockdev.sh@692 -- # start_spdk_tgt 00:35:45.762 08:48:58 blockdev_crypto_qat -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1683569 00:35:45.762 08:48:58 blockdev_crypto_qat -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:35:45.762 08:48:58 blockdev_crypto_qat -- bdev/blockdev.sh@49 -- # waitforlisten 1683569 00:35:45.762 08:48:58 blockdev_crypto_qat -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:35:45.762 08:48:58 blockdev_crypto_qat -- common/autotest_common.sh@829 -- # '[' -z 1683569 ']' 00:35:45.762 08:48:58 blockdev_crypto_qat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:45.762 08:48:58 blockdev_crypto_qat -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:45.762 08:48:58 blockdev_crypto_qat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:45.762 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:45.762 08:48:58 blockdev_crypto_qat -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:45.762 08:48:58 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:45.762 [2024-07-23 08:48:58.277060] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:35:45.762 [2024-07-23 08:48:58.277158] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1683569 ] 00:35:46.020 [2024-07-23 08:48:58.400354] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:46.277 [2024-07-23 08:48:58.616787] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:46.536 08:48:59 blockdev_crypto_qat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:46.536 08:48:59 blockdev_crypto_qat -- common/autotest_common.sh@862 -- # return 0 00:35:46.536 08:48:59 blockdev_crypto_qat -- bdev/blockdev.sh@693 -- # case "$test_type" in 00:35:46.536 08:48:59 blockdev_crypto_qat -- bdev/blockdev.sh@707 -- # setup_crypto_qat_conf 00:35:46.536 08:48:59 blockdev_crypto_qat -- bdev/blockdev.sh@169 -- # rpc_cmd 00:35:46.536 08:48:59 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:46.536 08:48:59 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:46.536 [2024-07-23 08:48:59.046369] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:35:46.536 [2024-07-23 08:48:59.054411] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:35:46.795 [2024-07-23 08:48:59.062426] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:35:47.054 [2024-07-23 08:48:59.333496] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:35:50.343 true 00:35:50.343 true 00:35:50.343 true 00:35:50.343 true 00:35:50.343 Malloc0 00:35:50.343 Malloc1 00:35:50.343 Malloc2 00:35:50.343 Malloc3 00:35:50.343 [2024-07-23 08:49:02.695208] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 
00:35:50.343 crypto_ram 00:35:50.343 [2024-07-23 08:49:02.703231] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:35:50.343 crypto_ram1 00:35:50.343 [2024-07-23 08:49:02.711231] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:35:50.343 crypto_ram2 00:35:50.343 [2024-07-23 08:49:02.719268] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:35:50.343 crypto_ram3 00:35:50.343 [ 00:35:50.343 { 00:35:50.343 "name": "Malloc1", 00:35:50.343 "aliases": [ 00:35:50.343 "dca51ec8-3308-4cfe-b04c-15abde946a00" 00:35:50.343 ], 00:35:50.343 "product_name": "Malloc disk", 00:35:50.343 "block_size": 512, 00:35:50.343 "num_blocks": 65536, 00:35:50.343 "uuid": "dca51ec8-3308-4cfe-b04c-15abde946a00", 00:35:50.343 "assigned_rate_limits": { 00:35:50.343 "rw_ios_per_sec": 0, 00:35:50.343 "rw_mbytes_per_sec": 0, 00:35:50.343 "r_mbytes_per_sec": 0, 00:35:50.343 "w_mbytes_per_sec": 0 00:35:50.343 }, 00:35:50.343 "claimed": true, 00:35:50.343 "claim_type": "exclusive_write", 00:35:50.343 "zoned": false, 00:35:50.343 "supported_io_types": { 00:35:50.343 "read": true, 00:35:50.343 "write": true, 00:35:50.343 "unmap": true, 00:35:50.343 "flush": true, 00:35:50.343 "reset": true, 00:35:50.343 "nvme_admin": false, 00:35:50.343 "nvme_io": false, 00:35:50.343 "nvme_io_md": false, 00:35:50.343 "write_zeroes": true, 00:35:50.343 "zcopy": true, 00:35:50.343 "get_zone_info": false, 00:35:50.343 "zone_management": false, 00:35:50.343 "zone_append": false, 00:35:50.343 "compare": false, 00:35:50.343 "compare_and_write": false, 00:35:50.343 "abort": true, 00:35:50.343 "seek_hole": false, 00:35:50.343 "seek_data": false, 00:35:50.343 "copy": true, 00:35:50.343 "nvme_iov_md": false 00:35:50.343 }, 00:35:50.343 "memory_domains": [ 00:35:50.343 { 00:35:50.343 "dma_device_id": "system", 00:35:50.343 "dma_device_type": 1 00:35:50.343 }, 00:35:50.343 { 00:35:50.343 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:35:50.343 "dma_device_type": 2 00:35:50.343 } 00:35:50.343 ], 00:35:50.343 "driver_specific": {} 00:35:50.343 } 00:35:50.343 ] 00:35:50.343 08:49:02 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:50.343 08:49:02 blockdev_crypto_qat -- bdev/blockdev.sh@736 -- # rpc_cmd bdev_wait_for_examine 00:35:50.343 08:49:02 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:50.343 08:49:02 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:50.343 08:49:02 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:50.343 08:49:02 blockdev_crypto_qat -- bdev/blockdev.sh@739 -- # cat 00:35:50.343 08:49:02 blockdev_crypto_qat -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n accel 00:35:50.343 08:49:02 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:50.343 08:49:02 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:50.343 08:49:02 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:50.343 08:49:02 blockdev_crypto_qat -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n bdev 00:35:50.343 08:49:02 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:50.343 08:49:02 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:50.343 08:49:02 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:50.343 08:49:02 blockdev_crypto_qat -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n iobuf 00:35:50.343 08:49:02 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:50.343 08:49:02 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:50.343 08:49:02 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:50.343 08:49:02 blockdev_crypto_qat -- bdev/blockdev.sh@747 -- # mapfile -t bdevs 00:35:50.343 08:49:02 blockdev_crypto_qat -- 
bdev/blockdev.sh@747 -- # rpc_cmd bdev_get_bdevs 00:35:50.343 08:49:02 blockdev_crypto_qat -- bdev/blockdev.sh@747 -- # jq -r '.[] | select(.claimed == false)' 00:35:50.343 08:49:02 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:50.343 08:49:02 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:50.603 08:49:02 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:50.603 08:49:02 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # mapfile -t bdevs_name 00:35:50.603 08:49:02 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # jq -r .name 00:35:50.603 08:49:02 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "33b638cf-1cb6-5f37-a7b9-1be5effe32a8"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "33b638cf-1cb6-5f37-a7b9-1be5effe32a8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "d86911c9-0682-516f-9b83-3b4ff84ef7a8"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": 
"d86911c9-0682-516f-9b83-3b4ff84ef7a8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "9eca78dd-e7fc-5ac2-b3df-845e9daca2d9"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "9eca78dd-e7fc-5ac2-b3df-845e9daca2d9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' 
"name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "124a3ac5-9ef5-556b-ba57-083e46286bb4"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "124a3ac5-9ef5-556b-ba57-083e46286bb4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:35:50.603 08:49:02 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # bdev_list=("${bdevs_name[@]}") 00:35:50.603 08:49:02 blockdev_crypto_qat -- bdev/blockdev.sh@751 -- # hello_world_bdev=crypto_ram 00:35:50.603 08:49:02 blockdev_crypto_qat -- bdev/blockdev.sh@752 -- # trap - SIGINT SIGTERM EXIT 00:35:50.603 08:49:02 blockdev_crypto_qat -- bdev/blockdev.sh@753 -- # killprocess 1683569 00:35:50.603 08:49:02 blockdev_crypto_qat -- common/autotest_common.sh@948 -- # '[' -z 1683569 ']' 00:35:50.603 08:49:02 blockdev_crypto_qat -- common/autotest_common.sh@952 -- # kill -0 1683569 00:35:50.603 08:49:02 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # uname 00:35:50.603 08:49:02 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:50.603 08:49:02 
blockdev_crypto_qat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1683569 00:35:50.603 08:49:02 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:35:50.603 08:49:02 blockdev_crypto_qat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:35:50.603 08:49:02 blockdev_crypto_qat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1683569' 00:35:50.603 killing process with pid 1683569 00:35:50.603 08:49:02 blockdev_crypto_qat -- common/autotest_common.sh@967 -- # kill 1683569 00:35:50.603 08:49:02 blockdev_crypto_qat -- common/autotest_common.sh@972 -- # wait 1683569 00:35:53.955 08:49:06 blockdev_crypto_qat -- bdev/blockdev.sh@757 -- # trap cleanup SIGINT SIGTERM EXIT 00:35:53.955 08:49:06 blockdev_crypto_qat -- bdev/blockdev.sh@759 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:35:53.955 08:49:06 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:35:53.955 08:49:06 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:53.955 08:49:06 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:53.955 ************************************ 00:35:53.955 START TEST bdev_hello_world 00:35:53.955 ************************************ 00:35:53.956 08:49:06 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:35:54.214 [2024-07-23 08:49:06.521120] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:35:54.214 [2024-07-23 08:49:06.521199] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1685092 ] 00:35:54.214 [2024-07-23 08:49:06.638292] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:54.472 [2024-07-23 08:49:06.847435] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:54.472 [2024-07-23 08:49:06.868662] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:35:54.472 [2024-07-23 08:49:06.876695] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:35:54.472 [2024-07-23 08:49:06.884709] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:35:54.730 [2024-07-23 08:49:07.210070] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:35:58.014 [2024-07-23 08:49:09.933642] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:35:58.014 [2024-07-23 08:49:09.933707] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:35:58.014 [2024-07-23 08:49:09.933724] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:58.014 [2024-07-23 08:49:09.941659] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:35:58.014 [2024-07-23 08:49:09.941692] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:35:58.014 [2024-07-23 08:49:09.941702] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:58.014 [2024-07-23 08:49:09.949674] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key 
"test_dek_qat_cbc2" 00:35:58.014 [2024-07-23 08:49:09.949702] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:35:58.014 [2024-07-23 08:49:09.949712] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:58.014 [2024-07-23 08:49:09.957706] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:35:58.014 [2024-07-23 08:49:09.957733] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:35:58.014 [2024-07-23 08:49:09.957743] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:58.014 [2024-07-23 08:49:10.166013] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:35:58.014 [2024-07-23 08:49:10.166053] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:35:58.014 [2024-07-23 08:49:10.166071] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:35:58.014 [2024-07-23 08:49:10.167606] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:35:58.014 [2024-07-23 08:49:10.167699] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:35:58.014 [2024-07-23 08:49:10.167715] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:35:58.014 [2024-07-23 08:49:10.167757] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:35:58.014 00:35:58.014 [2024-07-23 08:49:10.167778] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:36:00.544 00:36:00.544 real 0m6.013s 00:36:00.544 user 0m5.440s 00:36:00.544 sys 0m0.477s 00:36:00.544 08:49:12 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:36:00.544 08:49:12 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:36:00.544 ************************************ 00:36:00.544 END TEST bdev_hello_world 00:36:00.544 ************************************ 00:36:00.544 08:49:12 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:36:00.544 08:49:12 blockdev_crypto_qat -- bdev/blockdev.sh@760 -- # run_test bdev_bounds bdev_bounds '' 00:36:00.544 08:49:12 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:36:00.544 08:49:12 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:36:00.544 08:49:12 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:36:00.544 ************************************ 00:36:00.544 START TEST bdev_bounds 00:36:00.544 ************************************ 00:36:00.544 08:49:12 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:36:00.544 08:49:12 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=1686131 00:36:00.544 08:49:12 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:36:00.544 08:49:12 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@288 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:36:00.544 08:49:12 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 1686131' 00:36:00.544 Process bdevio pid: 1686131 00:36:00.544 08:49:12 blockdev_crypto_qat.bdev_bounds -- 
bdev/blockdev.sh@292 -- # waitforlisten 1686131 00:36:00.545 08:49:12 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 1686131 ']' 00:36:00.545 08:49:12 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:36:00.545 08:49:12 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:36:00.545 08:49:12 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:36:00.545 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:36:00.545 08:49:12 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:36:00.545 08:49:12 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:36:00.545 [2024-07-23 08:49:12.610619] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:36:00.545 [2024-07-23 08:49:12.610712] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1686131 ] 00:36:00.545 [2024-07-23 08:49:12.740757] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:36:00.545 [2024-07-23 08:49:12.970298] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:36:00.545 [2024-07-23 08:49:12.970364] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:00.545 [2024-07-23 08:49:12.970370] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:36:00.545 [2024-07-23 08:49:12.991694] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:36:00.545 [2024-07-23 08:49:12.999709] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:36:00.545 [2024-07-23 08:49:13.007726] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:36:00.803 [2024-07-23 08:49:13.320724] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:36:04.083 [2024-07-23 08:49:15.998279] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:36:04.083 [2024-07-23 08:49:15.998347] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:36:04.083 [2024-07-23 08:49:15.998361] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:04.083 [2024-07-23 08:49:16.006294] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:36:04.083 [2024-07-23 08:49:16.006326] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:36:04.083 [2024-07-23 08:49:16.006336] 
vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:36:04.083 [2024-07-23 08:49:16.014316] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2"
00:36:04.084 [2024-07-23 08:49:16.014346] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:36:04.084 [2024-07-23 08:49:16.014356] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:36:04.084 [2024-07-23 08:49:16.022345] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2"
00:36:04.084 [2024-07-23 08:49:16.022370] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:36:04.084 [2024-07-23 08:49:16.022379] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:36:04.341 08:49:16 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:36:04.341 08:49:16 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@862 -- # return 0
00:36:04.341 08:49:16 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests
00:36:04.341 I/O targets:
00:36:04.341   crypto_ram: 65536 blocks of 512 bytes (32 MiB)
00:36:04.341   crypto_ram1: 65536 blocks of 512 bytes (32 MiB)
00:36:04.341   crypto_ram2: 8192 blocks of 4096 bytes (32 MiB)
00:36:04.341   crypto_ram3: 8192 blocks of 4096 bytes (32 MiB)
00:36:04.341
00:36:04.341
00:36:04.341      CUnit - A unit testing framework for C - Version 2.1-3
00:36:04.341      http://cunit.sourceforge.net/
00:36:04.341
00:36:04.341
00:36:04.341 Suite: bdevio tests on: crypto_ram3
00:36:04.341   Test: blockdev write read block ...passed
00:36:04.341   Test: blockdev write zeroes read block ...passed
00:36:04.341   Test: blockdev write zeroes read no split ...passed
00:36:04.599   Test: blockdev write zeroes read split
...passed
00:36:04.599   Test: blockdev write zeroes read split partial ...passed
00:36:04.599   Test: blockdev reset ...passed
00:36:04.599   Test: blockdev write read 8 blocks ...passed
00:36:04.599   Test: blockdev write read size > 128k ...passed
00:36:04.599   Test: blockdev write read invalid size ...passed
00:36:04.599   Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:36:04.599   Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:36:04.599   Test: blockdev write read max offset ...passed
00:36:04.599   Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:36:04.599   Test: blockdev writev readv 8 blocks ...passed
00:36:04.599   Test: blockdev writev readv 30 x 1block ...passed
00:36:04.599   Test: blockdev writev readv block ...passed
00:36:04.599   Test: blockdev writev readv size > 128k ...passed
00:36:04.599   Test: blockdev writev readv size > 128k in two iovs ...passed
00:36:04.599   Test: blockdev comparev and writev ...passed
00:36:04.599   Test: blockdev nvme passthru rw ...passed
00:36:04.599   Test: blockdev nvme passthru vendor specific ...passed
00:36:04.599   Test: blockdev nvme admin passthru ...passed
00:36:04.599   Test: blockdev copy ...passed
00:36:04.599 Suite: bdevio tests on: crypto_ram2
00:36:04.599   Test: blockdev write read block ...passed
00:36:04.599   Test: blockdev write zeroes read block ...passed
00:36:04.599   Test: blockdev write zeroes read no split ...passed
00:36:04.599   Test: blockdev write zeroes read split ...passed
00:36:04.599   Test: blockdev write zeroes read split partial ...passed
00:36:04.599   Test: blockdev reset ...passed
00:36:04.599   Test: blockdev write read 8 blocks ...passed
00:36:04.599   Test: blockdev write read size > 128k ...passed
00:36:04.599   Test: blockdev write read invalid size ...passed
00:36:04.599   Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:36:04.599   Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:36:04.599   Test:
blockdev write read max offset ...passed
00:36:04.599   Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:36:04.599   Test: blockdev writev readv 8 blocks ...passed
00:36:04.599   Test: blockdev writev readv 30 x 1block ...passed
00:36:04.599   Test: blockdev writev readv block ...passed
00:36:04.599   Test: blockdev writev readv size > 128k ...passed
00:36:04.599   Test: blockdev writev readv size > 128k in two iovs ...passed
00:36:04.599   Test: blockdev comparev and writev ...passed
00:36:04.599   Test: blockdev nvme passthru rw ...passed
00:36:04.599   Test: blockdev nvme passthru vendor specific ...passed
00:36:04.599   Test: blockdev nvme admin passthru ...passed
00:36:04.599   Test: blockdev copy ...passed
00:36:04.599 Suite: bdevio tests on: crypto_ram1
00:36:04.599   Test: blockdev write read block ...passed
00:36:04.599   Test: blockdev write zeroes read block ...passed
00:36:04.599   Test: blockdev write zeroes read no split ...passed
00:36:04.857   Test: blockdev write zeroes read split ...passed
00:36:04.857   Test: blockdev write zeroes read split partial ...passed
00:36:04.857   Test: blockdev reset ...passed
00:36:04.857   Test: blockdev write read 8 blocks ...passed
00:36:04.857   Test: blockdev write read size > 128k ...passed
00:36:04.857   Test: blockdev write read invalid size ...passed
00:36:04.857   Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:36:04.857   Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:36:04.857   Test: blockdev write read max offset ...passed
00:36:04.857   Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:36:04.857   Test: blockdev writev readv 8 blocks ...passed
00:36:04.857   Test: blockdev writev readv 30 x 1block ...passed
00:36:04.857   Test: blockdev writev readv block ...passed
00:36:04.857   Test: blockdev writev readv size > 128k ...passed
00:36:04.857   Test: blockdev writev readv size > 128k in two iovs ...passed
00:36:04.857   Test: blockdev comparev and writev
...passed
00:36:04.857   Test: blockdev nvme passthru rw ...passed
00:36:04.857   Test: blockdev nvme passthru vendor specific ...passed
00:36:04.857   Test: blockdev nvme admin passthru ...passed
00:36:04.857   Test: blockdev copy ...passed
00:36:04.857 Suite: bdevio tests on: crypto_ram
00:36:04.857   Test: blockdev write read block ...passed
00:36:04.857   Test: blockdev write zeroes read block ...passed
00:36:04.857   Test: blockdev write zeroes read no split ...passed
00:36:04.857   Test: blockdev write zeroes read split ...passed
00:36:05.116   Test: blockdev write zeroes read split partial ...passed
00:36:05.116   Test: blockdev reset ...passed
00:36:05.116   Test: blockdev write read 8 blocks ...passed
00:36:05.116   Test: blockdev write read size > 128k ...passed
00:36:05.116   Test: blockdev write read invalid size ...passed
00:36:05.116   Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:36:05.116   Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:36:05.116   Test: blockdev write read max offset ...passed
00:36:05.116   Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:36:05.116   Test: blockdev writev readv 8 blocks ...passed
00:36:05.116   Test: blockdev writev readv 30 x 1block ...passed
00:36:05.116   Test: blockdev writev readv block ...passed
00:36:05.116   Test: blockdev writev readv size > 128k ...passed
00:36:05.116   Test: blockdev writev readv size > 128k in two iovs ...passed
00:36:05.116   Test: blockdev comparev and writev ...passed
00:36:05.116   Test: blockdev nvme passthru rw ...passed
00:36:05.116   Test: blockdev nvme passthru vendor specific ...passed
00:36:05.116   Test: blockdev nvme admin passthru ...passed
00:36:05.116   Test: blockdev copy ...passed
00:36:05.116
00:36:05.116 Run Summary:    Type  Total    Ran Passed Failed Inactive
00:36:05.116               suites      4      4    n/a      0        0
00:36:05.116                tests     92     92     92      0        0
00:36:05.116              asserts    520    520    520      0      n/a
00:36:05.116
00:36:05.116 Elapsed time = 1.488 seconds
00:36:05.116 0
00:36:05.116
08:49:17 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 1686131 00:36:05.116 08:49:17 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 1686131 ']' 00:36:05.116 08:49:17 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 1686131 00:36:05.116 08:49:17 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:36:05.116 08:49:17 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:36:05.116 08:49:17 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1686131 00:36:05.116 08:49:17 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:36:05.116 08:49:17 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:36:05.116 08:49:17 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1686131' 00:36:05.116 killing process with pid 1686131 00:36:05.116 08:49:17 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@967 -- # kill 1686131 00:36:05.116 08:49:17 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@972 -- # wait 1686131 00:36:07.644 08:49:19 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT 00:36:07.644 00:36:07.644 real 0m7.275s 00:36:07.644 user 0m20.008s 00:36:07.644 sys 0m0.660s 00:36:07.644 08:49:19 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:36:07.644 08:49:19 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:36:07.644 ************************************ 00:36:07.644 END TEST bdev_bounds 00:36:07.644 ************************************ 00:36:07.644 08:49:19 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:36:07.644 08:49:19 blockdev_crypto_qat -- bdev/blockdev.sh@761 -- # run_test bdev_nbd 
nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:36:07.644 08:49:19 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:36:07.644 08:49:19 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:36:07.644 08:49:19 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:36:07.644 ************************************ 00:36:07.644 START TEST bdev_nbd 00:36:07.644 ************************************ 00:36:07.644 08:49:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:36:07.644 08:49:19 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@299 -- # uname -s 00:36:07.644 08:49:19 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux ]] 00:36:07.644 08:49:19 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:36:07.644 08:49:19 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@302 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:36:07.644 08:49:19 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:36:07.644 08:49:19 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all 00:36:07.644 08:49:19 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=4 00:36:07.644 08:49:19 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]] 00:36:07.644 08:49:19 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:36:07.644 08:49:19 
blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all 00:36:07.644 08:49:19 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=4 00:36:07.644 08:49:19 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:36:07.644 08:49:19 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list 00:36:07.644 08:49:19 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:36:07.644 08:49:19 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list 00:36:07.644 08:49:19 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=1687543 00:36:07.644 08:49:19 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:36:07.644 08:49:19 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:36:07.644 08:49:19 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 1687543 /var/tmp/spdk-nbd.sock 00:36:07.644 08:49:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 1687543 ']' 00:36:07.644 08:49:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:36:07.644 08:49:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:36:07.644 08:49:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:36:07.644 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:36:07.644 08:49:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:36:07.644 08:49:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:36:07.644 [2024-07-23 08:49:19.963387] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:36:07.644 [2024-07-23 08:49:19.963500] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:36:07.644 [2024-07-23 08:49:20.090375] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:07.902 [2024-07-23 08:49:20.296174] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:07.902 [2024-07-23 08:49:20.317346] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:36:07.902 [2024-07-23 08:49:20.325377] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:36:07.902 [2024-07-23 08:49:20.333395] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:36:08.160 [2024-07-23 08:49:20.618765] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:36:11.441 [2024-07-23 08:49:23.383316] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:36:11.441 [2024-07-23 08:49:23.383390] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:36:11.441 [2024-07-23 08:49:23.383404] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:11.441 [2024-07-23 08:49:23.391334] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:36:11.442 [2024-07-23 08:49:23.391373] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently 
unable to find bdev with name: Malloc1 00:36:11.442 [2024-07-23 08:49:23.391385] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:11.442 [2024-07-23 08:49:23.399347] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:36:11.442 [2024-07-23 08:49:23.399375] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:36:11.442 [2024-07-23 08:49:23.399385] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:11.442 [2024-07-23 08:49:23.407399] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:36:11.442 [2024-07-23 08:49:23.407429] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:36:11.442 [2024-07-23 08:49:23.407441] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:11.699 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:36:11.699 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:36:11.699 08:49:24 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:36:11.699 08:49:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:36:11.699 08:49:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:36:11.699 08:49:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:36:11.699 08:49:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:36:11.699 08:49:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@22 -- # 
local rpc_server=/var/tmp/spdk-nbd.sock 00:36:11.699 08:49:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:36:11.699 08:49:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:36:11.699 08:49:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:36:11.699 08:49:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:36:11.699 08:49:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:36:11.699 08:49:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:36:11.699 08:49:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:36:11.979 08:49:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:36:11.979 08:49:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:36:11.979 08:49:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:36:11.979 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:36:11.979 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:36:11.979 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:36:11.979 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:36:11.979 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:36:11.979 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:36:11.979 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:36:11.979 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:36:11.979 
08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:36:11.979 1+0 records in 00:36:11.979 1+0 records out 00:36:11.979 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000178506 s, 22.9 MB/s 00:36:11.979 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:11.979 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:36:11.979 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:11.979 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:36:11.979 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:36:11.979 08:49:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:36:11.979 08:49:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:36:11.979 08:49:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 00:36:12.255 08:49:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:36:12.255 08:49:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:36:12.255 08:49:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:36:12.255 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:36:12.255 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:36:12.255 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:36:12.255 08:49:24 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@869 -- # (( i <= 20 )) 00:36:12.255 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:36:12.255 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:36:12.255 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:36:12.255 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:36:12.255 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:36:12.255 1+0 records in 00:36:12.255 1+0 records out 00:36:12.255 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000263903 s, 15.5 MB/s 00:36:12.255 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:12.255 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:36:12.255 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:12.255 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:36:12.255 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:36:12.255 08:49:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:36:12.255 08:49:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:36:12.255 08:49:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:36:12.255 08:49:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:36:12.255 08:49:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- 
# basename /dev/nbd2 00:36:12.255 08:49:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:36:12.255 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:36:12.255 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:36:12.255 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:36:12.255 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:36:12.255 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:36:12.255 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:36:12.255 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:36:12.255 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:36:12.255 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:36:12.513 1+0 records in 00:36:12.513 1+0 records out 00:36:12.513 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000225043 s, 18.2 MB/s 00:36:12.513 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:12.513 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:36:12.513 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:12.514 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:36:12.514 08:49:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:36:12.514 08:49:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # 
(( i++ )) 00:36:12.514 08:49:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:36:12.514 08:49:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:36:12.514 08:49:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:36:12.514 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:36:12.514 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:36:12.514 08:49:25 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:36:12.514 08:49:25 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:36:12.514 08:49:25 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:36:12.514 08:49:25 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:36:12.514 08:49:25 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:36:12.514 08:49:25 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:36:12.514 08:49:25 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:36:12.514 08:49:25 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:36:12.514 08:49:25 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:36:12.514 1+0 records in 00:36:12.514 1+0 records out 00:36:12.514 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000273256 s, 15.0 MB/s 00:36:12.514 08:49:25 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:12.514 08:49:25 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@884 -- # size=4096
00:36:12.514 08:49:25 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:36:12.514 08:49:25 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:36:12.514 08:49:25 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:36:12.514 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ ))
00:36:12.514 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 ))
00:36:12.514 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:36:12.772 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[
00:36:12.772   {
00:36:12.772     "nbd_device": "/dev/nbd0",
00:36:12.772     "bdev_name": "crypto_ram"
00:36:12.772   },
00:36:12.772   {
00:36:12.772     "nbd_device": "/dev/nbd1",
00:36:12.772     "bdev_name": "crypto_ram1"
00:36:12.772   },
00:36:12.772   {
00:36:12.772     "nbd_device": "/dev/nbd2",
00:36:12.772     "bdev_name": "crypto_ram2"
00:36:12.772   },
00:36:12.772   {
00:36:12.772     "nbd_device": "/dev/nbd3",
00:36:12.772     "bdev_name": "crypto_ram3"
00:36:12.772   }
00:36:12.772 ]'
00:36:12.772 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device'))
00:36:12.772 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[
00:36:12.772   {
00:36:12.772     "nbd_device": "/dev/nbd0",
00:36:12.772     "bdev_name": "crypto_ram"
00:36:12.772   },
00:36:12.772   {
00:36:12.772     "nbd_device": "/dev/nbd1",
00:36:12.772     "bdev_name": "crypto_ram1"
00:36:12.772   },
00:36:12.772   {
00:36:12.772     "nbd_device": "/dev/nbd2",
00:36:12.772     "bdev_name": "crypto_ram2"
00:36:12.772   },
00:36:12.772   {
00:36:12.772     "nbd_device": "/dev/nbd3",
00:36:12.772     "bdev_name": "crypto_ram3"
00:36:12.772   }
00:36:12.772 ]'
08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device'
00:36:12.772 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3'
00:36:12.772 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:36:12.772 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3')
00:36:12.772 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list
00:36:12.772 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i
00:36:12.772 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:36:12.772 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
00:36:13.031 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:36:13.031 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:36:13.031 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:36:13.031 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:36:13.031 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:36:13.031 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:36:13.031 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:36:13.031 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:36:13.031 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:36:13.031 08:49:25
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:36:13.289 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:36:13.289 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:36:13.289 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:36:13.289 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:36:13.289 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:36:13.289 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:36:13.289 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:36:13.289 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:36:13.289 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:36:13.289 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:36:13.289 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:36:13.289 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:36:13.289 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:36:13.289 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:36:13.289 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:36:13.289 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:36:13.289 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:36:13.289 08:49:25 blockdev_crypto_qat.bdev_nbd 
-- bdev/nbd_common.sh@45 -- # return 0 00:36:13.289 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:36:13.289 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:36:13.547 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:36:13.547 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:36:13.547 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:36:13.547 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:36:13.547 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:36:13.547 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:36:13.547 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:36:13.547 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:36:13.547 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:36:13.547 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:36:13.547 08:49:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:36:13.805 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:36:13.805 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:36:13.805 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:36:13.805 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:36:13.805 08:49:26 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:36:13.805 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:36:13.805 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:36:13.805 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:36:13.805 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:36:13.805 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:36:13.805 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:36:13.805 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:36:13.806 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:36:13.806 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:36:13.806 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:36:13.806 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:36:13.806 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:36:13.806 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:36:13.806 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:36:13.806 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:36:13.806 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:36:13.806 08:49:26 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:36:13.806 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:36:13.806 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:36:13.806 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:36:13.806 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:36:13.806 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:36:13.806 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:36:14.064 /dev/nbd0 00:36:14.064 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:36:14.064 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:36:14.064 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:36:14.064 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:36:14.064 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:36:14.064 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:36:14.064 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:36:14.064 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:36:14.064 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:36:14.064 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:36:14.064 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:36:14.064 1+0 records in 00:36:14.064 1+0 records out 00:36:14.064 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000206811 s, 19.8 MB/s 00:36:14.064 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:14.064 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:36:14.064 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:14.064 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:36:14.064 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:36:14.064 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:36:14.064 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:36:14.064 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 /dev/nbd1 00:36:14.322 /dev/nbd1 00:36:14.322 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:36:14.322 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:36:14.322 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:36:14.322 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:36:14.322 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:36:14.322 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:36:14.322 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 
/proc/partitions 00:36:14.322 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:36:14.322 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:36:14.322 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:36:14.322 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:36:14.322 1+0 records in 00:36:14.322 1+0 records out 00:36:14.322 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000269069 s, 15.2 MB/s 00:36:14.322 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:14.322 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:36:14.322 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:14.322 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:36:14.322 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:36:14.322 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:36:14.322 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:36:14.322 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd10 00:36:14.581 /dev/nbd10 00:36:14.581 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:36:14.581 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:36:14.581 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local 
nbd_name=nbd10 00:36:14.581 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:36:14.581 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:36:14.581 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:36:14.581 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:36:14.581 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:36:14.581 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:36:14.581 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:36:14.581 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:36:14.581 1+0 records in 00:36:14.581 1+0 records out 00:36:14.581 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000248118 s, 16.5 MB/s 00:36:14.581 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:14.581 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:36:14.581 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:14.581 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:36:14.581 08:49:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:36:14.581 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:36:14.581 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:36:14.581 08:49:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd11 00:36:14.581 /dev/nbd11 00:36:14.581 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:36:14.581 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:36:14.581 08:49:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:36:14.581 08:49:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:36:14.581 08:49:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:36:14.581 08:49:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:36:14.581 08:49:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:36:14.839 08:49:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:36:14.839 08:49:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:36:14.839 08:49:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:36:14.839 08:49:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:36:14.839 1+0 records in 00:36:14.839 1+0 records out 00:36:14.839 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00023776 s, 17.2 MB/s 00:36:14.839 08:49:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:14.839 08:49:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:36:14.839 08:49:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:14.839 08:49:27 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:36:14.839 08:49:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:36:14.839 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:36:14.839 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:36:14.839 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:36:14.839 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:36:14.839 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:36:14.839 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:36:14.839 { 00:36:14.839 "nbd_device": "/dev/nbd0", 00:36:14.839 "bdev_name": "crypto_ram" 00:36:14.839 }, 00:36:14.839 { 00:36:14.839 "nbd_device": "/dev/nbd1", 00:36:14.839 "bdev_name": "crypto_ram1" 00:36:14.839 }, 00:36:14.839 { 00:36:14.839 "nbd_device": "/dev/nbd10", 00:36:14.839 "bdev_name": "crypto_ram2" 00:36:14.839 }, 00:36:14.839 { 00:36:14.839 "nbd_device": "/dev/nbd11", 00:36:14.839 "bdev_name": "crypto_ram3" 00:36:14.839 } 00:36:14.839 ]' 00:36:14.839 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:36:14.839 { 00:36:14.839 "nbd_device": "/dev/nbd0", 00:36:14.839 "bdev_name": "crypto_ram" 00:36:14.839 }, 00:36:14.839 { 00:36:14.839 "nbd_device": "/dev/nbd1", 00:36:14.839 "bdev_name": "crypto_ram1" 00:36:14.839 }, 00:36:14.839 { 00:36:14.839 "nbd_device": "/dev/nbd10", 00:36:14.839 "bdev_name": "crypto_ram2" 00:36:14.839 }, 00:36:14.839 { 00:36:14.839 "nbd_device": "/dev/nbd11", 00:36:14.839 "bdev_name": "crypto_ram3" 00:36:14.840 } 00:36:14.840 ]' 00:36:14.840 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:36:14.840 08:49:27 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:36:14.840 /dev/nbd1 00:36:14.840 /dev/nbd10 00:36:14.840 /dev/nbd11' 00:36:14.840 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:36:14.840 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:36:14.840 /dev/nbd1 00:36:14.840 /dev/nbd10 00:36:14.840 /dev/nbd11' 00:36:14.840 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:36:14.840 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:36:14.840 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:36:14.840 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:36:14.840 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:36:14.840 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:36:14.840 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:36:14.840 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:36:14.840 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:36:14.840 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:36:14.840 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:36:14.840 256+0 records in 00:36:14.840 256+0 records out 00:36:14.840 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0100033 s, 105 MB/s 00:36:15.098 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:36:15.098 
08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:36:15.098 256+0 records in 00:36:15.098 256+0 records out 00:36:15.098 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0419203 s, 25.0 MB/s 00:36:15.098 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:36:15.098 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:36:15.098 256+0 records in 00:36:15.098 256+0 records out 00:36:15.098 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0369811 s, 28.4 MB/s 00:36:15.098 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:36:15.098 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:36:15.098 256+0 records in 00:36:15.098 256+0 records out 00:36:15.098 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0253115 s, 41.4 MB/s 00:36:15.098 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:36:15.098 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:36:15.098 256+0 records in 00:36:15.098 256+0 records out 00:36:15.098 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0264018 s, 39.7 MB/s 00:36:15.098 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:36:15.098 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:36:15.098 08:49:27 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:36:15.098 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:36:15.098 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:36:15.098 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:36:15.098 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:36:15.098 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:36:15.098 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:36:15.098 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:36:15.098 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:36:15.098 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:36:15.098 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:36:15.098 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:36:15.098 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:36:15.098 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:36:15.098 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 
00:36:15.098 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:36:15.098 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:36:15.098 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:36:15.098 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:36:15.098 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:36:15.098 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:36:15.356 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:36:15.356 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:36:15.356 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:36:15.356 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:36:15.356 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:36:15.356 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:36:15.356 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:36:15.356 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:36:15.356 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:36:15.356 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:36:15.615 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:36:15.615 08:49:27 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:36:15.615 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:36:15.615 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:36:15.615 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:36:15.615 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:36:15.615 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:36:15.615 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:36:15.615 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:36:15.615 08:49:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:36:15.615 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:36:15.615 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:36:15.615 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:36:15.615 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:36:15.615 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:36:15.615 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:36:15.615 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:36:15.615 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:36:15.615 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:36:15.615 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_stop_disk /dev/nbd11 00:36:15.872 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:36:15.872 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:36:15.872 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:36:15.872 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:36:15.872 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:36:15.872 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:36:15.872 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:36:15.872 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:36:15.872 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:36:15.872 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:36:15.872 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:36:16.129 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:36:16.129 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:36:16.129 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:36:16.129 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:36:16.129 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:36:16.129 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:36:16.129 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:36:16.129 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 
00:36:16.129 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:36:16.129 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:36:16.129 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:36:16.129 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:36:16.129 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:36:16.129 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:36:16.129 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:36:16.129 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:36:16.129 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:36:16.129 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:36:16.388 malloc_lvol_verify 00:36:16.388 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:36:16.388 b8b8dcd6-e8bf-4737-9fe5-f43f314d1be4 00:36:16.388 08:49:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:36:16.646 c83b88e2-73a3-45c9-a88f-f2f647e9f8b9 00:36:16.646 08:49:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:36:16.904 /dev/nbd0 
00:36:16.905 08:49:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:36:16.905 mke2fs 1.46.5 (30-Dec-2021) 00:36:16.905 Discarding device blocks: 0/4096 done 00:36:16.905 Creating filesystem with 4096 1k blocks and 1024 inodes 00:36:16.905 00:36:16.905 Allocating group tables: 0/1 done 00:36:16.905 Writing inode tables: 0/1 done 00:36:16.905 Creating journal (1024 blocks): done 00:36:16.905 Writing superblocks and filesystem accounting information: 0/1 done 00:36:16.905 00:36:16.905 08:49:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:36:16.905 08:49:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:36:16.905 08:49:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:36:16.905 08:49:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:36:16.905 08:49:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:36:16.905 08:49:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:36:16.905 08:49:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:36:16.905 08:49:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:36:17.163 08:49:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:36:17.163 08:49:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:36:17.163 08:49:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:36:17.163 08:49:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:36:17.163 08:49:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:36:17.163 08:49:29 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:36:17.163 08:49:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:36:17.163 08:49:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:36:17.163 08:49:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:36:17.163 08:49:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:36:17.163 08:49:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 1687543 00:36:17.163 08:49:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 1687543 ']' 00:36:17.163 08:49:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 1687543 00:36:17.163 08:49:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:36:17.163 08:49:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:36:17.163 08:49:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1687543 00:36:17.163 08:49:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:36:17.163 08:49:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:36:17.163 08:49:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1687543' 00:36:17.163 killing process with pid 1687543 00:36:17.163 08:49:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@967 -- # kill 1687543 00:36:17.163 08:49:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@972 -- # wait 1687543 00:36:19.698 08:49:31 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - SIGINT SIGTERM EXIT 00:36:19.698 00:36:19.698 real 0m12.080s 00:36:19.698 user 0m14.649s 00:36:19.698 sys 0m2.688s 00:36:19.698 08:49:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:36:19.698 
08:49:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:36:19.698 ************************************ 00:36:19.698 END TEST bdev_nbd 00:36:19.698 ************************************ 00:36:19.698 08:49:31 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:36:19.698 08:49:31 blockdev_crypto_qat -- bdev/blockdev.sh@762 -- # [[ y == y ]] 00:36:19.698 08:49:31 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # '[' crypto_qat = nvme ']' 00:36:19.698 08:49:31 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # '[' crypto_qat = gpt ']' 00:36:19.698 08:49:31 blockdev_crypto_qat -- bdev/blockdev.sh@767 -- # run_test bdev_fio fio_test_suite '' 00:36:19.698 08:49:31 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:36:19.698 08:49:31 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:36:19.698 08:49:31 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:36:19.698 ************************************ 00:36:19.698 START TEST bdev_fio 00:36:19.698 ************************************ 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@330 -- # local env_context 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@334 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:36:19.698 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@335 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@338 -- # echo '' 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@338 -- # sed s/--env-context=// 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@338 -- # env_context= 
00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 
00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram]' 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram1]' 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram1 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram2]' 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram2 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram3]' 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram3 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@346 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@348 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:36:19.698 ************************************ 00:36:19.698 START TEST bdev_fio_rw_verify 00:36:19.698 ************************************ 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:36:19.698 08:49:32 
blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib=/usr/lib64/libasan.so.8 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n /usr/lib64/libasan.so.8 ]] 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1347 -- # break 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD='/usr/lib64/libasan.so.8 /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:36:19.698 08:49:32 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 
--aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:36:20.265 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:36:20.265 job_crypto_ram1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:36:20.265 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:36:20.265 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:36:20.265 fio-3.35 00:36:20.265 Starting 4 threads 00:36:35.142 00:36:35.142 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1690696: Tue Jul 23 08:49:46 2024 00:36:35.142 read: IOPS=25.1k, BW=98.1MiB/s (103MB/s)(981MiB/10001msec) 00:36:35.142 slat (usec): min=13, max=402, avg=55.11, stdev=30.10 00:36:35.142 clat (usec): min=20, max=1361, avg=313.17, stdev=196.02 00:36:35.142 lat (usec): min=52, max=1582, avg=368.28, stdev=210.92 00:36:35.142 clat percentiles (usec): 00:36:35.142 | 50.000th=[ 260], 99.000th=[ 906], 99.900th=[ 1057], 99.990th=[ 1172], 00:36:35.142 | 99.999th=[ 1287] 00:36:35.142 write: IOPS=27.6k, BW=108MiB/s (113MB/s)(1049MiB/9732msec); 0 zone resets 00:36:35.142 slat (usec): min=22, max=450, avg=64.61, stdev=29.79 00:36:35.142 clat (usec): min=17, max=2520, avg=344.15, stdev=205.56 00:36:35.142 lat (usec): min=58, max=2787, avg=408.76, stdev=220.13 00:36:35.142 clat percentiles (usec): 00:36:35.142 | 50.000th=[ 297], 99.000th=[ 979], 99.900th=[ 1123], 99.990th=[ 1680], 00:36:35.142 | 99.999th=[ 2442] 00:36:35.142 bw ( KiB/s): min=90728, max=158055, per=97.82%, avg=107988.58, stdev=5278.77, samples=76 00:36:35.142 iops : min=22682, max=39513, avg=26997.11, stdev=1319.66, samples=76 00:36:35.142 lat (usec) : 20=0.01%, 50=0.04%, 100=6.18%, 250=37.68%, 500=37.84% 00:36:35.142 lat (usec) : 750=13.53%, 1000=4.22% 00:36:35.142 lat (msec) : 2=0.50%, 
4=0.01% 00:36:35.142 cpu : usr=99.40%, sys=0.17%, ctx=62, majf=0, minf=26050 00:36:35.142 IO depths : 1=4.0%, 2=27.5%, 4=54.8%, 8=13.7%, 16=0.0%, 32=0.0%, >=64=0.0% 00:36:35.142 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:36:35.142 complete : 0=0.0%, 4=87.9%, 8=12.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:36:35.142 issued rwts: total=251175,268598,0,0 short=0,0,0,0 dropped=0,0,0,0 00:36:35.142 latency : target=0, window=0, percentile=100.00%, depth=8 00:36:35.142 00:36:35.142 Run status group 0 (all jobs): 00:36:35.142 READ: bw=98.1MiB/s (103MB/s), 98.1MiB/s-98.1MiB/s (103MB/s-103MB/s), io=981MiB (1029MB), run=10001-10001msec 00:36:35.142 WRITE: bw=108MiB/s (113MB/s), 108MiB/s-108MiB/s (113MB/s-113MB/s), io=1049MiB (1100MB), run=9732-9732msec 00:36:36.077 ----------------------------------------------------- 00:36:36.077 Suppressions used: 00:36:36.077 count bytes template 00:36:36.077 4 47 /usr/src/fio/parse.c 00:36:36.077 1546 148416 /usr/src/fio/iolog.c 00:36:36.077 1 8 libtcmalloc_minimal.so 00:36:36.077 1 904 libcrypto.so 00:36:36.077 ----------------------------------------------------- 00:36:36.077 00:36:36.077 00:36:36.077 real 0m16.468s 00:36:36.077 user 0m51.159s 00:36:36.077 sys 0m0.769s 00:36:36.077 08:49:48 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:36:36.077 08:49:48 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:36:36.077 ************************************ 00:36:36.077 END TEST bdev_fio_rw_verify 00:36:36.077 ************************************ 00:36:36.337 08:49:48 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:36:36.337 08:49:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@349 -- # rm -f 00:36:36.337 08:49:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:36:36.337 08:49:48 
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@353 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:36:36.337 08:49:48 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:36:36.337 08:49:48 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:36:36.337 08:49:48 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:36:36.337 08:49:48 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:36:36.337 08:49:48 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:36:36.337 08:49:48 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:36:36.337 08:49:48 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:36:36.337 08:49:48 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:36:36.337 08:49:48 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:36:36.337 08:49:48 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:36:36.337 08:49:48 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:36:36.337 08:49:48 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:36:36.337 08:49:48 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:36:36.337 08:49:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:36:36.337 08:49:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' 
"aliases": [' ' "33b638cf-1cb6-5f37-a7b9-1be5effe32a8"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "33b638cf-1cb6-5f37-a7b9-1be5effe32a8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "d86911c9-0682-516f-9b83-3b4ff84ef7a8"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "d86911c9-0682-516f-9b83-3b4ff84ef7a8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "9eca78dd-e7fc-5ac2-b3df-845e9daca2d9"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "9eca78dd-e7fc-5ac2-b3df-845e9daca2d9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "124a3ac5-9ef5-556b-ba57-083e46286bb4"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "124a3ac5-9ef5-556b-ba57-083e46286bb4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": 
false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:36:36.337 08:49:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # [[ -n crypto_ram 00:36:36.337 crypto_ram1 00:36:36.337 crypto_ram2 00:36:36.337 crypto_ram3 ]] 00:36:36.337 08:49:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:36:36.338 08:49:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "33b638cf-1cb6-5f37-a7b9-1be5effe32a8"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "33b638cf-1cb6-5f37-a7b9-1be5effe32a8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' 
}' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "d86911c9-0682-516f-9b83-3b4ff84ef7a8"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "d86911c9-0682-516f-9b83-3b4ff84ef7a8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "9eca78dd-e7fc-5ac2-b3df-845e9daca2d9"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "9eca78dd-e7fc-5ac2-b3df-845e9daca2d9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": 
false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "124a3ac5-9ef5-556b-ba57-083e46286bb4"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "124a3ac5-9ef5-556b-ba57-083e46286bb4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:36:36.338 08:49:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:36:36.338 08:49:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram]' 00:36:36.338 08:49:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram 00:36:36.338 08:49:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:36:36.338 08:49:48 
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram1]' 00:36:36.338 08:49:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram1 00:36:36.338 08:49:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:36:36.338 08:49:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram2]' 00:36:36.338 08:49:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram2 00:36:36.338 08:49:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:36:36.338 08:49:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram3]' 00:36:36.338 08:49:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram3 00:36:36.338 08:49:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@366 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:36:36.338 08:49:48 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:36:36.338 08:49:48 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:36:36.338 08:49:48 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:36:36.338 ************************************ 00:36:36.338 START TEST bdev_fio_trim 00:36:36.338 ************************************ 00:36:36.338 08:49:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 
--bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:36:36.338 08:49:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:36:36.338 08:49:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:36:36.338 08:49:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:36:36.338 08:49:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:36:36.338 08:49:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:36:36.338 08:49:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:36:36.338 08:49:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:36:36.338 08:49:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:36:36.338 08:49:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:36:36.338 08:49:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1345 -- # grep libasan 00:36:36.338 08:49:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:36:36.338 08:49:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib=/usr/lib64/libasan.so.8 00:36:36.338 08:49:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n /usr/lib64/libasan.so.8 ]] 00:36:36.338 08:49:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1347 -- # break 00:36:36.338 08:49:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD='/usr/lib64/libasan.so.8 /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:36:36.338 08:49:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:36:36.597 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:36:36.597 job_crypto_ram1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:36:36.597 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:36:36.597 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:36:36.597 fio-3.35 00:36:36.597 Starting 4 threads 00:36:51.518 00:36:51.518 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1693762: Tue Jul 23 08:50:02 2024 00:36:51.518 write: IOPS=41.5k, BW=162MiB/s (170MB/s)(1621MiB/10001msec); 0 zone resets 
00:36:51.518 slat (usec): min=14, max=443, avg=54.63, stdev=22.06 00:36:51.518 clat (usec): min=32, max=1298, avg=202.94, stdev=102.74 00:36:51.518 lat (usec): min=46, max=1572, avg=257.58, stdev=114.95 00:36:51.518 clat percentiles (usec): 00:36:51.518 | 50.000th=[ 182], 99.000th=[ 441], 99.900th=[ 519], 99.990th=[ 717], 00:36:51.518 | 99.999th=[ 1090] 00:36:51.518 bw ( KiB/s): min=140896, max=254248, per=100.00%, avg=167154.11, stdev=11618.71, samples=76 00:36:51.518 iops : min=35224, max=63568, avg=41788.63, stdev=2904.70, samples=76 00:36:51.518 trim: IOPS=41.5k, BW=162MiB/s (170MB/s)(1621MiB/10001msec); 0 zone resets 00:36:51.518 slat (usec): min=4, max=408, avg=16.11, stdev= 7.78 00:36:51.518 clat (usec): min=46, max=1574, avg=257.75, stdev=114.97 00:36:51.518 lat (usec): min=51, max=1628, avg=273.86, stdev=117.53 00:36:51.518 clat percentiles (usec): 00:36:51.518 | 50.000th=[ 239], 99.000th=[ 519], 99.900th=[ 611], 99.990th=[ 865], 00:36:51.518 | 99.999th=[ 1319] 00:36:51.518 bw ( KiB/s): min=140896, max=254272, per=100.00%, avg=167154.53, stdev=11618.81, samples=76 00:36:51.518 iops : min=35224, max=63568, avg=41788.63, stdev=2904.70, samples=76 00:36:51.518 lat (usec) : 50=1.19%, 100=9.98%, 250=49.21%, 500=38.75%, 750=0.85% 00:36:51.518 lat (usec) : 1000=0.01% 00:36:51.518 lat (msec) : 2=0.01% 00:36:51.518 cpu : usr=99.59%, sys=0.04%, ctx=73, majf=0, minf=7671 00:36:51.518 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:36:51.518 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:36:51.518 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:36:51.518 issued rwts: total=0,414956,414956,0 short=0,0,0,0 dropped=0,0,0,0 00:36:51.518 latency : target=0, window=0, percentile=100.00%, depth=8 00:36:51.518 00:36:51.518 Run status group 0 (all jobs): 00:36:51.518 WRITE: bw=162MiB/s (170MB/s), 162MiB/s-162MiB/s (170MB/s-170MB/s), io=1621MiB (1700MB), run=10001-10001msec 00:36:51.518 
TRIM: bw=162MiB/s (170MB/s), 162MiB/s-162MiB/s (170MB/s-170MB/s), io=1621MiB (1700MB), run=10001-10001msec 00:36:52.895 ----------------------------------------------------- 00:36:52.895 Suppressions used: 00:36:52.895 count bytes template 00:36:52.895 4 47 /usr/src/fio/parse.c 00:36:52.895 1 8 libtcmalloc_minimal.so 00:36:52.895 1 904 libcrypto.so 00:36:52.895 ----------------------------------------------------- 00:36:52.895 00:36:52.895 00:36:52.895 real 0m16.398s 00:36:52.895 user 0m51.098s 00:36:52.895 sys 0m0.700s 00:36:52.895 08:50:05 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:36:52.895 08:50:05 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:36:52.895 ************************************ 00:36:52.895 END TEST bdev_fio_trim 00:36:52.895 ************************************ 00:36:52.895 08:50:05 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:36:52.895 08:50:05 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@367 -- # rm -f 00:36:52.895 08:50:05 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:36:52.895 08:50:05 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@369 -- # popd 00:36:52.895 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:36:52.895 08:50:05 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:36:52.895 00:36:52.895 real 0m33.129s 00:36:52.895 user 1m42.401s 00:36:52.895 sys 0m1.601s 00:36:52.895 08:50:05 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:36:52.895 08:50:05 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:36:52.895 ************************************ 00:36:52.895 END TEST bdev_fio 00:36:52.895 ************************************ 00:36:52.896 08:50:05 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 
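The trace above (`common/autotest_common.sh` lines 1345–1352 in the log) locates the ASan runtime by filtering `ldd` output with `grep libasan | awk '{print $3}'`, then prepends it to `LD_PRELOAD` ahead of the spdk_bdev fio plugin before launching fio. A minimal sketch of that lookup, using an illustrative `ldd` line and a placeholder plugin path (both are assumptions, not the harness's real values):

```shell
# Illustrative single line of `ldd` output for an ASan-instrumented plugin;
# a real harness would run: ldd "$plugin" | grep libasan | awk '{print $3}'
ldd_line="        libasan.so.8 => /usr/lib64/libasan.so.8 (0x00007f0000000000)"

# Field 3 of the ldd line is the resolved library path.
asan_lib=$(printf '%s\n' "$ldd_line" | grep libasan | awk '{print $3}')

# Preload the ASan runtime first, then the fio ioengine plugin,
# mirroring the LD_PRELOAD='<asan> <spdk_bdev>' pattern in the log.
if [ -n "$asan_lib" ]; then
  preload="$asan_lib /path/to/spdk/build/fio/spdk_bdev"
fi
echo "$preload"
```

Preloading libasan ahead of the plugin matters because the fio binary itself is not ASan-instrumented; without the preload, loading the instrumented spdk_bdev ioengine would typically abort with an ASan runtime initialization error.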
00:36:52.896 08:50:05 blockdev_crypto_qat -- bdev/blockdev.sh@774 -- # trap cleanup SIGINT SIGTERM EXIT 00:36:52.896 08:50:05 blockdev_crypto_qat -- bdev/blockdev.sh@776 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:36:52.896 08:50:05 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:36:52.896 08:50:05 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:36:52.896 08:50:05 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:36:52.896 ************************************ 00:36:52.896 START TEST bdev_verify 00:36:52.896 ************************************ 00:36:52.896 08:50:05 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:36:52.896 [2024-07-23 08:50:05.285981] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:36:52.896 [2024-07-23 08:50:05.286059] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1696069 ] 00:36:52.896 [2024-07-23 08:50:05.404215] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:36:53.156 [2024-07-23 08:50:05.619769] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:53.156 [2024-07-23 08:50:05.619776] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:36:53.156 [2024-07-23 08:50:05.641072] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:36:53.156 [2024-07-23 08:50:05.649095] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:36:53.156 [2024-07-23 08:50:05.657123] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:36:53.723 [2024-07-23 08:50:05.953159] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:36:56.255 [2024-07-23 08:50:08.684468] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:36:56.255 [2024-07-23 08:50:08.684537] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:36:56.255 [2024-07-23 08:50:08.684550] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:56.255 [2024-07-23 08:50:08.692485] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:36:56.255 [2024-07-23 08:50:08.692515] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:36:56.255 [2024-07-23 08:50:08.692525] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:56.255 
[2024-07-23 08:50:08.700501] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:36:56.255 [2024-07-23 08:50:08.700526] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:36:56.255 [2024-07-23 08:50:08.700535] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:56.255 [2024-07-23 08:50:08.708525] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:36:56.255 [2024-07-23 08:50:08.708552] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:36:56.255 [2024-07-23 08:50:08.708561] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:56.514 Running I/O for 5 seconds... 00:37:01.779 00:37:01.779 Latency(us) 00:37:01.779 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:37:01.779 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:37:01.779 Verification LBA range: start 0x0 length 0x1000 00:37:01.779 crypto_ram : 5.06 607.44 2.37 0.00 0.00 210353.13 8426.06 140808.78 00:37:01.779 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:37:01.779 Verification LBA range: start 0x1000 length 0x1000 00:37:01.779 crypto_ram : 5.06 607.32 2.37 0.00 0.00 210404.42 9549.53 139810.13 00:37:01.779 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:37:01.779 Verification LBA range: start 0x0 length 0x1000 00:37:01.779 crypto_ram1 : 5.06 607.34 2.37 0.00 0.00 209851.59 8800.55 127826.41 00:37:01.779 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:37:01.779 Verification LBA range: start 0x1000 length 0x1000 00:37:01.779 crypto_ram1 : 5.06 607.05 2.37 0.00 0.00 209888.81 11047.50 127327.09 00:37:01.779 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:37:01.779 
Verification LBA range: start 0x0 length 0x1000 00:37:01.779 crypto_ram2 : 5.04 4758.08 18.59 0.00 0.00 26683.84 3417.23 22968.81 00:37:01.779 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:37:01.779 Verification LBA range: start 0x1000 length 0x1000 00:37:01.779 crypto_ram2 : 5.05 4765.23 18.61 0.00 0.00 26661.37 5554.96 22968.81 00:37:01.779 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:37:01.779 Verification LBA range: start 0x0 length 0x1000 00:37:01.779 crypto_ram3 : 5.05 4765.16 18.61 0.00 0.00 26605.34 3198.78 23468.13 00:37:01.779 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:37:01.779 Verification LBA range: start 0x1000 length 0x1000 00:37:01.779 crypto_ram3 : 5.05 4762.37 18.60 0.00 0.00 26604.99 6584.81 23468.13 00:37:01.779 =================================================================================================================== 00:37:01.779 Total : 21479.99 83.91 0.00 0.00 47423.07 3198.78 140808.78 00:37:04.306 00:37:04.306 real 0m11.165s 00:37:04.306 user 0m20.942s 00:37:04.306 sys 0m0.471s 00:37:04.306 08:50:16 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:37:04.306 08:50:16 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:37:04.306 ************************************ 00:37:04.306 END TEST bdev_verify 00:37:04.307 ************************************ 00:37:04.307 08:50:16 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:37:04.307 08:50:16 blockdev_crypto_qat -- bdev/blockdev.sh@777 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:37:04.307 08:50:16 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:37:04.307 08:50:16 blockdev_crypto_qat -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:37:04.307 08:50:16 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:37:04.307 ************************************ 00:37:04.307 START TEST bdev_verify_big_io 00:37:04.307 ************************************ 00:37:04.307 08:50:16 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:37:04.307 [2024-07-23 08:50:16.516481] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:37:04.307 [2024-07-23 08:50:16.516569] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1698044 ] 00:37:04.307 [2024-07-23 08:50:16.639154] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:37:04.564 [2024-07-23 08:50:16.852729] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:37:04.564 [2024-07-23 08:50:16.852731] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:37:04.564 [2024-07-23 08:50:16.874049] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:37:04.564 [2024-07-23 08:50:16.882066] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:37:04.564 [2024-07-23 08:50:16.890100] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:37:04.821 [2024-07-23 08:50:17.206916] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:37:08.129 [2024-07-23 08:50:19.908880] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key 
"test_dek_qat_cbc" 00:37:08.129 [2024-07-23 08:50:19.908945] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:37:08.129 [2024-07-23 08:50:19.908957] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:37:08.129 [2024-07-23 08:50:19.916897] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:37:08.129 [2024-07-23 08:50:19.916929] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:37:08.129 [2024-07-23 08:50:19.916939] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:37:08.129 [2024-07-23 08:50:19.924916] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:37:08.129 [2024-07-23 08:50:19.924943] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:37:08.129 [2024-07-23 08:50:19.924952] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:37:08.129 [2024-07-23 08:50:19.932939] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:37:08.129 [2024-07-23 08:50:19.932967] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:37:08.129 [2024-07-23 08:50:19.932977] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:37:08.129 Running I/O for 5 seconds... 00:37:08.401 [2024-07-23 08:50:20.743464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.743771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.743827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.401 [2024-07-23 08:50:20.743861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.743895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.743926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.744190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.744208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.746925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.746972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.747008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.747040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.747393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.747439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.747476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.747508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.401 [2024-07-23 08:50:20.747773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.747789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.750440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.750498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.750536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.750567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.750939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.750977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.751022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.751065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.751342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.751356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.753911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.401 [2024-07-23 08:50:20.753956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.753989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.754019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.754343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.754384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.754416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.754457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.754828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.754843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.757389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.757445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.757488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.757530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.401 [2024-07-23 08:50:20.757931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.757967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.757998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.758027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.758369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.758385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.760935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.760988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.761022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.761053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.761434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.761472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.761507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.401 [2024-07-23 08:50:20.761538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.761871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.761886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.764235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.764293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.764326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.764357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.764729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.764832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.764866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.764897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.765223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.765241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.401 [2024-07-23 08:50:20.767736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.767782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.767813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.767845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.768216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.768256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.768288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.768319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.768639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.768654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.771088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.771134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.771166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.401 [2024-07-23 08:50:20.771197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.771570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.771631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.771666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.771697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.772052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.772067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.774666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.774711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.401 [2024-07-23 08:50:20.774743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.402 [2024-07-23 08:50:20.774777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.402 [2024-07-23 08:50:20.775122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.402 [2024-07-23 08:50:20.775162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.402 [2024-07-23 08:50:20.775196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.402 [2024-07-23 08:50:20.775230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.402 [2024-07-23 08:50:20.775576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.402 [2024-07-23 08:50:20.775594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.402 [2024-07-23 08:50:20.777995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.402 [2024-07-23 08:50:20.778053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.402 [2024-07-23 08:50:20.778087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.402 [2024-07-23 08:50:20.778120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.402 [2024-07-23 08:50:20.778496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.402 [2024-07-23 08:50:20.778534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.402 [2024-07-23 08:50:20.778567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.402 [2024-07-23 08:50:20.778602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.402 [2024-07-23 08:50:20.778944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.402 [2024-07-23 08:50:20.778961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:08.405 [... the same "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 repeated continuously through 2024-07-23 08:50:20.891171 ...]
00:37:08.405 [2024-07-23 08:50:20.891477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.405 [2024-07-23 08:50:20.891774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.405 [2024-07-23 08:50:20.892430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.405 [2024-07-23 08:50:20.893220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.405 [2024-07-23 08:50:20.894071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.405 [2024-07-23 08:50:20.895131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.405 [2024-07-23 08:50:20.895370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.405 [2024-07-23 08:50:20.895384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.405 [2024-07-23 08:50:20.897413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.405 [2024-07-23 08:50:20.897725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.405 [2024-07-23 08:50:20.898025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.405 [2024-07-23 08:50:20.898318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.405 [2024-07-23 08:50:20.899150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.405 [2024-07-23 08:50:20.899992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.405 [2024-07-23 08:50:20.901044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.405 [2024-07-23 08:50:20.902114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.405 [2024-07-23 08:50:20.902383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.405 [2024-07-23 08:50:20.902398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.405 [2024-07-23 08:50:20.904157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.405 [2024-07-23 08:50:20.904465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.405 [2024-07-23 08:50:20.904763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.405 [2024-07-23 08:50:20.905056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.405 [2024-07-23 08:50:20.906421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.405 [2024-07-23 08:50:20.907575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.405 [2024-07-23 08:50:20.908656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.405 [2024-07-23 08:50:20.909645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.405 [2024-07-23 08:50:20.909909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.405 [2024-07-23 08:50:20.909923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.405 [2024-07-23 08:50:20.911742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.405 [2024-07-23 08:50:20.912045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.405 [2024-07-23 08:50:20.912339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.405 [2024-07-23 08:50:20.912636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.405 [2024-07-23 08:50:20.913719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.405 [2024-07-23 08:50:20.914771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.405 [2024-07-23 08:50:20.915830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.405 [2024-07-23 08:50:20.916244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.916495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.916509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.918448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.668 [2024-07-23 08:50:20.918758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.919053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.919933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.921249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.922315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.922852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.923977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.924221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.924235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.926190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.926498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.927157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.928008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.668 [2024-07-23 08:50:20.929298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.930063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.931077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.932008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.932249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.932263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.934357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.934836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.935680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.936701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.937962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.938727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.939587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.668 [2024-07-23 08:50:20.940645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.940891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.940905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.943130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.944249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.945423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.946483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.947320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.948190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.949253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.950322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.950624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.950639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.668 [2024-07-23 08:50:20.954000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.955194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.956261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.957215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.958321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.959381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.960421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.961018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.961406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.961421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.964252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.965394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.966517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.668 [2024-07-23 08:50:20.967190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.968529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.969591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.970336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.970642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.970976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.970992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.973730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.974802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.975212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.976070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.977439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.978447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.668 [2024-07-23 08:50:20.978747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.979041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.979360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.979375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.982076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.982693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.983878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.985006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.986343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.986657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.986951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.987242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.987604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.668 [2024-07-23 08:50:20.987624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.989990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.990966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.991840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.992929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.993572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.993892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.994186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.668 [2024-07-23 08:50:20.994479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:20.994821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:20.994836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:20.996882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:20.997746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.669 [2024-07-23 08:50:20.998819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:20.999883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:21.000478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:21.000785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:21.001078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:21.001377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:21.001662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:21.001678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:21.003968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:21.005019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:21.006081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:21.006799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:21.007417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.669 [2024-07-23 08:50:21.007721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:21.008018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:21.008549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:21.008822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:21.008837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:21.011321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:21.012478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:21.013516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:21.013820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:21.014453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:21.014759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:21.015054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:21.016053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.669 [2024-07-23 08:50:21.016299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:21.016313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:21.018744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:21.019867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:21.020171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:21.020466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:21.021120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:21.021420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:21.022563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:21.023603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:21.023850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:21.023864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.669 [2024-07-23 08:50:21.026331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.669 [2024-07-23 08:50:21.026816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:08.672 [2024-07-23 08:50:21.126172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.672 [2024-07-23 08:50:21.126205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.672 [2024-07-23 08:50:21.126238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.672 [2024-07-23 08:50:21.126543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.672 [2024-07-23 08:50:21.126557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.672 [2024-07-23 08:50:21.128420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.672 [2024-07-23 08:50:21.128465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.672 [2024-07-23 08:50:21.128497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.672 [2024-07-23 08:50:21.128540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.672 [2024-07-23 08:50:21.128838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.672 [2024-07-23 08:50:21.128895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.672 [2024-07-23 08:50:21.128940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.672 [2024-07-23 08:50:21.128974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.672 [2024-07-23 08:50:21.129004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.672 [2024-07-23 08:50:21.129344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.672 [2024-07-23 08:50:21.129361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.672 [2024-07-23 08:50:21.131161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.672 [2024-07-23 08:50:21.131218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.672 [2024-07-23 08:50:21.131261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.672 [2024-07-23 08:50:21.131292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.672 [2024-07-23 08:50:21.131593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.672 [2024-07-23 08:50:21.131645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.672 [2024-07-23 08:50:21.131678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.672 [2024-07-23 08:50:21.131710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.672 [2024-07-23 08:50:21.131743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.672 [2024-07-23 08:50:21.132079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.672 [2024-07-23 08:50:21.132094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.672 [2024-07-23 08:50:21.133880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.672 [2024-07-23 08:50:21.133925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.672 [2024-07-23 08:50:21.133975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.672 [2024-07-23 08:50:21.134007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.672 [2024-07-23 08:50:21.134347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.672 [2024-07-23 08:50:21.134394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.672 [2024-07-23 08:50:21.134428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.672 [2024-07-23 08:50:21.134461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.134496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.134807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.134823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.136652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.673 [2024-07-23 08:50:21.136695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.136726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.136757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.137093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.137143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.137177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.137210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.137242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.137594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.137607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.139209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.139259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.139302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.673 [2024-07-23 08:50:21.139336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.139655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.139702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.139736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.139768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.139800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.140134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.140150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.141728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.141771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.141804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.141845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.142076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.673 [2024-07-23 08:50:21.142127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.142164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.142196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.142227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.142483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.142506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.143813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.143858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.143893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.143926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.144258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.144304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.144339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.673 [2024-07-23 08:50:21.144371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.144403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.144744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.144760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.146505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.146551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.146583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.146619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.146851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.146904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.146937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.146968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.146999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.673 [2024-07-23 08:50:21.147328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.147342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.148674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.148718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.148750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.148788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.149140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.149185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.149219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.149252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.149285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.149597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.149617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.673 [2024-07-23 08:50:21.151165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.151208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.151239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.151269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.151493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.151542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.151574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.151606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.151648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.151935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.151949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.153336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.153381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.673 [2024-07-23 08:50:21.153416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.153446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.153773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.153820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.153854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.153886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.153918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.154238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.154252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.673 [2024-07-23 08:50:21.155805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.155851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.155882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.155913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.674 [2024-07-23 08:50:21.156138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.156194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.156231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.156261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.156293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.156555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.156569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.157969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.158013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.158045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.158076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.158410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.158455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.674 [2024-07-23 08:50:21.158488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.158521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.158553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.158877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.158892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.160396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.160446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.161525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.161559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.161811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.161861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.161910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.161941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.674 [2024-07-23 08:50:21.161972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.162222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.162239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.163789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.163834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.163868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.164161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.164499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.164550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.164584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.164622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.164654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.674 [2024-07-23 08:50:21.164882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.674 [2024-07-23 08:50:21.164896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:08.940 [2024-07-23 08:50:21.356507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! (last message repeated ~270 times between 08:50:21.164896 and 08:50:21.356507)
00:37:08.940 [2024-07-23 08:50:21.357533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.357786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.357800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.359845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.360159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.360458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.360759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.361095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.361409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.361701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.361989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.362298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.362628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.940 [2024-07-23 08:50:21.362642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.364726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.365031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.365324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.365629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.365962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.366265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.366573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.366882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.367177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.367499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.367514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.369641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.940 [2024-07-23 08:50:21.369948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.370241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.370536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.370841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.371155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.371453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.371752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.372049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.372387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.372403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.374447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.374755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.375054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.940 [2024-07-23 08:50:21.375352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.375664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.375974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.376266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.376558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.376860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.377158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.377172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.379273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.379583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.379888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.380185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.380524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.940 [2024-07-23 08:50:21.380840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.381140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.381426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.381723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.382069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.382084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.384190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.384502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.384541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.384830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.385136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.385456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.385760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.940 [2024-07-23 08:50:21.386059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.386356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.386704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.386721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.388806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.389100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.389385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.389422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.389792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.390104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.390405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.390714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.391023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.940 [2024-07-23 08:50:21.391358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.391373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.393255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.393298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.393331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.393361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.393702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.393750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.940 [2024-07-23 08:50:21.393782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.393814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.393845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.394143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.394156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.941 [2024-07-23 08:50:21.395870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.395916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.395947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.395978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.396305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.396349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.396382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.396424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.396456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.396801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.396818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.398529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.398573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.941 [2024-07-23 08:50:21.398605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.398641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.398940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.398985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.399017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.399048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.399077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.399404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.399419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.401114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.401158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.401189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.401220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.941 [2024-07-23 08:50:21.401543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.401589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.401627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.401660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.401691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.402011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.402025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.403876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.403919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.403950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.403980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.404304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.404352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.941 [2024-07-23 08:50:21.404385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.404416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.404447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.404718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.404733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.406522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.406565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.406596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.406632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.406964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.407019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.407071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.407112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.941 [2024-07-23 08:50:21.407145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.407447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.407464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.409256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.409300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.409330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.409362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.409663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.409719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.409754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.409786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.409815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.410069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.941 [2024-07-23 08:50:21.410084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.411857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.411900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.411935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.411977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.941 [2024-07-23 08:50:21.412267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.942 [2024-07-23 08:50:21.412323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.942 [2024-07-23 08:50:21.412367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.942 [2024-07-23 08:50:21.412399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.942 [2024-07-23 08:50:21.412442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.942 [2024-07-23 08:50:21.412743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.942 [2024-07-23 08:50:21.412759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:08.942 [2024-07-23 08:50:21.414565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:08.942 [2024-07-23 08:50:21.414624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.207 [2024-07-23 08:50:21.478244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
(identical *ERROR* line repeated approximately 270 times between 08:50:21.414624 and 08:50:21.478244; intermediate duplicates omitted)
00:37:09.207 [2024-07-23 08:50:21.478259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.480638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.481600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.481894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.482181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.482480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.482786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.483116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.484021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.485037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.485269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.485284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.487656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.207 [2024-07-23 08:50:21.487983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.488277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.488568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.488922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.489236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.490248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.491356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.492391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.492626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.492642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.494449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.494781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.495075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.207 [2024-07-23 08:50:21.495371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.495728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.496570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.497424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.498484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.499553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.499947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.499963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.501619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.501922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.502215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.502507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.502780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.207 [2024-07-23 08:50:21.503636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.504684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.505744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.506271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.506507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.506522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.508188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.508493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.508796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.509206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.509440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.510638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.511748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.207 [2024-07-23 08:50:21.512764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.513541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.207 [2024-07-23 08:50:21.513802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.513817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.515674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.515977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.516269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.517263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.517501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.518666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.519806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.520488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.521350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.208 [2024-07-23 08:50:21.521589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.521603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.523485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.523815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.524739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.525553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.525792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.526825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.527255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.528277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.529412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.529650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.529665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.208 [2024-07-23 08:50:21.531658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.532517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.533329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.534344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.534607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.535039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.536010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.537082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.538072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.538305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.538320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.540547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.541377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.208 [2024-07-23 08:50:21.542393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.543410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.543670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.544582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.545409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.546426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.547447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.547735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.547754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.550395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.551400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.552406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.553148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.208 [2024-07-23 08:50:21.553382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.554214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.555224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.556234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.556624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.556954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.556969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.559431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.560472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.561530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.562022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.562297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.563339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.208 [2024-07-23 08:50:21.564365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.565185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.565477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.565824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.565842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.568467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.569553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.570043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.570910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.571144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.572187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.573018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.573311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.208 [2024-07-23 08:50:21.573598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.573885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.573900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.576381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.576963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.578072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.579194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.579428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.580532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.580835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.581121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.581405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.581754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.208 [2024-07-23 08:50:21.581773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.583839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.584948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.585966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.208 [2024-07-23 08:50:21.587068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.209 [2024-07-23 08:50:21.587302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.209 [2024-07-23 08:50:21.587613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.209 [2024-07-23 08:50:21.587903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.209 [2024-07-23 08:50:21.588189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.209 [2024-07-23 08:50:21.588502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.209 [2024-07-23 08:50:21.588858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.209 [2024-07-23 08:50:21.588873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.209 [2024-07-23 08:50:21.590673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.209 [2024-07-23 08:50:21.591510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.209 [2024-07-23 08:50:21.592529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.209 [2024-07-23 08:50:21.593624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.209 [2024-07-23 08:50:21.593930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.209 [2024-07-23 08:50:21.594239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.209 [2024-07-23 08:50:21.594524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.209 [2024-07-23 08:50:21.594813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.209 [2024-07-23 08:50:21.595104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.209 [2024-07-23 08:50:21.595350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.209 [2024-07-23 08:50:21.595364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.209 [2024-07-23 08:50:21.597575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.209 [2024-07-23 08:50:21.598581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.209 [2024-07-23 08:50:21.599605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.209 [2024-07-23 08:50:21.600235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.600554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.600866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.601157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.601445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.602090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.602360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.602373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.604639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.605777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.606857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.607145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.607479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.607785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.608082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.608452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.609324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.609556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.609569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.611908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.612958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.613253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.613546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.613886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.614190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.614476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.615646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.616687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.616919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.616933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.619274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.619831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.620133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.620418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.620783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.621081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.622017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.622833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.623852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.624085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.624099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.626104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.626415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.626710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.627001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.627353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.628089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.628911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.629863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.630431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.630672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.630686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.632493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.632792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.633169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.634069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.634301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.635394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.636365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.636858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.637690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.637923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.637937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.639787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.640085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.640372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.640671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.640991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.641293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.209 [2024-07-23 08:50:21.641578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.641868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.642156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.642476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.642491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.644549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.644863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.645160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.645446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.645789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.646091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.646376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.646673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.646981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.647318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.647333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.649406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.649710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.649998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.650282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.650628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.650931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.651224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.651515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.651807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.652113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.652128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.654169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.654463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.654754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.655042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.655318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.655629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.655924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.656211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.656496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.656834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.656850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.658827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.659123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.659412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.659709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.659989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.660291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.660583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.660877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.661168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.661531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.661546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.663564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.663878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.664171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.664464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.664809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.665112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.665403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.665699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.665995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.666305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.666319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.668419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.668726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.669014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.669299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.210 [2024-07-23 08:50:21.669648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.211 [2024-07-23 08:50:21.669952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.211 [2024-07-23 08:50:21.670245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.211 [2024-07-23 08:50:21.670536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.211 [2024-07-23 08:50:21.670830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.211 [2024-07-23 08:50:21.671172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.211 [2024-07-23 08:50:21.671188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.211 [2024-07-23 08:50:21.673283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.673587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.673884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.674171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.674451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.674759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.675048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.675332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.675620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.675950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.675966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.678017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.678315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.678353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.678645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.678981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.679296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.679581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.679871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.680155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.680503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.680517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.682615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.682911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.683207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.683247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.683538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.683849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.684138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.684430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.684724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.685002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.685017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.686803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.686852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.686884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.686916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.687260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.687319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.687351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.687400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.687445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.687821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.687836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.689679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.689722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.689754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.689784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.690069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.690122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.690156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.690187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.690218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.690499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.690514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.692383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.692428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.692471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.692501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.692803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.692864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.692897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.692941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.692990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.693314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.693332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.695217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.695272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.695303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.695335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.695638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.695692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.695726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.695758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.695789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.696103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.696117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.697993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.698037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.698079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.698110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.698449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.698507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.698541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.698572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.698601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.698927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.698942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.700697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.700751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.700782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.700812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.701097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.212 [2024-07-23 08:50:21.701150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.701184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.701216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.701251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.701580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.701598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.703379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.703424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.703455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.703484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.703826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.703872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.703905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.703937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.703970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.704281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.704295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.705989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.706032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.706065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.706097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.706427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.706473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.706506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.706537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.706580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.706917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.706933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.708519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.708573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.708604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.708640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.708952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.709014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.709047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.709080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.709111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.709445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.709459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.711196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.213 [2024-07-23 08:50:21.711239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.711276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.711319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.711671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.711720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.711753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.711785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.711817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.712118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.712132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.713726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.713770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.713801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.213 [2024-07-23 08:50:21.713833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.714061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.714111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.714144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.714182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.714216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.714500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.714514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.715881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.715935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.715986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.716030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.716358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.213 [2024-07-23 08:50:21.716407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.716443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.716476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.716508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.716788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.716804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.718361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.718405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.718436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.718476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.718712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.718766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.718801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.213 [2024-07-23 08:50:21.718836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.718867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.719098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.213 [2024-07-23 08:50:21.719112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.720505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.720549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.720580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.720616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.720948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.720996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.721031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.721067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.721099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.476 [2024-07-23 08:50:21.721407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.721421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.722938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.722997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.723028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.723059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.723283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.723332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.723365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.723396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.723429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.723660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.723675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.476 [2024-07-23 08:50:21.725047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.725090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.725123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.725154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.725477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.725523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.725556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.725588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.725634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.725967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.725983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.727419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.727462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.476 [2024-07-23 08:50:21.727494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.727525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.727856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.727910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.727949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.727980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.728010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.728277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.728291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.729709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.729753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.729784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.729816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.476 [2024-07-23 08:50:21.730138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.730182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.730234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.730268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.730298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.730655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.730671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.732087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.732130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.732160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.732190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.732473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.732523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.476 [2024-07-23 08:50:21.732557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.476 [2024-07-23 08:50:21.732588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.732624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.732870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.732883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.734373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.734418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.734454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.734487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.734810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.734865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.734903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.734934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.477 [2024-07-23 08:50:21.734965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.735287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.735303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.736693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.736736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.736772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.736804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.737069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.737121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.737153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.737184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.737215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.737473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.477 [2024-07-23 08:50:21.737487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.739007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.739051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.739083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.739115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.739395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.739449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.739481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.739512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.739544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.739872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.739888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.741201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.477 [2024-07-23 08:50:21.741245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.741277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.741307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.741536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.741582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.741619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.741661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.741692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.741918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.741940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.743418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.743463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.743498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.477 [2024-07-23 08:50:21.743532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.743842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.743895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.743939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.743972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.744003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.744339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.744354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.745704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.745749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.745779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.745814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.477 [2024-07-23 08:50:21.746057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.477 [2024-07-23 08:50:21.746110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.477 last message repeated continuously (timestamps 2024-07-23 08:50:21.746145 through 08:50:21.901083)
00:37:09.480 [2024-07-23 08:50:21.901387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.480 [2024-07-23 08:50:21.901691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.480 [2024-07-23 08:50:21.902038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.480 [2024-07-23 08:50:21.902358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.480 [2024-07-23 08:50:21.902662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.480 [2024-07-23 08:50:21.903512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.480 [2024-07-23 08:50:21.904549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.480 [2024-07-23 08:50:21.904792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.480 [2024-07-23 08:50:21.904807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.480 [2024-07-23 08:50:21.906582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.480 [2024-07-23 08:50:21.906893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.480 [2024-07-23 08:50:21.907196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.480 [2024-07-23 08:50:21.907493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.480 [2024-07-23 08:50:21.907814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.480 [2024-07-23 08:50:21.908693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.480 [2024-07-23 08:50:21.909789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.480 [2024-07-23 08:50:21.910862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.911581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.911856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.911872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.913604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.913924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.914221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.914521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.914822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.915137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.915438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.481 [2024-07-23 08:50:21.915743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.916045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.916371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.916387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.918451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.918765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.919062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.919367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.919709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.920026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.920322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.920628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.920926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.481 [2024-07-23 08:50:21.921208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.921226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.923279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.923586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.923893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.924192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.924523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.924843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.925147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.925442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.925750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.926072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.926087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.481 [2024-07-23 08:50:21.928285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.928596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.928904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.929201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.929496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.929817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.930117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.930417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.930725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.931066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.931082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.933219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.933536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.481 [2024-07-23 08:50:21.933842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.934141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.934478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.934799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.935102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.935401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.935711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.936037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.936052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.938170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.938474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.938785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.939080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.481 [2024-07-23 08:50:21.939377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.939698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.940005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.940293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.940580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.940922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.940939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.943040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.943345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.943651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.943963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.944239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.944545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.481 [2024-07-23 08:50:21.944843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.945131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.945420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.945717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.945732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.947842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.948153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.948447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.948880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.949117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.949425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.949720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.950795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.481 [2024-07-23 08:50:21.951092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.951426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.951442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.953440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.481 [2024-07-23 08:50:21.954607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.954911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.955217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.955509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.955831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.956707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.957102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.957404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.957672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.482 [2024-07-23 08:50:21.957687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.960172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.960667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.960967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.961690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.961961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.962274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.962571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.962878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.963196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.963434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.963449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.965481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.482 [2024-07-23 08:50:21.965811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.966113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.967180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.967547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.967868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.968958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.969259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.969555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.969842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.969857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.972578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.972893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.972937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.482 [2024-07-23 08:50:21.973228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.973528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.973849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.974659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.975132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.975428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.975722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.975738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.978193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.978703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.979002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.979043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.979301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.482 [2024-07-23 08:50:21.979950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.980239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.980528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.980832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.981199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.981213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.982891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.982936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.982969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.983002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.983339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.983385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.482 [2024-07-23 08:50:21.983419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.482 [2024-07-23 08:50:21.983453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:09.482 [... previous *ERROR* line repeated continuously (timestamps 08:50:21.983496 through 08:50:22.057097, log time 00:37:09.482-00:37:09.748) ...]
00:37:09.748 [2024-07-23 08:50:22.057148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.057180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.057211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.057241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.057460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.057474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.060680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.060725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.060756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.060787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.061117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.061163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.061197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.748 [2024-07-23 08:50:22.061237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.061281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.061646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.061662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.064328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.064372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.064417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.064448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.064692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.064745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.064781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.064812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.064842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.748 [2024-07-23 08:50:22.065067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.065081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.068190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.068250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.068281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.068311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.068552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.068617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.068651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.068681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.068720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.068945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.068959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.748 [2024-07-23 08:50:22.072515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.072560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.072591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.072648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.072988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.073038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.073071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.073103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.073134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.073459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.073473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.076304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.076350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.748 [2024-07-23 08:50:22.077458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.077501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.077734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.077784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.077827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.077864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.077894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.078118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.078132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.079919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.079964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.079997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.080314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.748 [2024-07-23 08:50:22.080548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.080602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.080647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.080680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.748 [2024-07-23 08:50:22.080711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.080951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.080964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.083419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.084389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.084682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.084969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.085260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.085558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.749 [2024-07-23 08:50:22.085998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.086809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.087832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.088077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.088092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.090565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.090872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.091162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.091446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.091805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.092106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.093035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.094056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.749 [2024-07-23 08:50:22.095077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.095305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.095320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.096908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.097205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.097490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.097784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.098125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.099257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.100352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.101514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.102556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.102892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.749 [2024-07-23 08:50:22.102907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.104581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.104886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.105172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.105456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.105691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.106646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.107704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.108846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.109459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.109717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.109731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.111507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.749 [2024-07-23 08:50:22.111839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.112142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.113375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.113614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.114656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.115707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.116145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.116967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.117197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.117212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.119199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.119495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.120528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.749 [2024-07-23 08:50:22.121442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.121678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.122722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.123077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.123968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.124989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.125217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.125231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.127278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.128277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.129158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.130171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.130400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.749 [2024-07-23 08:50:22.130800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.131714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.132728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.133762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.133990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.134009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.136805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.137754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.138806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.139951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.140225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.141055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.142070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.749 [2024-07-23 08:50:22.143091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.143815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.144131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.144145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.749 [2024-07-23 08:50:22.146739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.750 [2024-07-23 08:50:22.147772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.750 [2024-07-23 08:50:22.148894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.750 [2024-07-23 08:50:22.149457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.750 [2024-07-23 08:50:22.149723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.750 [2024-07-23 08:50:22.150745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.750 [2024-07-23 08:50:22.151773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.750 [2024-07-23 08:50:22.152568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:09.750 [2024-07-23 08:50:22.152860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:09.750 [2024-07-23 08:50:22.153201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.016 [... identical "Failed to get src_mbufs!" error repeated continuously through 2024-07-23 08:50:22.288918 ...] 
00:37:10.016 [2024-07-23 08:50:22.288949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.288980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.289318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.289333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.290631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.290680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.290711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.290742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.290992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.291037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.291070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.291100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.291142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.016 [2024-07-23 08:50:22.291367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.291381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.292936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.292979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.293010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.293043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.293368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.293416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.293452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.293484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.293516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.293837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.293853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.016 [2024-07-23 08:50:22.295162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.295206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.295239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.295269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.295495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.295553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.295585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.295627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.295658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.295918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.295931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.297500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.297544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.016 [2024-07-23 08:50:22.297585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.297621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.297967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.298013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.298051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.298082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.298114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.298410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.298424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.299745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.299792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.299823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.299852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.016 [2024-07-23 08:50:22.300117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.300169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.300200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.300231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.300261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.300484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.300498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.302319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.016 [2024-07-23 08:50:22.302363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.302397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.302426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.302771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.302818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.017 [2024-07-23 08:50:22.302852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.302884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.302915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.303170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.303184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.304548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.304591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.304627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.304660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.304903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.304954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.304987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.305018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.017 [2024-07-23 08:50:22.305047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.305269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.305282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.306899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.306942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.306972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.307002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.307330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.307374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.307407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.307440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.307472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.307705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.017 [2024-07-23 08:50:22.307720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.309050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.309093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.309123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.309158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.309435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.309485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.309518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.309548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.309585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.309816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.309831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.311526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.017 [2024-07-23 08:50:22.311574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.311614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.311653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.311991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.312035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.312068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.312104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.312134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.312358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.312371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.313745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.313790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.313826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.017 [2024-07-23 08:50:22.313856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.314095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.314144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.314186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.314216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.314246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.314468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.314481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.316176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.316219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.316250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.316282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.316624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.017 [2024-07-23 08:50:22.316673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.316706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.316736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.316766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.317013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.317027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.318334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.318377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.318413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.318442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.318673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.318723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.318755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.017 [2024-07-23 08:50:22.318785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.318816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.319037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.319051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.320826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.320882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.320916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.320948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.321242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.321289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.017 [2024-07-23 08:50:22.321321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.018 [2024-07-23 08:50:22.321350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.018 [2024-07-23 08:50:22.321381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.018 [2024-07-23 08:50:22.321656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.018 [2024-07-23 08:50:22.321671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.018 [2024-07-23 08:50:22.323034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.018 [2024-07-23 08:50:22.323076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.018 [2024-07-23 08:50:22.323106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.018 [2024-07-23 08:50:22.323136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.018 [2024-07-23 08:50:22.323359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.018 [2024-07-23 08:50:22.323411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.018 [2024-07-23 08:50:22.323443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.018 [2024-07-23 08:50:22.323476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.018 [2024-07-23 08:50:22.323507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.018 [2024-07-23 08:50:22.323734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.018 [2024-07-23 08:50:22.323748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.018 [2024-07-23 08:50:22.325553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:10.021 [last message repeated from 2024-07-23 08:50:22.325553 through 08:50:22.446951; duplicate log lines elided]
00:37:10.021 [2024-07-23 08:50:22.447248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.447533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.447823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.448108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.448403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.450739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.451781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.452075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.452361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.452698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.452714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.453022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.453308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.021 [2024-07-23 08:50:22.454459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.455506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.455741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.458087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.458675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.458974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.459260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.459600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.459620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.459928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.460779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.461518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.462474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.021 [2024-07-23 08:50:22.462718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.465142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.465446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.465744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.466032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.466381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.466396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.466695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.466990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.467298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.467585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.467930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.469968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.021 [2024-07-23 08:50:22.470265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.470553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.470843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.471146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.471159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.471459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.471753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.472039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.472323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.472683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.474644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.474941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.475229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.021 [2024-07-23 08:50:22.475521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.475838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.475853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.476149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.476432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.476723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.477017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.477314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.479430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.479741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.480035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.480322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.021 [2024-07-23 08:50:22.480693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.021 [2024-07-23 08:50:22.480709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.481016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.481321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.481607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.481903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.482218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.484264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.484562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.484853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.485139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.485464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.485479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.485784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.022 [2024-07-23 08:50:22.486090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.486382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.486675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.487013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.489055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.489348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.489640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.489925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.490218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.490232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.490531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.490825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.491114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.022 [2024-07-23 08:50:22.491399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.491744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.493712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.494010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.494298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.494588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.494880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.494895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.495191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.495476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.495767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.496061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.496358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.022 [2024-07-23 08:50:22.498391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.498721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.499023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.499331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.499686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.499702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.500005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.500298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.500593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.500898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.501227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.503238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.503542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.022 [2024-07-23 08:50:22.503843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.504136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.504471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.504490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.504802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.505102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.505401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.505703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.506037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.508246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.508548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.508846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.509137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.022 [2024-07-23 08:50:22.509416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.509429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.509741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.510038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.510331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.510628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.510951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.512989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.513295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.513591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.513902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.514211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.514226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.022 [2024-07-23 08:50:22.514529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.514830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.515125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.515421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.515741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.517773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.518080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.518380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.518685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.519027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.519042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.519344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.519643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.022 [2024-07-23 08:50:22.519938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.520237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.022 [2024-07-23 08:50:22.520546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.023 [2024-07-23 08:50:22.522605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.023 [2024-07-23 08:50:22.522922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.023 [2024-07-23 08:50:22.523218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.023 [2024-07-23 08:50:22.523511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.023 [2024-07-23 08:50:22.523859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.023 [2024-07-23 08:50:22.523875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.023 [2024-07-23 08:50:22.524183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.023 [2024-07-23 08:50:22.524481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.023 [2024-07-23 08:50:22.524785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.023 [2024-07-23 08:50:22.525093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.023 [2024-07-23 08:50:22.525443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... the same "accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!" message repeats continuously (only the timestamp changes) from 08:50:22.525443 through 08:50:22.613468; intermediate identical entries omitted ...]
00:37:10.287 [2024-07-23 08:50:22.613468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:10.287 [2024-07-23 08:50:22.613498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.287 [2024-07-23 08:50:22.613529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.287 [2024-07-23 08:50:22.613836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.287 [2024-07-23 08:50:22.615970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.287 [2024-07-23 08:50:22.616013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.616046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.616077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.616323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.616335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.616381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.616412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.616444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.616474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.288 [2024-07-23 08:50:22.616701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.618041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.618088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.618118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.618148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.618371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.618383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.618430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.618463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.618493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.618526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.618843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.620408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.288 [2024-07-23 08:50:22.620449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.620479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.620510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.620856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.620873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.620913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.620946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.620978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.621013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.621236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.622550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.622592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.622628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.288 [2024-07-23 08:50:22.622659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.622904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.622917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.622961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.622993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.623026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.623068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.623289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.624934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.624977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.625011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.625042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.625368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.288 [2024-07-23 08:50:22.625382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.625421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.625454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.625489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.625520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.625747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.627065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.627107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.627136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.627172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.627393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.627405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.627451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.288 [2024-07-23 08:50:22.627489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.627523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.627552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.627779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.629366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.629409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.629441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.629475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.629705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.629719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.629764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.629800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.629832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.288 [2024-07-23 08:50:22.629862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.630191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.631548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.631591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.631628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.631657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.631960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.631974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.632021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.632053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.632084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.632114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.632361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.288 [2024-07-23 08:50:22.633795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.633839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.633872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.288 [2024-07-23 08:50:22.633903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.634218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.634232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.634271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.634304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.634335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.634365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.634696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.636016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.636058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.289 [2024-07-23 08:50:22.636088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.636117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.636471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.636484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.636535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.636566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.636597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.636634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.636893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.638201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.638243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.638272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.638302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.289 [2024-07-23 08:50:22.638637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.638662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.638703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.638736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.638768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.638801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.639035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.640583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.641619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.641658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.641704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.641926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.641940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.289 [2024-07-23 08:50:22.641987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.642019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.642051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.642082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.642304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.643692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.643734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.643767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.644053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.644363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.644377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.644418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.644450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.289 [2024-07-23 08:50:22.644481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.644512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.644860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.649162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.650187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.651017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.651846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.652151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.652165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.652463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.653139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.653701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.653986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.289 [2024-07-23 08:50:22.654262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.656714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.657836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.658931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.659914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.660206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.660220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.660517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.660810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.661096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.661614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.661888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.289 [2024-07-23 08:50:22.666137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.289 [2024-07-23 08:50:22.666445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.290 [2024-07-23 08:50:22.667382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.290 [2024-07-23 08:50:22.667673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.290 [2024-07-23 08:50:22.668005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.290 [2024-07-23 08:50:22.668019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.290 [2024-07-23 08:50:22.669111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.290 [2024-07-23 08:50:22.669397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.290 [2024-07-23 08:50:22.669687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.290 [2024-07-23 08:50:22.670709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.290 [2024-07-23 08:50:22.670937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.290 [2024-07-23 08:50:22.673247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.290 [2024-07-23 08:50:22.674281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.290 [2024-07-23 08:50:22.674784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.554 [2024-07-23 08:50:22.842125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.554 [2024-07-23 08:50:22.844155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.554 [2024-07-23 08:50:22.844460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.554 [2024-07-23 08:50:22.844760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.554 [2024-07-23 08:50:22.845873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.554 [2024-07-23 08:50:22.846213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.554 [2024-07-23 08:50:22.846229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.554 [2024-07-23 08:50:22.846528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.554 [2024-07-23 08:50:22.847460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.554 [2024-07-23 08:50:22.847775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.554 [2024-07-23 08:50:22.848067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.554 [2024-07-23 08:50:22.848293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.554 [2024-07-23 08:50:22.851759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.554 [2024-07-23 08:50:22.852595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.554 [2024-07-23 08:50:22.853384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.554 [2024-07-23 08:50:22.853838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.554 [2024-07-23 08:50:22.854182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.554 [2024-07-23 08:50:22.854198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.554 [2024-07-23 08:50:22.854821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.554 [2024-07-23 08:50:22.855438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.554 [2024-07-23 08:50:22.855729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.554 [2024-07-23 08:50:22.856410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.554 [2024-07-23 08:50:22.856693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.858906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.860000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.861119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.555 [2024-07-23 08:50:22.861647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.861880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.861895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.862197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.862557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.863430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.863720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.864043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.868010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.868058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.869125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.869514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.869746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.555 [2024-07-23 08:50:22.869762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.870061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.870346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.871383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.871673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.872001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.873918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.874755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.875779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.875817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.876045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.876058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.876553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.555 [2024-07-23 08:50:22.877718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.878005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.878293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.878519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.881197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.881241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.881273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.881304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.881548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.881561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.881608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.881646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.881695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.555 [2024-07-23 08:50:22.881726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.881950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.883303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.883346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.883377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.883407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.883732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.883749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.883792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.883825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.883868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.883902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.884122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.555 [2024-07-23 08:50:22.886916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.886960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.886991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.887021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.887242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.887255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.887298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.887335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.887366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.887402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.887626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.889003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.889046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.555 [2024-07-23 08:50:22.889077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.889111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.889429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.889443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.889483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.889528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.889565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.889595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.889820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.892548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.892592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.892627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.892663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.555 [2024-07-23 08:50:22.892884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.892896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.892944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.892978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.893012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.893042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.893262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.894668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.894710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.894744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.894775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.555 [2024-07-23 08:50:22.895102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.895115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.556 [2024-07-23 08:50:22.895159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.895191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.895223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.895253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.895477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.898161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.898209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.898243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.898273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.898498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.898512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.898559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.898591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.556 [2024-07-23 08:50:22.898627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.898657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.898877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.900291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.900334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.900369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.900400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.900711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.900726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.900770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.900801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.900833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.900862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.556 [2024-07-23 08:50:22.901130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.903861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.903906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.903935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.903964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.904207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.904220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.904268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.904300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.904331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.904364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.904584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.906102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.556 [2024-07-23 08:50:22.906145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.906177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.906208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.906476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.906490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.906532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.906564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.906595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.906630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.906949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.909643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.909685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.556 [2024-07-23 08:50:22.909714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.556 [2024-07-23 08:50:22.909743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... identical "Failed to get src_mbufs!" errors repeated through 2024-07-23 08:50:23.010909; duplicate entries elided ...]
00:37:10.559 [2024-07-23 08:50:23.011213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.559 [2024-07-23 08:50:23.012242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.559 [2024-07-23 08:50:23.013192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.559 [2024-07-23 08:50:23.013433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.559 [2024-07-23 08:50:23.013448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.559 [2024-07-23 08:50:23.014521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.559 [2024-07-23 08:50:23.014913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.559 [2024-07-23 08:50:23.015784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.559 [2024-07-23 08:50:23.016857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.559 [2024-07-23 08:50:23.017097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.559 [2024-07-23 08:50:23.019994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.559 [2024-07-23 08:50:23.020866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.559 [2024-07-23 08:50:23.021728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.559 [2024-07-23 08:50:23.022750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.559 [2024-07-23 08:50:23.022990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.559 [2024-07-23 08:50:23.023004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.559 [2024-07-23 08:50:23.023539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.559 [2024-07-23 08:50:23.024624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.559 [2024-07-23 08:50:23.025640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.559 [2024-07-23 08:50:23.026768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.559 [2024-07-23 08:50:23.027013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.559 [2024-07-23 08:50:23.030123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.559 [2024-07-23 08:50:23.030986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.559 [2024-07-23 08:50:23.032048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.559 [2024-07-23 08:50:23.033124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.560 [2024-07-23 08:50:23.033389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.560 [2024-07-23 08:50:23.033402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.560 [2024-07-23 08:50:23.034393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.560 [2024-07-23 08:50:23.035297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.560 [2024-07-23 08:50:23.036346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.560 [2024-07-23 08:50:23.037389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.560 [2024-07-23 08:50:23.037762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.560 [2024-07-23 08:50:23.042165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.560 [2024-07-23 08:50:23.043226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.560 [2024-07-23 08:50:23.044280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.560 [2024-07-23 08:50:23.045057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.560 [2024-07-23 08:50:23.045299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.560 [2024-07-23 08:50:23.045315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.560 [2024-07-23 08:50:23.046184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.560 [2024-07-23 08:50:23.047208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.560 [2024-07-23 08:50:23.048259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.560 [2024-07-23 08:50:23.048624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.560 [2024-07-23 08:50:23.048862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.560 [2024-07-23 08:50:23.052066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.560 [2024-07-23 08:50:23.053205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.560 [2024-07-23 08:50:23.053509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.560 [2024-07-23 08:50:23.054543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.560 [2024-07-23 08:50:23.054786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.560 [2024-07-23 08:50:23.054804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.560 [2024-07-23 08:50:23.055807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.560 [2024-07-23 08:50:23.056490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.560 [2024-07-23 08:50:23.057086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.560 [2024-07-23 08:50:23.057385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.560 [2024-07-23 08:50:23.057682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.560 [2024-07-23 08:50:23.061301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.560 [2024-07-23 08:50:23.062484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.560 [2024-07-23 08:50:23.063569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.560 [2024-07-23 08:50:23.064757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.560 [2024-07-23 08:50:23.065000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.560 [2024-07-23 08:50:23.065014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.560 [2024-07-23 08:50:23.065423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.560 [2024-07-23 08:50:23.066289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.560 [2024-07-23 08:50:23.066583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.560 [2024-07-23 08:50:23.066914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.560 [2024-07-23 08:50:23.067153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.822 [2024-07-23 08:50:23.070711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.071579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.072589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.073668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.073987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.074001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.075115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.075408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.075699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.076732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.077070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.080815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.081900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.822 [2024-07-23 08:50:23.082959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.083735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.084046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.084060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.084895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.085190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.085594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.086472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.086821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.090264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.090564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.090865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.091161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.822 [2024-07-23 08:50:23.091483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.091497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.092529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.092821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.093108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.094164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.094539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.098802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.099106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.099392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.099697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.099992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.100006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.822 [2024-07-23 08:50:23.100796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.101233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.101518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.102235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.102517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.105531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.106211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.106498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.106794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.107114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.107129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.107432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.108374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.822 [2024-07-23 08:50:23.108664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.108950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.822 [2024-07-23 08:50:23.109184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.111660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.112787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.113081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.113367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.113678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.113693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.113993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.114817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.115229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.115515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.823 [2024-07-23 08:50:23.115792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.118434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.118984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.119683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.119970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.120299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.120313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.120621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.120917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.121900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.122192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.122521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.125916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.823 [2024-07-23 08:50:23.126211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.127228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.127523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.127860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.127886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.128183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.128473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.129194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.129714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.130048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.133382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.133684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.134144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.823 [2024-07-23 08:50:23.134924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.135278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.135294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.135592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.135893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.136192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.137267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.137644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.140754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.141053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.141339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.142224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.823 [2024-07-23 08:50:23.142528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.823 [2024-07-23 08:50:23.142541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:10.826 [the identical error message repeated roughly 270 more times between 08:50:23.142541 and 08:50:23.289414; only the timestamps differ, so the repeats are elided here]
00:37:10.826 [2024-07-23 08:50:23.289444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.826 [2024-07-23 08:50:23.289682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.826 [2024-07-23 08:50:23.289696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.826 [2024-07-23 08:50:23.289745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.826 [2024-07-23 08:50:23.289779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.826 [2024-07-23 08:50:23.289811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.826 [2024-07-23 08:50:23.289842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.826 [2024-07-23 08:50:23.290127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.826 [2024-07-23 08:50:23.293769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.826 [2024-07-23 08:50:23.293814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.826 [2024-07-23 08:50:23.293846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.826 [2024-07-23 08:50:23.293880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.826 [2024-07-23 08:50:23.294231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.826 [2024-07-23 08:50:23.294246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.826 [2024-07-23 08:50:23.294292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.826 [2024-07-23 08:50:23.294326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.826 [2024-07-23 08:50:23.294368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.826 [2024-07-23 08:50:23.294405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.826 [2024-07-23 08:50:23.294644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.826 [2024-07-23 08:50:23.297546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.826 [2024-07-23 08:50:23.297592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.826 [2024-07-23 08:50:23.297630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.826 [2024-07-23 08:50:23.297662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.826 [2024-07-23 08:50:23.297894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.826 [2024-07-23 08:50:23.297907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.826 [2024-07-23 08:50:23.297958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.826 [2024-07-23 08:50:23.297992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.826 [2024-07-23 08:50:23.298030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.826 [2024-07-23 08:50:23.298063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.826 [2024-07-23 08:50:23.298295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.301377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.301421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.301452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.301490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.301870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.301887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.301932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.301966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.301999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.827 [2024-07-23 08:50:23.302031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.302336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.305229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.305273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.305322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.305356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.305583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.305596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.305649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.305682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.305714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.305744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.306076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.827 [2024-07-23 08:50:23.308264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.308316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.308348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.308378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.308604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.308622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.308673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.308706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.308737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.308768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.309082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.312353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.312400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.827 [2024-07-23 08:50:23.312432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.312475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.312839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.312855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.312900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.312951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.312983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.313025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.313330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.316123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.316190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.316222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.316253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.827 [2024-07-23 08:50:23.316477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.316490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.316536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.316568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.316599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.316636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.316904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.319098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.319147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.319181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.319212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.319437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.827 [2024-07-23 08:50:23.319450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.827 [2024-07-23 08:50:23.319496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.319529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.319559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.319590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.319910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.323046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.323089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.323130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.323160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.323499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.323513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.323558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.323590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.828 [2024-07-23 08:50:23.323627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.323658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.323938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.326632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.326696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.326728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.326758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.326982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.326994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.327040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.327073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.327104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.327134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.828 [2024-07-23 08:50:23.327406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.329567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.329625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.329657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.329688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.329911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.329924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.329970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.330006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.330038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.330069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.330393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.333748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:10.828 [2024-07-23 08:50:23.333794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.333831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.333875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.334240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.334257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.334301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.334335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.334369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.334402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:10.828 [2024-07-23 08:50:23.334748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.090 [2024-07-23 08:50:23.337637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.090 [2024-07-23 08:50:23.337695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.090 [2024-07-23 08:50:23.337727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.090 [2024-07-23 08:50:23.337759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.090 [2024-07-23 08:50:23.337992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.090 [2024-07-23 08:50:23.338005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.090 [2024-07-23 08:50:23.338055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.090 [2024-07-23 08:50:23.338088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.090 [2024-07-23 08:50:23.338121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.090 [2024-07-23 08:50:23.338152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.090 [2024-07-23 08:50:23.338439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.090 [2024-07-23 08:50:23.340698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.090 [2024-07-23 08:50:23.340745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.090 [2024-07-23 08:50:23.340775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.090 [2024-07-23 08:50:23.340806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.090 [2024-07-23 08:50:23.341034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.090 [2024-07-23 08:50:23.341050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.090 [2024-07-23 08:50:23.341096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.090 [2024-07-23 08:50:23.341130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.090 [2024-07-23 08:50:23.341160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.090 [2024-07-23 08:50:23.341191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.090 [2024-07-23 08:50:23.341484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.090 [2024-07-23 08:50:23.344651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.090 [2024-07-23 08:50:23.344696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.090 [2024-07-23 08:50:23.344728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.090 [2024-07-23 08:50:23.344772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.090 [2024-07-23 08:50:23.345153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.090 [2024-07-23 08:50:23.345168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.090 [2024-07-23 08:50:23.345213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.090 [2024-07-23 08:50:23.345249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [message repeated, last occurrence 2024-07-23 08:50:23.494518]
00:37:11.093 [2024-07-23 08:50:23.494533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.495573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.495971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.496806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.497828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.498058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.500091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.501153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.502059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.503120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.503359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.503373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.503775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.093 [2024-07-23 08:50:23.504671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.505784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.506860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.507107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.509845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.510717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.511792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.512897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.513247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.513261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.514186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.515241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.516326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.093 [2024-07-23 08:50:23.517121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.517452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.520067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.521130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.522207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.522599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.522841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.522856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.524041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.525161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.526178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.526472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.526802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.093 [2024-07-23 08:50:23.529496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.530559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.531092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.532171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.532401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.532415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.533500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.534642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.093 [2024-07-23 08:50:23.534936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.535226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.535535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.538148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.538899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.094 [2024-07-23 08:50:23.539917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.540822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.541055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.541069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.542120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.542414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.542705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.542990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.543353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.545783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.546570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.547440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.548582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.094 [2024-07-23 08:50:23.548823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.548838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.549425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.549740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.550034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.550338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.550688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.552407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.553241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.554246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.555264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.555522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.555537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.094 [2024-07-23 08:50:23.555845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.556132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.556414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.556710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.556973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.559321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.560381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.561404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.562257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.562572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.562586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.562889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.563176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.094 [2024-07-23 08:50:23.563461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.564103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.564364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.566698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.567882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.568958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.569245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.569578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.569593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.569896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.570184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.570621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.571429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.094 [2024-07-23 08:50:23.571665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.573981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.575017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.575312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.575600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.575920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.575936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.576234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.576521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.577535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.578677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.578906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.094 [2024-07-23 08:50:23.581342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.095 [2024-07-23 08:50:23.581389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.581685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.581970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.582305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.582319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.582621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.582908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.583995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.585078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.585307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.587668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.588182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.588474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.095 [2024-07-23 08:50:23.588512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.588858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.588874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.589172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.589457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.590511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.591478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.591714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.593087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.593130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.593160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.593190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.593424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.095 [2024-07-23 08:50:23.593437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.593482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.593522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.593554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.593584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.593911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.595662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.595705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.595734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.595771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.595992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.596004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.596049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.095 [2024-07-23 08:50:23.596081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.596126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.596160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.596379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.597765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.597808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.597844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.597877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.598099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.598112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.598157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.598190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.095 [2024-07-23 08:50:23.598221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.095 [2024-07-23 08:50:23.598253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:11.360 [... same *ERROR* line repeated for successive tasks between 08:50:23.598253 and 08:50:23.655840; repeats elided ...]
00:37:11.360 [2024-07-23 08:50:23.655840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:11.360 [2024-07-23 08:50:23.655872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.360 [2024-07-23 08:50:23.655902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.360 [2024-07-23 08:50:23.655931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.360 [2024-07-23 08:50:23.656151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.360 [2024-07-23 08:50:23.658036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.360 [2024-07-23 08:50:23.658082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.360 [2024-07-23 08:50:23.658114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.360 [2024-07-23 08:50:23.658144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.360 [2024-07-23 08:50:23.658401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.360 [2024-07-23 08:50:23.658413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.360 [2024-07-23 08:50:23.658461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.360 [2024-07-23 08:50:23.658494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.360 [2024-07-23 08:50:23.658524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.361 [2024-07-23 08:50:23.658563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.658794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.660139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.660182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.660212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.660241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.660462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.660474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.660519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.660551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.660583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.660630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.660849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.361 [2024-07-23 08:50:23.662625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.663353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.663391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.663422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.663683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.663698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.663747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.663780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.663811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.663861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.664085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.665454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.665497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.361 [2024-07-23 08:50:23.665527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.666580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.666920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.666935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.666983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.667015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.667046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.667076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.667395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.669991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.671033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.671578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.672694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.361 [2024-07-23 08:50:23.672921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.672935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.673983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.675107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.675399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.675689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.676018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.678568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.679386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.680049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.680868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.681095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.681114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.361 [2024-07-23 08:50:23.682151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.682792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.683081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.683365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.683675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.685633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.685929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.686217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.686503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.686808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.686823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.687124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.687412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.361 [2024-07-23 08:50:23.687703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.687989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.688316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.690344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.690649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.690943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.691244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.691582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.691597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.691901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.692191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.692475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.692773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.361 [2024-07-23 08:50:23.693060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.695150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.695450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.695744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.696029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.696346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.696364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.696662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.696950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.697239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.697525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.697852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.699873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.361 [2024-07-23 08:50:23.700169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.361 [2024-07-23 08:50:23.700454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.700744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.701076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.701090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.701389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.701688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.701987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.702269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.702596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.704558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.704859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.705144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.362 [2024-07-23 08:50:23.705431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.705724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.705739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.706037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.706321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.706604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.706893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.707204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.709239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.709542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.709846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.710155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.710497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.362 [2024-07-23 08:50:23.710512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.710813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.711109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.711392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.711689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.711981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.714062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.714362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.714674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.714970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.715275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.715289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.715582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.362 [2024-07-23 08:50:23.715877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.716169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.716458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.716786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.718798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.719096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.719382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.719676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.720019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.720034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.720330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.720628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.720916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.362 [2024-07-23 08:50:23.721200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.721523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.723511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.723811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.724099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.724386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.724684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.724699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.724997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.725282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.725564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.725853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.362 [2024-07-23 08:50:23.726165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.362 [2024-07-23 08:50:23.728199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! [... identical "Failed to get src_mbufs!" errors repeated through 2024-07-23 08:50:23.884969; duplicates omitted ...] 00:37:11.628 [2024-07-23 08:50:23.884969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:11.628 [2024-07-23 08:50:23.885000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.628 [2024-07-23 08:50:23.885229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.886620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.886663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.886694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.886724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.886952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.886965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.887017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.887056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.887090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.887122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.887442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.629 [2024-07-23 08:50:23.889187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.889230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.889260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.889290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.889529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.889542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.889587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.889626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.889680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.889719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.889948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.891330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.891372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.629 [2024-07-23 08:50:23.891411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.891454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.891682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.891696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.891740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.891773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.891804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.891835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.892101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.893818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.893859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.893893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.893922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.629 [2024-07-23 08:50:23.894160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.894172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.894218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.894251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.894288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.894322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.894543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.895884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.895928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.895958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.895994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.896217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.896230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.629 [2024-07-23 08:50:23.896283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.896316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.896347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.896378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.896719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.898495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.898538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.898575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.898607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.898841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.898854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.898898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.898940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.629 [2024-07-23 08:50:23.898972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.899004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.899233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.900600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.900653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.900689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.900720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.900948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.900962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.901008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.901041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.901072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.901105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.629 [2024-07-23 08:50:23.901454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.903218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.903260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.903302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.903333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.903567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.903580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.903634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.903673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.903704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.903736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.903966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.905342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.629 [2024-07-23 08:50:23.905406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.905440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.905472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.629 [2024-07-23 08:50:23.905708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.905722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.905771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.905805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.905837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.905869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.906216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.907943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.907986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.908022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.630 [2024-07-23 08:50:23.908053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.908281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.908294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.908341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.908374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.908405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.908436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.908673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.910022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.910068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.910102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.910133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.910426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.630 [2024-07-23 08:50:23.910439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.910484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.910516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.910547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.910591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.910957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.912664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.912713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.912753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.912786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.913013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.913027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.913073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.630 [2024-07-23 08:50:23.913105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.913137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.913168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.913395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.914773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.914820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.914853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.914883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.915136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.915152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.915200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.915235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.915279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.630 [2024-07-23 08:50:23.915312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.915661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.917349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.917397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.917430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.917460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.917697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.917711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.917759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.917792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.917824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.917864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.918091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.630 [2024-07-23 08:50:23.919471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.919515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.919547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.919578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.919859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.919874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.919930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.919976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.920009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.920041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.920374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.922053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.630 [2024-07-23 08:50:23.922100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.630 [2024-07-23 08:50:23.922137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:11.633 [previous *ERROR* line repeated continuously, timestamps 2024-07-23 08:50:23.922168 through 08:50:24.008864]
00:37:11.633 [2024-07-23 08:50:24.010926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.633 [2024-07-23 08:50:24.011230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.633 [2024-07-23 08:50:24.011524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.633 [2024-07-23 08:50:24.011826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.012149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.012163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.012469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.012769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.013068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.013365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.013714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.015755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.016058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.634 [2024-07-23 08:50:24.016353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.016658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.016996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.017011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.017314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.017615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.017912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.018378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.018641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.020926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.022008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.023181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.023481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.634 [2024-07-23 08:50:24.023828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.023845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.024154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.024450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.024748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.025777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.026014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.028382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.029464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.029922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.030213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.030556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.030574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.634 [2024-07-23 08:50:24.030878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.031168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.032068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.032890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.033120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.035505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.036195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.036486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.036780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.037068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.037082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.037380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.038062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.634 [2024-07-23 08:50:24.038883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.039820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.040039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.042421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.042727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.043016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.043303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.043665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.043681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.043983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.044952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.046066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.047101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.634 [2024-07-23 08:50:24.047337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.049858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.050165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.050462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.050768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.051124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.051139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.051443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.052434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.053551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.054629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.054866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.056680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.634 [2024-07-23 08:50:24.056996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.057290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.057594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.057935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.057951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.058852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.059705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.060764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.061841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.062141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.063808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.064125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.064420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.634 [2024-07-23 08:50:24.064721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.065003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.065018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.065875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.634 [2024-07-23 08:50:24.066903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.067970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.068531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.068772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.070404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.070723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.071019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.071311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.071545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.635 [2024-07-23 08:50:24.071560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.072660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.073846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.074915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.075626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.075900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.077749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.078066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.078363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.079434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.079723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.079738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.080824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.635 [2024-07-23 08:50:24.081890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.082316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.083175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.083413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.085222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.085533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.086224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.087081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.087321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.087335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.088409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.089064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.090230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.635 [2024-07-23 08:50:24.091344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.091586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.093579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.094092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.094952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.095992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.096228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.096242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.097155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.098032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.098854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.099858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.100092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.635 [2024-07-23 08:50:24.102147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.103286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.104321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.105437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.105692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.105707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.106196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.107071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.108137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.109149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.109439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.112103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.112945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.635 [2024-07-23 08:50:24.114004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.115082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.115383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.115397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.116373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.117418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.118489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.119372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.119679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.122224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.123292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.124368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.635 [2024-07-23 08:50:24.125008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.635 [2024-07-23 08:50:24.125245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.901 (previous message repeated 272 times between 08:50:24.125259 and 08:50:24.209186)
00:37:11.901 [2024-07-23 08:50:24.209219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.901 [2024-07-23 08:50:24.209250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.901 [2024-07-23 08:50:24.209474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.901 [2024-07-23 08:50:24.209488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.901 [2024-07-23 08:50:24.209538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.901 [2024-07-23 08:50:24.209572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.901 [2024-07-23 08:50:24.209603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.901 [2024-07-23 08:50:24.209639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.901 [2024-07-23 08:50:24.209916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.901 [2024-07-23 08:50:24.211801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.901 [2024-07-23 08:50:24.211847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.901 [2024-07-23 08:50:24.211880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.901 [2024-07-23 08:50:24.211912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.902 [2024-07-23 08:50:24.212146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.212159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.212209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.212243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.212274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.212304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.212527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.213950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.213993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.214023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.214053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.214278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.214291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.902 [2024-07-23 08:50:24.214338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.214371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.214402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.214434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.214798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.216822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.216865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.216899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.216929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.217176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.217189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.217234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.217296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.902 [2024-07-23 08:50:24.217328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.217359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.217583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.219029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.219074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.219104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.219134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.219357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.219369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.219420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.219452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.219482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.219519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.902 [2024-07-23 08:50:24.219925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.221961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.222016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.222064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.222108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.222393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.222408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.222458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.222492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.222524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.222556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.222874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.224747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.902 [2024-07-23 08:50:24.224791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.224833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.224863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.225164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.225178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.225243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.225288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.225318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.225349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.225684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.227474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.227534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.227580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.902 [2024-07-23 08:50:24.227617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.227904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.227918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.227969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.228002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.228034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.228066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.228409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.230150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.230196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.230243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.230277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.230621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.902 [2024-07-23 08:50:24.230637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.230684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.230718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.230751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.230783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.231073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.232855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.232897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.232928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.232957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.233278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.233293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.233334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.902 [2024-07-23 08:50:24.233367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.902 [2024-07-23 08:50:24.233399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.233430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.233753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.235493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.235794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.235833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.235877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.236230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.236246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.236297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.236331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.236366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.903 [2024-07-23 08:50:24.236398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.236752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.238589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.238640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.238671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.238953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.239273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.239286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.239357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.239402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.239433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.239466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.239754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.903 [2024-07-23 08:50:24.241882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.242189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.242488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.242800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.243121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.243136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.243433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.243744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.244043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.244343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.244694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.246765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.247064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.903 [2024-07-23 08:50:24.247352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.247664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.248005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.248021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.248331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.248640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.248951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.249238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.249581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.251731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.252033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.252337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.252641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.903 [2024-07-23 08:50:24.252961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.252977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.253279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.253572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.253867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.254157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.254496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.256381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.256923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.257668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.257975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.258287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:11.903 [2024-07-23 08:50:24.258301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:11.903 [2024-07-23 08:50:24.258628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:12.168 [... identical "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 repeated continuously through 2024-07-23 08:50:24.432628; repeats elided ...]
00:37:12.168 [2024-07-23 08:50:24.432857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.168 [2024-07-23 08:50:24.432873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.168 [2024-07-23 08:50:24.433899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.168 [2024-07-23 08:50:24.434759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.168 [2024-07-23 08:50:24.435631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.436456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.436693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.438607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.438929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.439847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.440868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.441096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.441111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.169 [2024-07-23 08:50:24.442125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.442868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.443693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.444715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.444944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.447033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.447082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.447967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.448985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.449211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.449225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.450241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.450968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.169 [2024-07-23 08:50:24.451779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.452795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.453024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.455097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.456129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.457256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.457297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.457524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.457538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.458574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.459288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.460114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.461123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.169 [2024-07-23 08:50:24.461354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.463178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.463229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.463260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.463292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.463572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.463585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.463635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.463667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.463698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.463727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.463973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.465390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.169 [2024-07-23 08:50:24.465432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.465463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.465493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.465721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.465735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.465782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.465814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.465851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.465887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.466106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.467917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.467960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.467992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.169 [2024-07-23 08:50:24.468030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.468252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.468265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.468313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.468345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.468380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.468415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.468643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.470012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.470065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.470097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.470127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.470349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.169 [2024-07-23 08:50:24.470364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.470410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.470442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.470473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.470503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.470790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.472839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.472882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.472912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.472942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.473188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.473201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.473250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.169 [2024-07-23 08:50:24.473283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.473313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.473344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.473566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.474971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.475014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.169 [2024-07-23 08:50:24.475044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.475074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.475294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.475307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.475354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.475397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.475429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.170 [2024-07-23 08:50:24.475461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.475747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.477497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.477539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.477581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.477621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.477842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.477855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.477897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.477936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.477972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.478002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.478222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.170 [2024-07-23 08:50:24.479614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.479691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.479736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.479768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.480009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.480023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.480070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.480105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.480137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.480182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.480523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.482283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.482331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.170 [2024-07-23 08:50:24.482362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.482393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.482631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.482648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.482699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.482733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.482764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.482795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.483022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.484432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.484474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.484504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.484534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.170 [2024-07-23 08:50:24.484849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.484863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.484909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.484941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.484973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.485002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.485333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.487031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.487073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.487103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.487133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.487354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.487368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.170 [2024-07-23 08:50:24.487415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.487448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.487485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.487519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.487796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.489368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.489412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.489450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.489481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.489797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.489813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.489855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.489887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.170 [2024-07-23 08:50:24.489917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.489949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.490271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.491643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.491685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.491723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.491754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.492028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.492042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.492089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.492122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.492153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.170 [2024-07-23 08:50:24.492183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.435 [2024-07-23 08:50:24.695020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.435 [2024-07-23 08:50:24.696044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.435 [2024-07-23 08:50:24.697039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.435 [2024-07-23 08:50:24.697369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.435 [2024-07-23 08:50:24.697385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.435 [2024-07-23 08:50:24.697395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.435 [2024-07-23 08:50:24.699907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.435 [2024-07-23 08:50:24.700927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.435 [2024-07-23 08:50:24.701940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.435 [2024-07-23 08:50:24.702721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.435 [2024-07-23 08:50:24.702962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.435 [2024-07-23 08:50:24.703801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.435 [2024-07-23 08:50:24.704818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.435 [2024-07-23 08:50:24.705835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.435 [2024-07-23 08:50:24.706361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.435 [2024-07-23 08:50:24.706712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.435 [2024-07-23 08:50:24.706728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.435 [2024-07-23 08:50:24.706740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.435 [2024-07-23 08:50:24.709408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.435 [2024-07-23 08:50:24.710462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.435 [2024-07-23 08:50:24.711522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.435 [2024-07-23 08:50:24.712027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.435 [2024-07-23 08:50:24.712301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.435 [2024-07-23 08:50:24.713376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.435 [2024-07-23 08:50:24.714398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.435 [2024-07-23 08:50:24.715357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.435 [2024-07-23 08:50:24.715651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.435 [2024-07-23 08:50:24.715976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.435 [2024-07-23 08:50:24.715991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.435 [2024-07-23 08:50:24.716002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.435 [2024-07-23 08:50:24.718554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.435 [2024-07-23 08:50:24.719538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.435 [2024-07-23 08:50:24.720059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.435 [2024-07-23 08:50:24.721159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.435 [2024-07-23 08:50:24.721390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.435 [2024-07-23 08:50:24.722408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.435 [2024-07-23 08:50:24.723379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.435 [2024-07-23 08:50:24.723680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.435 [2024-07-23 08:50:24.723970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.435 [2024-07-23 08:50:24.724290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.724306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.724315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.726850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.727753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.728555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.729368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.729599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.730622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.731231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.731518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.731811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.732131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.436 [2024-07-23 08:50:24.732145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.732155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.734557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.734959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.735804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.736795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.737027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.738084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.738371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.738662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.738948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.739281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.739297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.436 [2024-07-23 08:50:24.739307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.741379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.742401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.743280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.744301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.744539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.745021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.745310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.745593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.745899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.746231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.746246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.746256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.436 [2024-07-23 08:50:24.748084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.748911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.748954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.749940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.750917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.751268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.751283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.751583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.751879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.751915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.752198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.752485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.752723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.436 [2024-07-23 08:50:24.752739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.752749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.752759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.754869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.754918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.755906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.755945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.756171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.756184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.756724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.756767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.757051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.757085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.436 [2024-07-23 08:50:24.757412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.757426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.757436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.757446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.760008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.760056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.760666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.760710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.760966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.760980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.762124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.762165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.763183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.436 [2024-07-23 08:50:24.763222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.763502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.763518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.763528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.763538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.765429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.765478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.765778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.765832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.766065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.766080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.767006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.767048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.436 [2024-07-23 08:50:24.768160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.436 [2024-07-23 08:50:24.768204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.437 [2024-07-23 08:50:24.768479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.437 [2024-07-23 08:50:24.768493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.437 [2024-07-23 08:50:24.768503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.437 [2024-07-23 08:50:24.768513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.437 [2024-07-23 08:50:24.770179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.437 [2024-07-23 08:50:24.770227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.437 [2024-07-23 08:50:24.770517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.437 [2024-07-23 08:50:24.770554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.437 [2024-07-23 08:50:24.770916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.437 [2024-07-23 08:50:24.770933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.437 [2024-07-23 08:50:24.771233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.437 [2024-07-23 08:50:24.771275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.437 [2024-07-23 08:50:24.772278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.437 [2024-07-23 08:50:24.772318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.437 [2024-07-23 08:50:24.772553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.437 [2024-07-23 08:50:24.772567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.437 [2024-07-23 08:50:24.772577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.437 [2024-07-23 08:50:24.772591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.437 [2024-07-23 08:50:24.774976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.437 [2024-07-23 08:50:24.775025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.437 [2024-07-23 08:50:24.775320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.437 [2024-07-23 08:50:24.775356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.437 [2024-07-23 08:50:24.775716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.437 [2024-07-23 08:50:24.775733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.437 [2024-07-23 08:50:24.776036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.437 [2024-07-23 08:50:24.776074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.437 [2024-07-23 08:50:24.776364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.437 [2024-07-23 08:50:24.776399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.437 [2024-07-23 08:50:24.776728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.437 [2024-07-23 08:50:24.776744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.437 [2024-07-23 08:50:24.776754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.437 [2024-07-23 08:50:24.776765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.437 [2024-07-23 08:50:24.778814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.437 [2024-07-23 08:50:24.778861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.437 [2024-07-23 08:50:24.779154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.437 [2024-07-23 08:50:24.779200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.437 [2024-07-23 08:50:24.779548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.437 [2024-07-23 08:50:24.779562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:12.440 [... previous *ERROR* message repeated continuously through 2024-07-23 08:50:24.854218; intermediate timestamps elided ...]
00:37:12.440 [2024-07-23 08:50:24.854228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.854238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.855580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.855630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.855662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.855692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.855918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.855931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.855977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.856009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.856046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.856078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.856387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.440 [2024-07-23 08:50:24.856405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.856415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.856424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.858179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.858223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.858253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.858289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.858518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.858531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.858575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.858623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.858658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.858689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.440 [2024-07-23 08:50:24.858912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.858926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.858935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.858945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.860267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.860319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.860607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.860653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.860986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.861001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.923864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.924152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.926849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.440 [2024-07-23 08:50:24.926900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.926947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.927927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.928210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.440 [2024-07-23 08:50:24.928265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.929007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.929044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.929092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.929359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.929721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.929738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.932422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.932489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.441 [2024-07-23 08:50:24.933138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.933177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.933223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.934022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.934268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.934284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.934334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.934381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.935383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.935420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.935471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.935747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.936098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.441 [2024-07-23 08:50:24.936114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.936125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.936137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.937842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.938873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.938915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.939917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.940241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.940257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.940314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.941418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.941457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.942561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.441 [2024-07-23 08:50:24.942811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.942826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.942836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.942847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.945869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.946675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.946715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.947718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.947963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.947978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.948029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.948390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.948429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.441 [2024-07-23 08:50:24.949245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.949490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.949504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.949514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.441 [2024-07-23 08:50:24.949525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.951194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.951512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.951551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.951849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.952095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.952110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.952161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.953261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.704 [2024-07-23 08:50:24.953309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.954381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.954633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.954649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.954659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.954670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.958084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.958392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.958430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.958729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.959059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.959075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.959121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.704 [2024-07-23 08:50:24.959939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.959979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.960987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.961232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.961247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.961256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.961268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.962650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.963685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.963725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.964010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.964348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.964364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.704 [2024-07-23 08:50:24.964412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.964706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.964742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.965026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.965292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.965309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.965320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.965330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.967872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.968900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.968940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.969619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.969962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.704 [2024-07-23 08:50:24.969977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.970038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.970333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.970370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.970679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.971052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.971067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.971078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.971091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.972381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.973193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.973233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.974047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.704 [2024-07-23 08:50:24.974299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.974314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.974365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.975372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.975410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.975711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.976074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.976089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.976099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.976111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.978813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.979446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.704 [2024-07-23 08:50:24.979484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.704 [2024-07-23 08:50:24.980583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! [message repeated many times through 08:50:25.101961]
00:37:12.707 [2024-07-23 08:50:25.102290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.707 [2024-07-23 08:50:25.102305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.707 [2024-07-23 08:50:25.102315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.707 [2024-07-23 08:50:25.102326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.707 [2024-07-23 08:50:25.105041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.105108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.105399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.105699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.106037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.106051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.106355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.106394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.106697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.708 [2024-07-23 08:50:25.106999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.107325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.107339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.107351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.107362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.109321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.109643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.109947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.109988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.110357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.110372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.110418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.110718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.708 [2024-07-23 08:50:25.111012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.111049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.111382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.111399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.111409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.111420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.114185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.114496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.114537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.114837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.115186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.115201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.115505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.708 [2024-07-23 08:50:25.115811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.115853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.116151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.116529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.116544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.116554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.116565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.118778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.118835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.119134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.119437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.119812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.119828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.708 [2024-07-23 08:50:25.120139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.120179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.120473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.120786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.121109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.121124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.121135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.121145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.123574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.123888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.124182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.124231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.124644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.708 [2024-07-23 08:50:25.124660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.124707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.125000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.125298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.125346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.125642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.125657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.125667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.125679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.127897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.127954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.128251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.708 [2024-07-23 08:50:25.128292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.709 [2024-07-23 08:50:25.128657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.128673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.128977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.129017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.129308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.129344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.129720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.129735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.129757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.129767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.132489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.132545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.132857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.709 [2024-07-23 08:50:25.132899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.133251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.133267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.133571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.133617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.133908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.133944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.134271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.134287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.134297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.134306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.136297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.136347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.709 [2024-07-23 08:50:25.136645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.136685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.136987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.137002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.137314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.137357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.137657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.137693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.138061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.138077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.138088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.138099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.141115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.709 [2024-07-23 08:50:25.141169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.142271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.142312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.142554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.142571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.143520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.143560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.144363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.144401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.144726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.144741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.144751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.144761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.709 [2024-07-23 08:50:25.147629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.147687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.148804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.148849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.149091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.149104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.150252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.150294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.151050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.151088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.151362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.151375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.151385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.709 [2024-07-23 08:50:25.151395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.154695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.154748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.155557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.155594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.155857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.155871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.156929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.156969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.157600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.157643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.157890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.157903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.709 [2024-07-23 08:50:25.157913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.157922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.159779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.159829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.160122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.160156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.160402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.160415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.160793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.160832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.161124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.161158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.709 [2024-07-23 08:50:25.161407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.709 [2024-07-23 08:50:25.161419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... identical "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 repeated continuously between 08:50:25.161429 and 08:50:25.258210 ...]
00:37:12.976 [2024-07-23 08:50:25.258243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:12.976 [2024-07-23 08:50:25.258594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.976 [2024-07-23 08:50:25.258607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.976 [2024-07-23 08:50:25.258677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.976 [2024-07-23 08:50:25.258712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.976 [2024-07-23 08:50:25.259336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.976 [2024-07-23 08:50:25.259372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.976 [2024-07-23 08:50:25.259652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.976 [2024-07-23 08:50:25.259665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.976 [2024-07-23 08:50:25.259675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.976 [2024-07-23 08:50:25.259685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.976 [2024-07-23 08:50:25.261446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.976 [2024-07-23 08:50:25.262494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.976 [2024-07-23 08:50:25.262533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.976 [2024-07-23 08:50:25.262566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.976 [2024-07-23 08:50:25.262952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.976 [2024-07-23 08:50:25.262969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.976 [2024-07-23 08:50:25.263021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.976 [2024-07-23 08:50:25.263314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.976 [2024-07-23 08:50:25.263348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.976 [2024-07-23 08:50:25.263381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.976 [2024-07-23 08:50:25.263724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.976 [2024-07-23 08:50:25.263738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.976 [2024-07-23 08:50:25.263748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.976 [2024-07-23 08:50:25.263758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.976 [2024-07-23 08:50:25.266374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.976 [2024-07-23 08:50:25.266423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.976 [2024-07-23 08:50:25.266456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.976 [2024-07-23 08:50:25.267477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.267827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.267841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.268995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.269031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.269069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.270103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.270355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.270374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.270383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.270393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.272568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.977 [2024-07-23 08:50:25.272626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.273662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.273714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.273970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.273984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.274034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.274067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.275091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.275129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.275411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.275423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.275434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.275444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.977 [2024-07-23 08:50:25.277998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.278306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.278344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.278377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.278631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.278647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.278695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.278993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.279028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.279061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.279418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.279432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.279444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.977 [2024-07-23 08:50:25.279455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.282726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.282779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.282819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.283884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.284142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.284155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.284465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.284502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.284540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.284841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.285155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.285167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.977 [2024-07-23 08:50:25.285177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.285187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.287542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.287997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.288037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.288967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.289218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.289231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.289285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.290305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.290344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.291118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.291432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.977 [2024-07-23 08:50:25.291445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.291455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.291465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.293552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.294402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.294442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.295462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.295723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.295736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.295786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.296301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.296337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.297161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.977 [2024-07-23 08:50:25.297415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.297428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.297438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.297453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.299439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.299753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.299794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.300810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.301064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.301077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.301128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.302223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.302261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.977 [2024-07-23 08:50:25.303181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.303457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.977 [2024-07-23 08:50:25.303470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.303481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.303490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.306095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.306424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.306462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.307306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.307670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.307684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.307733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.308148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.978 [2024-07-23 08:50:25.308183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.308979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.309236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.309248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.309259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.309269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.311757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.312816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.312854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.313146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.313481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.313495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.313544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.978 [2024-07-23 08:50:25.313845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.313879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.314169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.314469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.314482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.314492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.314502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.317018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.318051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.318090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.319101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.319390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.319403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.978 [2024-07-23 08:50:25.319468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.319770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.319810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.320102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.320355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.320367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.320377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.320387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.323143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.324332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.324372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.325057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.978 [2024-07-23 08:50:25.325336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.978 [2024-07-23 08:50:25.325349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:12.981 [2024-07-23 08:50:25.419589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.981 [2024-07-23 08:50:25.419602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.981 [2024-07-23 08:50:25.421333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.981 [2024-07-23 08:50:25.421646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.981 [2024-07-23 08:50:25.421687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.981 [2024-07-23 08:50:25.421979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.981 [2024-07-23 08:50:25.422286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.981 [2024-07-23 08:50:25.422299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.981 [2024-07-23 08:50:25.422617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.981 [2024-07-23 08:50:25.422915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.981 [2024-07-23 08:50:25.422951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.981 [2024-07-23 08:50:25.423240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.981 [2024-07-23 08:50:25.423617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.981 [2024-07-23 08:50:25.423630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.981 [2024-07-23 08:50:25.423642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.981 [2024-07-23 08:50:25.423652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.981 [2024-07-23 08:50:25.425344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.981 [2024-07-23 08:50:25.425393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.981 [2024-07-23 08:50:25.425693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.981 [2024-07-23 08:50:25.425989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.981 [2024-07-23 08:50:25.426296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.981 [2024-07-23 08:50:25.426308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.981 [2024-07-23 08:50:25.426624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.981 [2024-07-23 08:50:25.426666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.981 [2024-07-23 08:50:25.426957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.981 [2024-07-23 08:50:25.427249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.981 [2024-07-23 08:50:25.427650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.981 [2024-07-23 08:50:25.427664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.981 [2024-07-23 08:50:25.427675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.981 [2024-07-23 08:50:25.427687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.981 [2024-07-23 08:50:25.429029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.981 [2024-07-23 08:50:25.429336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.429643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.429684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.430045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.430058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.430105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.430402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.431519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.982 [2024-07-23 08:50:25.431559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.431811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.431826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.431836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.431846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.433460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.433767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.433806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.434813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.435115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.435128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.436174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.437143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.982 [2024-07-23 08:50:25.437182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.437504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.437751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.437764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.437774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.437784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.439239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.439288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.439581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.439889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.440175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.440189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.441010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.982 [2024-07-23 08:50:25.441049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.442031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.443027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.443376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.443389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.443399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.443410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.444521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.445046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.445345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.445384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.445708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.445722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.982 [2024-07-23 08:50:25.445771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.446057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.446779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.446815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.447099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.447111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.447121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.447131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.449041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.449090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.449764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.449805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.450117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.982 [2024-07-23 08:50:25.450128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.450427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.450471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.450782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.450818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.451132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.451145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.451168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.451178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.452992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.453042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.453328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.453366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.982 [2024-07-23 08:50:25.453692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.453705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.454005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.454041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.454326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.454364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.454603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.454623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.454633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.454660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.456727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.456781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.457070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.982 [2024-07-23 08:50:25.457105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.457447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.457460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.457764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.457803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.458090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.982 [2024-07-23 08:50:25.458130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.458375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.458387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.458396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.458406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.460664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.460718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.983 [2024-07-23 08:50:25.461877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.461921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.462163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.462176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.463255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.463293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.463586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.463628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.463970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.463984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.463995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.464005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.466236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.983 [2024-07-23 08:50:25.466285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.467291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.467329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.467622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.467635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.468782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.468828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.469920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.469963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.470214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.470226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.470240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.470250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.983 [2024-07-23 08:50:25.471919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.471967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.472590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.472631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.472898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.472910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.473955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.473994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.475011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.475048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.475339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.475352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:12.983 [2024-07-23 08:50:25.475361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:12.983 [2024-07-23 08:50:25.475371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:13.244 (last message repeated through [2024-07-23 08:50:25.513870])
00:37:13.244 [2024-07-23 08:50:25.514861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:13.244 [2024-07-23 08:50:25.514933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:13.244 [2024-07-23 08:50:25.518783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:13.244 [2024-07-23 08:50:25.518855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:13.244
00:37:13.244 Latency(us)
00:37:13.244 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:37:13.244 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:37:13.244 Verification LBA range: start 0x0 length 0x100
00:37:13.244 crypto_ram : 5.47 55.08 3.44 0.00 0.00 2224160.69 9799.19 1725656.50
00:37:13.244 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:37:13.244 Verification LBA range: start 0x100 length 0x100
00:37:13.244 crypto_ram : 5.43 51.53 3.22 0.00 0.00 2396366.42 12607.88 1829515.46
00:37:13.244 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:37:13.244 Verification LBA range: start 0x0 length 0x100
00:37:13.244 crypto_ram1 : 5.48 58.56 3.66 0.00 0.00 2060499.28 16352.79 1589840.94
00:37:13.244 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:37:13.244 Verification LBA range: start 0x100 length 0x100
00:37:13.244 crypto_ram1 : 5.44 53.45 3.34 0.00 0.00 2261169.59 3042.74 1693699.90
00:37:13.244 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:37:13.245 Verification LBA range: start 0x0 length 0x100
00:37:13.245 crypto_ram2 : 5.38 404.49 25.28 0.00 0.00 294587.33 30084.14 443397.85
00:37:13.245 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:37:13.245 Verification LBA range: start 0x100 length 0x100
00:37:13.245 crypto_ram2 : 5.38 391.42 24.46 0.00 0.00 303702.50 12982.37 453384.29
00:37:13.245 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:37:13.245 Verification LBA range: start 0x0 length 0x100
00:37:13.245 crypto_ram3 : 5.43 420.30 26.27 0.00 0.00 278046.17 16477.62 337541.61
00:37:13.245 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:37:13.245 Verification LBA range: start 0x100 length 0x100
00:37:13.245 crypto_ram3 : 5.43 404.74 25.30 0.00 0.00 287989.56 1310.72 299593.14
00:37:13.245 ===================================================================================================================
00:37:13.245 Total : 1839.58 114.97 0.00 0.00 523173.99 1310.72 1829515.46
00:37:15.786
00:37:15.786 real 0m11.741s
00:37:15.786 user 0m22.029s
00:37:15.786 sys 0m0.565s
08:50:28 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:37:15.786 08:50:28 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:37:15.786 ************************************
00:37:15.786 END TEST bdev_verify_big_io
00:37:15.786 ************************************
00:37:15.786 08:50:28 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0
00:37:15.786 08:50:28 blockdev_crypto_qat -- bdev/blockdev.sh@778 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:37:15.787 08:50:28 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:37:15.787 08:50:28 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable
00:37:15.787 08:50:28 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:37:15.787 ************************************
00:37:15.787 START TEST
bdev_write_zeroes 00:37:15.787 ************************************ 00:37:15.787 08:50:28 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:37:16.068 [2024-07-23 08:50:28.331189] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:37:16.068 [2024-07-23 08:50:28.331274] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1700136 ] 00:37:16.068 [2024-07-23 08:50:28.453461] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:37:16.326 [2024-07-23 08:50:28.664673] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:37:16.326 [2024-07-23 08:50:28.685871] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:37:16.326 [2024-07-23 08:50:28.693891] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:37:16.326 [2024-07-23 08:50:28.701909] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:37:16.585 [2024-07-23 08:50:28.992527] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:37:19.871 [2024-07-23 08:50:31.687383] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:37:19.871 [2024-07-23 08:50:31.687468] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:37:19.871 [2024-07-23 08:50:31.687481] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:37:19.871 [2024-07-23 08:50:31.695403] 
vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:37:19.871 [2024-07-23 08:50:31.695434] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:37:19.871 [2024-07-23 08:50:31.695443] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:37:19.871 [2024-07-23 08:50:31.703415] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:37:19.871 [2024-07-23 08:50:31.703438] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:37:19.871 [2024-07-23 08:50:31.703446] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:37:19.871 [2024-07-23 08:50:31.711437] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:37:19.871 [2024-07-23 08:50:31.711461] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:37:19.871 [2024-07-23 08:50:31.711469] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:37:19.871 Running I/O for 1 seconds... 
00:37:20.806
00:37:20.806 Latency(us)
00:37:20.806 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:37:20.806 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:37:20.806 crypto_ram : 1.02 2564.99 10.02 0.00 0.00 49495.63 5835.82 66909.14
00:37:20.806 Job: crypto_ram1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:37:20.806 crypto_ram1 : 1.02 2578.32 10.07 0.00 0.00 49009.59 5523.75 61166.93
00:37:20.806 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:37:20.806 crypto_ram2 : 1.02 20001.06 78.13 0.00 0.00 6304.39 2262.55 10048.85
00:37:20.806 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:37:20.806 crypto_ram3 : 1.02 20069.07 78.39 0.00 0.00 6263.99 983.04 8238.81
00:37:20.806 ===================================================================================================================
00:37:20.806 Total : 45213.43 176.61 0.00 0.00 11190.89 983.04 66909.14
00:37:23.337
00:37:23.337 real 0m7.001s
00:37:23.337 user 0m6.436s
00:37:23.337 sys 0m0.487s
08:50:35 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable
00:37:23.337 08:50:35 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:37:23.337 ************************************
00:37:23.337 END TEST bdev_write_zeroes
00:37:23.337 ************************************
00:37:23.337 08:50:35 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0
00:37:23.337 08:50:35 blockdev_crypto_qat -- bdev/blockdev.sh@781 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:37:23.337 08:50:35 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:37:23.337 08:50:35
blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:37:23.337 08:50:35 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:37:23.337 ************************************ 00:37:23.337 START TEST bdev_json_nonenclosed 00:37:23.337 ************************************ 00:37:23.337 08:50:35 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:37:23.337 [2024-07-23 08:50:35.398093] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:37:23.337 [2024-07-23 08:50:35.398188] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1701409 ] 00:37:23.337 [2024-07-23 08:50:35.519081] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:37:23.337 [2024-07-23 08:50:35.720663] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:37:23.337 [2024-07-23 08:50:35.720758] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
00:37:23.337 [2024-07-23 08:50:35.720779] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:37:23.337 [2024-07-23 08:50:35.720789] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:37:23.904 00:37:23.904 real 0m0.830s 00:37:23.904 user 0m0.658s 00:37:23.904 sys 0m0.168s 00:37:23.904 08:50:36 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:37:23.904 08:50:36 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:37:23.904 08:50:36 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:37:23.904 ************************************ 00:37:23.904 END TEST bdev_json_nonenclosed 00:37:23.904 ************************************ 00:37:23.904 08:50:36 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234 00:37:23.904 08:50:36 blockdev_crypto_qat -- bdev/blockdev.sh@781 -- # true 00:37:23.904 08:50:36 blockdev_crypto_qat -- bdev/blockdev.sh@784 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:37:23.904 08:50:36 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:37:23.904 08:50:36 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:37:23.904 08:50:36 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:37:23.904 ************************************ 00:37:23.904 START TEST bdev_json_nonarray 00:37:23.904 ************************************ 00:37:23.904 08:50:36 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:37:23.904 
[2024-07-23 08:50:36.298737] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:37:23.904 [2024-07-23 08:50:36.298827] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1701579 ] 00:37:24.163 [2024-07-23 08:50:36.425619] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:37:24.163 [2024-07-23 08:50:36.633201] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:37:24.163 [2024-07-23 08:50:36.633297] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:37:24.163 [2024-07-23 08:50:36.633319] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:37:24.163 [2024-07-23 08:50:36.633334] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:37:24.731 00:37:24.731 real 0m0.824s 00:37:24.731 user 0m0.650s 00:37:24.731 sys 0m0.170s 00:37:24.731 08:50:37 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:37:24.731 08:50:37 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:37:24.731 08:50:37 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:37:24.731 ************************************ 00:37:24.731 END TEST bdev_json_nonarray 00:37:24.731 ************************************ 00:37:24.731 08:50:37 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234 00:37:24.731 08:50:37 blockdev_crypto_qat -- bdev/blockdev.sh@784 -- # true 00:37:24.731 08:50:37 blockdev_crypto_qat -- bdev/blockdev.sh@786 -- # [[ crypto_qat == bdev ]] 00:37:24.731 08:50:37 blockdev_crypto_qat -- bdev/blockdev.sh@793 -- # [[ crypto_qat == gpt ]] 00:37:24.731 08:50:37 blockdev_crypto_qat -- bdev/blockdev.sh@797 -- # [[ 
crypto_qat == crypto_sw ]] 00:37:24.731 08:50:37 blockdev_crypto_qat -- bdev/blockdev.sh@809 -- # trap - SIGINT SIGTERM EXIT 00:37:24.731 08:50:37 blockdev_crypto_qat -- bdev/blockdev.sh@810 -- # cleanup 00:37:24.731 08:50:37 blockdev_crypto_qat -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:37:24.731 08:50:37 blockdev_crypto_qat -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:37:24.731 08:50:37 blockdev_crypto_qat -- bdev/blockdev.sh@26 -- # [[ crypto_qat == rbd ]] 00:37:24.731 08:50:37 blockdev_crypto_qat -- bdev/blockdev.sh@30 -- # [[ crypto_qat == daos ]] 00:37:24.731 08:50:37 blockdev_crypto_qat -- bdev/blockdev.sh@34 -- # [[ crypto_qat = \g\p\t ]] 00:37:24.731 08:50:37 blockdev_crypto_qat -- bdev/blockdev.sh@40 -- # [[ crypto_qat == xnvme ]] 00:37:24.731 00:37:24.731 real 1m38.998s 00:37:24.731 user 3m21.516s 00:37:24.731 sys 0m8.441s 00:37:24.731 08:50:37 blockdev_crypto_qat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:37:24.731 08:50:37 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:37:24.731 ************************************ 00:37:24.731 END TEST blockdev_crypto_qat 00:37:24.731 ************************************ 00:37:24.731 08:50:37 -- common/autotest_common.sh@1142 -- # return 0 00:37:24.731 08:50:37 -- spdk/autotest.sh@360 -- # run_test chaining /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:37:24.731 08:50:37 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:37:24.731 08:50:37 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:37:24.731 08:50:37 -- common/autotest_common.sh@10 -- # set +x 00:37:24.731 ************************************ 00:37:24.731 START TEST chaining 00:37:24.731 ************************************ 00:37:24.731 08:50:37 chaining -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 
00:37:24.731 * Looking for test storage... 00:37:24.731 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:37:24.732 08:50:37 chaining -- bdev/chaining.sh@14 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:37:24.732 08:50:37 chaining -- nvmf/common.sh@7 -- # uname -s 00:37:24.732 08:50:37 chaining -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:37:24.732 08:50:37 chaining -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:37:24.732 08:50:37 chaining -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:37:24.732 08:50:37 chaining -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:37:24.732 08:50:37 chaining -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:37:24.732 08:50:37 chaining -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:37:24.732 08:50:37 chaining -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:37:24.732 08:50:37 chaining -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:37:24.732 08:50:37 chaining -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:37:24.732 08:50:37 chaining -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:37:24.732 08:50:37 chaining -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:800e967b-538f-e911-906e-001635649f5c 00:37:24.732 08:50:37 chaining -- nvmf/common.sh@18 -- # NVME_HOSTID=800e967b-538f-e911-906e-001635649f5c 00:37:24.732 08:50:37 chaining -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:37:24.732 08:50:37 chaining -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:37:24.732 08:50:37 chaining -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:37:24.732 08:50:37 chaining -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:37:24.732 08:50:37 chaining -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:37:24.732 08:50:37 chaining -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:37:24.732 
08:50:37 chaining -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:37:24.732 08:50:37 chaining -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:37:24.732 08:50:37 chaining -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:37:24.732 08:50:37 chaining -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:37:24.732 08:50:37 chaining -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:37:24.732 08:50:37 chaining -- paths/export.sh@5 -- # export PATH 00:37:24.732 08:50:37 chaining -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:37:24.732 08:50:37 chaining -- nvmf/common.sh@47 -- # : 0 00:37:24.732 08:50:37 chaining -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:37:24.732 08:50:37 chaining -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:37:24.732 08:50:37 chaining -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:37:24.732 08:50:37 chaining -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:37:24.732 08:50:37 chaining -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:37:24.732 08:50:37 chaining -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:37:24.732 08:50:37 chaining -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:37:24.732 08:50:37 chaining -- nvmf/common.sh@51 -- # have_pci_nics=0 00:37:24.732 08:50:37 chaining -- bdev/chaining.sh@16 -- # nqn=nqn.2016-06.io.spdk:cnode0 00:37:24.732 08:50:37 chaining -- bdev/chaining.sh@17 -- # key0=(00112233445566778899001122334455 11223344556677889900112233445500) 00:37:24.732 08:50:37 chaining -- bdev/chaining.sh@18 -- # key1=(22334455667788990011223344550011 33445566778899001122334455001122) 00:37:24.732 08:50:37 chaining -- bdev/chaining.sh@19 -- # bperfsock=/var/tmp/bperf.sock 00:37:24.732 08:50:37 chaining -- bdev/chaining.sh@20 -- # declare -A stats 00:37:24.732 08:50:37 chaining -- bdev/chaining.sh@66 -- # nvmftestinit 00:37:24.732 08:50:37 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:37:24.732 08:50:37 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:37:24.732 08:50:37 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:37:24.732 08:50:37 chaining -- nvmf/common.sh@410 -- # local 
-g is_hw=no 00:37:24.732 08:50:37 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:37:24.732 08:50:37 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:37:24.732 08:50:37 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:37:24.732 08:50:37 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:37:24.732 08:50:37 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:37:24.732 08:50:37 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:37:24.732 08:50:37 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:37:24.732 08:50:37 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@296 -- # e810=() 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@297 -- # x722=() 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@298 -- # mlx=() 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@302 -- # 
e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@335 -- # (( 2 == 0 )) 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:18:00.0 (0x8086 - 0x159b)' 00:37:31.296 Found 0000:18:00.0 (0x8086 - 0x159b) 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:37:31.296 
08:50:43 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:18:00.1 (0x8086 - 0x159b)' 00:37:31.296 Found 0000:18:00.1 (0x8086 - 0x159b) 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@366 -- # (( 0 > 0 )) 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@372 -- # [[ '' == e810 ]] 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:18:00.0: cvl_0_0' 00:37:31.296 Found net devices under 0000:18:00.0: cvl_0_0 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 
00:37:31.296 08:50:43 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:18:00.1: cvl_0_1' 00:37:31.296 Found net devices under 0000:18:00.1: cvl_0_1 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@404 -- # (( 2 == 0 )) 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@414 -- # is_hw=yes 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@416 -- # [[ yes == yes ]] 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@417 -- # [[ tcp == tcp ]] 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@418 -- # nvmf_tcp_init 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@229 -- # NVMF_INITIATOR_IP=10.0.0.1 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@230 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@231 -- # TCP_INTERFACE_LIST=("${net_devs[@]}") 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@234 -- # (( 2 > 1 )) 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@236 -- # NVMF_TARGET_INTERFACE=cvl_0_0 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@237 -- # NVMF_INITIATOR_INTERFACE=cvl_0_1 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@240 -- # NVMF_SECOND_TARGET_IP= 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@242 -- # NVMF_TARGET_NAMESPACE=cvl_0_0_ns_spdk 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@243 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@244 -- # ip -4 addr flush cvl_0_0 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@245 -- # ip -4 addr flush cvl_0_1 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@248 -- # ip 
netns add cvl_0_0_ns_spdk 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@251 -- # ip link set cvl_0_0 netns cvl_0_0_ns_spdk 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@254 -- # ip addr add 10.0.0.1/24 dev cvl_0_1 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@255 -- # ip netns exec cvl_0_0_ns_spdk ip addr add 10.0.0.2/24 dev cvl_0_0 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@258 -- # ip link set cvl_0_1 up 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@260 -- # ip netns exec cvl_0_0_ns_spdk ip link set cvl_0_0 up 00:37:31.296 08:50:43 chaining -- nvmf/common.sh@261 -- # ip netns exec cvl_0_0_ns_spdk ip link set lo up 00:37:31.297 08:50:43 chaining -- nvmf/common.sh@264 -- # iptables -I INPUT 1 -i cvl_0_1 -p tcp --dport 4420 -j ACCEPT 00:37:31.297 08:50:43 chaining -- nvmf/common.sh@267 -- # ping -c 1 10.0.0.2 00:37:31.297 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:37:31.297 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.167 ms 00:37:31.297 00:37:31.297 --- 10.0.0.2 ping statistics --- 00:37:31.297 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:37:31.297 rtt min/avg/max/mdev = 0.167/0.167/0.167/0.000 ms 00:37:31.297 08:50:43 chaining -- nvmf/common.sh@268 -- # ip netns exec cvl_0_0_ns_spdk ping -c 1 10.0.0.1 00:37:31.297 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 
00:37:31.297 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.137 ms 00:37:31.297 00:37:31.297 --- 10.0.0.1 ping statistics --- 00:37:31.297 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:37:31.297 rtt min/avg/max/mdev = 0.137/0.137/0.137/0.000 ms 00:37:31.297 08:50:43 chaining -- nvmf/common.sh@270 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:37:31.297 08:50:43 chaining -- nvmf/common.sh@422 -- # return 0 00:37:31.297 08:50:43 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:37:31.297 08:50:43 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:37:31.297 08:50:43 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:37:31.297 08:50:43 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:37:31.297 08:50:43 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:37:31.297 08:50:43 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:37:31.297 08:50:43 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:37:31.297 08:50:43 chaining -- bdev/chaining.sh@67 -- # nvmfappstart -m 0x2 00:37:31.297 08:50:43 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:37:31.297 08:50:43 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:37:31.297 08:50:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:31.297 08:50:43 chaining -- nvmf/common.sh@481 -- # nvmfpid=1705352 00:37:31.297 08:50:43 chaining -- nvmf/common.sh@482 -- # waitforlisten 1705352 00:37:31.297 08:50:43 chaining -- nvmf/common.sh@480 -- # ip netns exec cvl_0_0_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:37:31.297 08:50:43 chaining -- common/autotest_common.sh@829 -- # '[' -z 1705352 ']' 00:37:31.297 08:50:43 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:37:31.297 08:50:43 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:37:31.297 08:50:43 chaining -- 
common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:37:31.297 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:37:31.297 08:50:43 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:37:31.297 08:50:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:31.297 [2024-07-23 08:50:43.463170] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:37:31.297 [2024-07-23 08:50:43.463264] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:37:31.297 [2024-07-23 08:50:43.594863] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:37:31.297 [2024-07-23 08:50:43.802581] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:37:31.297 [2024-07-23 08:50:43.802631] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:37:31.297 [2024-07-23 08:50:43.802643] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:37:31.297 [2024-07-23 08:50:43.802670] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:37:31.297 [2024-07-23 08:50:43.802681] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
00:37:31.297 [2024-07-23 08:50:43.802715] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:37:31.864 08:50:44 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:37:31.864 08:50:44 chaining -- common/autotest_common.sh@862 -- # return 0 00:37:31.864 08:50:44 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:37:31.864 08:50:44 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:37:31.864 08:50:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:31.864 08:50:44 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:37:31.864 08:50:44 chaining -- bdev/chaining.sh@69 -- # mktemp 00:37:31.864 08:50:44 chaining -- bdev/chaining.sh@69 -- # input=/tmp/tmp.I2bN0Giee1 00:37:31.864 08:50:44 chaining -- bdev/chaining.sh@69 -- # mktemp 00:37:31.864 08:50:44 chaining -- bdev/chaining.sh@69 -- # output=/tmp/tmp.6smHlvJnqH 00:37:31.864 08:50:44 chaining -- bdev/chaining.sh@70 -- # trap 'tgtcleanup; exit 1' SIGINT SIGTERM EXIT 00:37:31.865 08:50:44 chaining -- bdev/chaining.sh@72 -- # rpc_cmd 00:37:31.865 08:50:44 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:37:31.865 08:50:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:31.865 malloc0 00:37:31.865 true 00:37:31.865 true 00:37:31.865 [2024-07-23 08:50:44.345294] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:37:31.865 crypto0 00:37:31.865 [2024-07-23 08:50:44.353293] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:37:31.865 crypto1 00:37:31.865 [2024-07-23 08:50:44.361409] tcp.c: 677:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:37:31.865 [2024-07-23 08:50:44.377575] tcp.c:1006:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:37:32.122 08:50:44 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@85 -- # 
update_stats 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@39 -- # opcode= 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:37:32.123 08:50:44 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:37:32.123 08:50:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:32.123 08:50:44 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=12 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@39 -- # event=executed 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:37:32.123 08:50:44 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:37:32.123 08:50:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:32.123 08:50:44 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]= 00:37:32.123 08:50:44 chaining -- 
bdev/chaining.sh@53 -- # get_stat executed decrypt 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@39 -- # event=executed 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:37:32.123 08:50:44 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:37:32.123 08:50:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:37:32.123 08:50:44 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@39 -- # event=executed 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:37:32.123 08:50:44 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:37:32.123 08:50:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:37:32.123 08:50:44 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@88 -- # 
dd if=/dev/urandom of=/tmp/tmp.I2bN0Giee1 bs=1K count=64 00:37:32.123 64+0 records in 00:37:32.123 64+0 records out 00:37:32.123 65536 bytes (66 kB, 64 KiB) copied, 0.000263747 s, 248 MB/s 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@89 -- # spdk_dd --if /tmp/tmp.I2bN0Giee1 --ob Nvme0n1 --bs 65536 --count 1 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@25 -- # local config 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:37:32.123 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@31 -- # config='{ 00:37:32.123 "subsystems": [ 00:37:32.123 { 00:37:32.123 "subsystem": "bdev", 00:37:32.123 "config": [ 00:37:32.123 { 00:37:32.123 "method": "bdev_nvme_attach_controller", 00:37:32.123 "params": { 00:37:32.123 "trtype": "tcp", 00:37:32.123 "adrfam": "IPv4", 00:37:32.123 "name": "Nvme0", 00:37:32.123 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:37:32.123 "traddr": "10.0.0.2", 00:37:32.123 "trsvcid": "4420" 00:37:32.123 } 00:37:32.123 }, 00:37:32.123 { 00:37:32.123 "method": "bdev_set_options", 00:37:32.123 "params": { 00:37:32.123 "bdev_auto_examine": false 00:37:32.123 } 00:37:32.123 } 00:37:32.123 ] 00:37:32.123 } 00:37:32.123 ] 00:37:32.123 }' 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:37:32.123 "subsystems": [ 00:37:32.123 { 00:37:32.123 "subsystem": "bdev", 00:37:32.123 "config": [ 00:37:32.123 { 00:37:32.123 "method": "bdev_nvme_attach_controller", 00:37:32.123 "params": { 00:37:32.123 "trtype": "tcp", 00:37:32.123 "adrfam": "IPv4", 00:37:32.123 "name": "Nvme0", 00:37:32.123 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:37:32.123 "traddr": "10.0.0.2", 00:37:32.123 "trsvcid": 
"4420" 00:37:32.123 } 00:37:32.123 }, 00:37:32.123 { 00:37:32.123 "method": "bdev_set_options", 00:37:32.123 "params": { 00:37:32.123 "bdev_auto_examine": false 00:37:32.123 } 00:37:32.123 } 00:37:32.123 ] 00:37:32.123 } 00:37:32.123 ] 00:37:32.123 }' 00:37:32.123 08:50:44 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.I2bN0Giee1 --ob Nvme0n1 --bs 65536 --count 1 00:37:32.380 [2024-07-23 08:50:44.696116] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:37:32.380 [2024-07-23 08:50:44.696216] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1705576 ] 00:37:32.380 [2024-07-23 08:50:44.816962] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:37:32.638 [2024-07-23 08:50:45.035694] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:37:34.579  Copying: 64/64 [kB] (average 12 MBps) 00:37:34.579 00:37:34.579 08:50:46 chaining -- bdev/chaining.sh@90 -- # get_stat sequence_executed 00:37:34.579 08:50:46 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:34.579 08:50:46 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:37:34.579 08:50:46 chaining -- bdev/chaining.sh@39 -- # opcode= 00:37:34.579 08:50:46 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:37:34.579 08:50:46 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:37:34.579 08:50:46 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:37:34.579 08:50:46 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:37:34.579 08:50:46 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:37:34.579 08:50:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:34.579 08:50:46 chaining -- common/autotest_common.sh@587 -- # [[ 
0 == 0 ]] 00:37:34.579 08:50:46 chaining -- bdev/chaining.sh@90 -- # (( 13 == stats[sequence_executed] + 1 )) 00:37:34.579 08:50:46 chaining -- bdev/chaining.sh@91 -- # get_stat executed encrypt 00:37:34.579 08:50:46 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:34.579 08:50:46 chaining -- bdev/chaining.sh@39 -- # event=executed 00:37:34.579 08:50:46 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:37:34.579 08:50:46 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:37:34.579 08:50:46 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:37:34.579 08:50:46 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:37:34.579 08:50:46 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:37:34.579 08:50:46 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:37:34.579 08:50:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:34.579 08:50:46 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:37:34.579 08:50:46 chaining -- bdev/chaining.sh@91 -- # (( 2 == stats[encrypt_executed] + 2 )) 00:37:34.579 08:50:46 chaining -- bdev/chaining.sh@92 -- # get_stat executed decrypt 00:37:34.579 08:50:46 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:34.579 08:50:46 chaining -- bdev/chaining.sh@39 -- # event=executed 00:37:34.579 08:50:46 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:37:34.579 08:50:46 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:37:34.579 08:50:46 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:37:34.579 08:50:46 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:37:34.579 08:50:46 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:37:34.579 08:50:46 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:37:34.579 08:50:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:34.579 08:50:46 chaining -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:37:34.579 08:50:46 chaining -- bdev/chaining.sh@92 -- # (( 12 == stats[decrypt_executed] )) 00:37:34.579 08:50:46 chaining -- bdev/chaining.sh@95 -- # get_stat executed copy 00:37:34.579 08:50:46 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:34.579 08:50:46 chaining -- bdev/chaining.sh@39 -- # event=executed 00:37:34.579 08:50:46 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:37:34.579 08:50:46 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:37:34.579 08:50:46 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:37:34.579 08:50:46 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:37:34.580 08:50:46 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:37:34.580 08:50:46 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:37:34.580 08:50:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:34.580 08:50:47 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:37:34.580 08:50:47 chaining -- bdev/chaining.sh@95 -- # (( 4 == stats[copy_executed] )) 00:37:34.580 08:50:47 chaining -- bdev/chaining.sh@96 -- # update_stats 00:37:34.580 08:50:47 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:37:34.580 08:50:47 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:34.580 08:50:47 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:37:34.580 08:50:47 chaining -- bdev/chaining.sh@39 -- # opcode= 00:37:34.580 08:50:47 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:37:34.580 08:50:47 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:37:34.580 08:50:47 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:37:34.580 08:50:47 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:37:34.580 08:50:47 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:34.580 08:50:47 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:37:34.580 
08:50:47 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:37:34.580 08:50:47 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=13 00:37:34.580 08:50:47 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:37:34.580 08:50:47 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:34.580 08:50:47 chaining -- bdev/chaining.sh@39 -- # event=executed 00:37:34.580 08:50:47 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:37:34.580 08:50:47 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:37:34.580 08:50:47 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:37:34.580 08:50:47 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:37:34.580 08:50:47 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:37:34.580 08:50:47 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:37:34.580 08:50:47 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:34.580 08:50:47 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:37:34.837 08:50:47 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=2 00:37:34.837 08:50:47 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:37:34.837 08:50:47 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:34.837 08:50:47 chaining -- bdev/chaining.sh@39 -- # event=executed 00:37:34.837 08:50:47 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:37:34.837 08:50:47 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:37:34.837 08:50:47 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:37:34.837 08:50:47 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:37:34.837 08:50:47 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:37:34.837 08:50:47 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:37:34.837 08:50:47 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:34.837 08:50:47 
chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:37:34.837 08:50:47 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:37:34.837 08:50:47 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:37:34.837 08:50:47 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:34.837 08:50:47 chaining -- bdev/chaining.sh@39 -- # event=executed 00:37:34.837 08:50:47 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:37:34.837 08:50:47 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:37:34.837 08:50:47 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:37:34.837 08:50:47 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:37:34.837 08:50:47 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:37:34.837 08:50:47 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:37:34.837 08:50:47 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:34.837 08:50:47 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:37:34.837 08:50:47 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:37:34.837 08:50:47 chaining -- bdev/chaining.sh@99 -- # spdk_dd --of /tmp/tmp.6smHlvJnqH --ib Nvme0n1 --bs 65536 --count 1 00:37:34.837 08:50:47 chaining -- bdev/chaining.sh@25 -- # local config 00:37:34.837 08:50:47 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:37:34.837 08:50:47 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:37:34.837 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:37:34.837 08:50:47 chaining -- bdev/chaining.sh@31 -- # config='{ 00:37:34.837 "subsystems": [ 00:37:34.837 { 00:37:34.838 "subsystem": "bdev", 00:37:34.838 "config": [ 00:37:34.838 { 00:37:34.838 "method": "bdev_nvme_attach_controller", 00:37:34.838 
"params": { 00:37:34.838 "trtype": "tcp", 00:37:34.838 "adrfam": "IPv4", 00:37:34.838 "name": "Nvme0", 00:37:34.838 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:37:34.838 "traddr": "10.0.0.2", 00:37:34.838 "trsvcid": "4420" 00:37:34.838 } 00:37:34.838 }, 00:37:34.838 { 00:37:34.838 "method": "bdev_set_options", 00:37:34.838 "params": { 00:37:34.838 "bdev_auto_examine": false 00:37:34.838 } 00:37:34.838 } 00:37:34.838 ] 00:37:34.838 } 00:37:34.838 ] 00:37:34.838 }' 00:37:34.838 08:50:47 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.6smHlvJnqH --ib Nvme0n1 --bs 65536 --count 1 00:37:34.838 08:50:47 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:37:34.838 "subsystems": [ 00:37:34.838 { 00:37:34.838 "subsystem": "bdev", 00:37:34.838 "config": [ 00:37:34.838 { 00:37:34.838 "method": "bdev_nvme_attach_controller", 00:37:34.838 "params": { 00:37:34.838 "trtype": "tcp", 00:37:34.838 "adrfam": "IPv4", 00:37:34.838 "name": "Nvme0", 00:37:34.838 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:37:34.838 "traddr": "10.0.0.2", 00:37:34.838 "trsvcid": "4420" 00:37:34.838 } 00:37:34.838 }, 00:37:34.838 { 00:37:34.838 "method": "bdev_set_options", 00:37:34.838 "params": { 00:37:34.838 "bdev_auto_examine": false 00:37:34.838 } 00:37:34.838 } 00:37:34.838 ] 00:37:34.838 } 00:37:34.838 ] 00:37:34.838 }' 00:37:34.838 [2024-07-23 08:50:47.297600] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:37:34.838 [2024-07-23 08:50:47.297691] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1706040 ] 00:37:35.095 [2024-07-23 08:50:47.418586] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:37:35.353 [2024-07-23 08:50:47.639402] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:37:37.296  Copying: 64/64 [kB] (average 12 MBps) 00:37:37.296 00:37:37.296 08:50:49 chaining -- bdev/chaining.sh@100 -- # get_stat sequence_executed 00:37:37.296 08:50:49 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:37.296 08:50:49 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:37:37.296 08:50:49 chaining -- bdev/chaining.sh@39 -- # opcode= 00:37:37.296 08:50:49 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:37:37.296 08:50:49 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:37:37.296 08:50:49 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:37:37.296 08:50:49 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:37:37.296 08:50:49 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:37:37.296 08:50:49 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:37.296 08:50:49 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:37:37.296 08:50:49 chaining -- bdev/chaining.sh@100 -- # (( 14 == stats[sequence_executed] + 1 )) 00:37:37.296 08:50:49 chaining -- bdev/chaining.sh@101 -- # get_stat executed encrypt 00:37:37.296 08:50:49 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:37.296 08:50:49 chaining -- bdev/chaining.sh@39 -- # event=executed 00:37:37.296 08:50:49 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:37:37.296 08:50:49 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:37:37.296 08:50:49 chaining -- bdev/chaining.sh@40 -- # [[ -z 
encrypt ]] 00:37:37.296 08:50:49 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:37:37.296 08:50:49 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:37:37.296 08:50:49 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:37:37.296 08:50:49 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:37.296 08:50:49 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:37:37.296 08:50:49 chaining -- bdev/chaining.sh@101 -- # (( 2 == stats[encrypt_executed] )) 00:37:37.296 08:50:49 chaining -- bdev/chaining.sh@102 -- # get_stat executed decrypt 00:37:37.296 08:50:49 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:37.296 08:50:49 chaining -- bdev/chaining.sh@39 -- # event=executed 00:37:37.296 08:50:49 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:37:37.296 08:50:49 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:37:37.296 08:50:49 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:37:37.296 08:50:49 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:37:37.296 08:50:49 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:37:37.296 08:50:49 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:37:37.296 08:50:49 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:37.296 08:50:49 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:37:37.296 08:50:49 chaining -- bdev/chaining.sh@102 -- # (( 14 == stats[decrypt_executed] + 2 )) 00:37:37.296 08:50:49 chaining -- bdev/chaining.sh@103 -- # get_stat executed copy 00:37:37.296 08:50:49 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:37.296 08:50:49 chaining -- bdev/chaining.sh@39 -- # event=executed 00:37:37.296 08:50:49 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:37:37.296 08:50:49 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:37:37.296 08:50:49 chaining -- bdev/chaining.sh@40 -- # 
[[ -z copy ]] 00:37:37.296 08:50:49 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:37:37.296 08:50:49 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:37:37.296 08:50:49 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:37.296 08:50:49 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:37:37.296 08:50:49 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:37:37.555 08:50:49 chaining -- bdev/chaining.sh@103 -- # (( 4 == stats[copy_executed] )) 00:37:37.555 08:50:49 chaining -- bdev/chaining.sh@104 -- # cmp /tmp/tmp.I2bN0Giee1 /tmp/tmp.6smHlvJnqH 00:37:37.555 08:50:49 chaining -- bdev/chaining.sh@105 -- # spdk_dd --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:37:37.555 08:50:49 chaining -- bdev/chaining.sh@25 -- # local config 00:37:37.555 08:50:49 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:37:37.555 08:50:49 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:37:37.555 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:37:37.555 08:50:49 chaining -- bdev/chaining.sh@31 -- # config='{ 00:37:37.555 "subsystems": [ 00:37:37.555 { 00:37:37.555 "subsystem": "bdev", 00:37:37.555 "config": [ 00:37:37.555 { 00:37:37.555 "method": "bdev_nvme_attach_controller", 00:37:37.555 "params": { 00:37:37.555 "trtype": "tcp", 00:37:37.555 "adrfam": "IPv4", 00:37:37.555 "name": "Nvme0", 00:37:37.555 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:37:37.555 "traddr": "10.0.0.2", 00:37:37.555 "trsvcid": "4420" 00:37:37.555 } 00:37:37.555 }, 00:37:37.555 { 00:37:37.555 "method": "bdev_set_options", 00:37:37.555 "params": { 00:37:37.556 "bdev_auto_examine": false 00:37:37.556 } 00:37:37.556 } 00:37:37.556 ] 00:37:37.556 } 00:37:37.556 ] 00:37:37.556 }' 00:37:37.556 
08:50:49 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:37:37.556 08:50:49 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:37:37.556 "subsystems": [ 00:37:37.556 { 00:37:37.556 "subsystem": "bdev", 00:37:37.556 "config": [ 00:37:37.556 { 00:37:37.556 "method": "bdev_nvme_attach_controller", 00:37:37.556 "params": { 00:37:37.556 "trtype": "tcp", 00:37:37.556 "adrfam": "IPv4", 00:37:37.556 "name": "Nvme0", 00:37:37.556 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:37:37.556 "traddr": "10.0.0.2", 00:37:37.556 "trsvcid": "4420" 00:37:37.556 } 00:37:37.556 }, 00:37:37.556 { 00:37:37.556 "method": "bdev_set_options", 00:37:37.556 "params": { 00:37:37.556 "bdev_auto_examine": false 00:37:37.556 } 00:37:37.556 } 00:37:37.556 ] 00:37:37.556 } 00:37:37.556 ] 00:37:37.556 }' 00:37:37.556 [2024-07-23 08:50:49.952695] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
00:37:37.556 [2024-07-23 08:50:49.952779] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1706565 ] 00:37:37.815 [2024-07-23 08:50:50.077122] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:37:37.815 [2024-07-23 08:50:50.291470] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:37:39.788  Copying: 64/64 [kB] (average 62 MBps) 00:37:39.788 00:37:39.788 08:50:52 chaining -- bdev/chaining.sh@106 -- # update_stats 00:37:39.788 08:50:52 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:37:39.788 08:50:52 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:39.788 08:50:52 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:37:39.788 08:50:52 chaining -- bdev/chaining.sh@39 -- # opcode= 00:37:39.788 08:50:52 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:37:39.788 08:50:52 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:37:39.788 08:50:52 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:37:39.788 08:50:52 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:37:39.788 08:50:52 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:37:39.788 08:50:52 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:39.788 08:50:52 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:37:39.788 08:50:52 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=15 00:37:39.788 08:50:52 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:37:39.788 08:50:52 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:39.788 08:50:52 chaining -- bdev/chaining.sh@39 -- # event=executed 00:37:39.788 08:50:52 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:37:39.788 08:50:52 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 
00:37:39.788 08:50:52 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:37:39.788 08:50:52 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:37:39.788 08:50:52 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:37:39.788 08:50:52 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:37:39.788 08:50:52 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:39.788 08:50:52 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:37:40.047 08:50:52 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=4 00:37:40.047 08:50:52 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:37:40.047 08:50:52 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:40.047 08:50:52 chaining -- bdev/chaining.sh@39 -- # event=executed 00:37:40.047 08:50:52 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:37:40.047 08:50:52 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:37:40.047 08:50:52 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:37:40.047 08:50:52 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:37:40.047 08:50:52 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:37:40.047 08:50:52 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:40.047 08:50:52 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:37:40.047 08:50:52 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:37:40.047 08:50:52 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:37:40.047 08:50:52 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:37:40.047 08:50:52 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:40.047 08:50:52 chaining -- bdev/chaining.sh@39 -- # event=executed 00:37:40.047 08:50:52 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:37:40.047 08:50:52 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:37:40.047 
08:50:52 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:37:40.047 08:50:52 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:37:40.047 08:50:52 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:37:40.047 08:50:52 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:40.047 08:50:52 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:37:40.047 08:50:52 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:37:40.047 08:50:52 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:37:40.047 08:50:52 chaining -- bdev/chaining.sh@109 -- # spdk_dd --if /tmp/tmp.I2bN0Giee1 --ob Nvme0n1 --bs 4096 --count 16 00:37:40.047 08:50:52 chaining -- bdev/chaining.sh@25 -- # local config 00:37:40.047 08:50:52 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:37:40.047 08:50:52 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:37:40.047 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:37:40.047 08:50:52 chaining -- bdev/chaining.sh@31 -- # config='{ 00:37:40.047 "subsystems": [ 00:37:40.047 { 00:37:40.047 "subsystem": "bdev", 00:37:40.047 "config": [ 00:37:40.047 { 00:37:40.047 "method": "bdev_nvme_attach_controller", 00:37:40.047 "params": { 00:37:40.047 "trtype": "tcp", 00:37:40.047 "adrfam": "IPv4", 00:37:40.047 "name": "Nvme0", 00:37:40.047 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:37:40.047 "traddr": "10.0.0.2", 00:37:40.047 "trsvcid": "4420" 00:37:40.047 } 00:37:40.047 }, 00:37:40.047 { 00:37:40.047 "method": "bdev_set_options", 00:37:40.047 "params": { 00:37:40.047 "bdev_auto_examine": false 00:37:40.047 } 00:37:40.047 } 00:37:40.047 ] 00:37:40.047 } 00:37:40.047 ] 00:37:40.047 }' 00:37:40.047 08:50:52 chaining -- bdev/chaining.sh@33 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.I2bN0Giee1 --ob Nvme0n1 --bs 4096 --count 16 00:37:40.047 08:50:52 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:37:40.047 "subsystems": [ 00:37:40.047 { 00:37:40.047 "subsystem": "bdev", 00:37:40.047 "config": [ 00:37:40.047 { 00:37:40.047 "method": "bdev_nvme_attach_controller", 00:37:40.047 "params": { 00:37:40.047 "trtype": "tcp", 00:37:40.047 "adrfam": "IPv4", 00:37:40.047 "name": "Nvme0", 00:37:40.047 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:37:40.047 "traddr": "10.0.0.2", 00:37:40.047 "trsvcid": "4420" 00:37:40.047 } 00:37:40.047 }, 00:37:40.047 { 00:37:40.047 "method": "bdev_set_options", 00:37:40.047 "params": { 00:37:40.047 "bdev_auto_examine": false 00:37:40.047 } 00:37:40.047 } 00:37:40.047 ] 00:37:40.047 } 00:37:40.047 ] 00:37:40.047 }' 00:37:40.047 [2024-07-23 08:50:52.534732] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:37:40.047 [2024-07-23 08:50:52.534816] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1707082 ] 00:37:40.306 [2024-07-23 08:50:52.652970] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:37:40.565 [2024-07-23 08:50:52.874051] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:37:42.199  Copying: 64/64 [kB] (average 10 MBps) 00:37:42.199 00:37:42.199 08:50:54 chaining -- bdev/chaining.sh@110 -- # get_stat sequence_executed 00:37:42.199 08:50:54 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:42.199 08:50:54 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:37:42.199 08:50:54 chaining -- bdev/chaining.sh@39 -- # opcode= 00:37:42.199 08:50:54 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:37:42.199 08:50:54 chaining -- bdev/chaining.sh@40 -- # 
[[ -z '' ]] 00:37:42.199 08:50:54 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:37:42.199 08:50:54 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:37:42.199 08:50:54 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:37:42.199 08:50:54 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:42.199 08:50:54 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:37:42.199 08:50:54 chaining -- bdev/chaining.sh@110 -- # (( 31 == stats[sequence_executed] + 16 )) 00:37:42.199 08:50:54 chaining -- bdev/chaining.sh@111 -- # get_stat executed encrypt 00:37:42.199 08:50:54 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:42.199 08:50:54 chaining -- bdev/chaining.sh@39 -- # event=executed 00:37:42.199 08:50:54 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:37:42.199 08:50:54 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:37:42.199 08:50:54 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:37:42.199 08:50:54 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:37:42.199 08:50:54 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:37:42.199 08:50:54 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:37:42.200 08:50:54 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:42.200 08:50:54 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@111 -- # (( 36 == stats[encrypt_executed] + 32 )) 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@112 -- # get_stat executed decrypt 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@39 -- # event=executed 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 
00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:37:42.458 08:50:54 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:37:42.458 08:50:54 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:42.458 08:50:54 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@112 -- # (( 14 == stats[decrypt_executed] )) 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@113 -- # get_stat executed copy 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@39 -- # event=executed 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:37:42.458 08:50:54 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:37:42.458 08:50:54 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:37:42.458 08:50:54 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@113 -- # (( 4 == stats[copy_executed] )) 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@114 -- # update_stats 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@39 -- # opcode= 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 
00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:37:42.458 08:50:54 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:37:42.458 08:50:54 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:42.458 08:50:54 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=31 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@39 -- # event=executed 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:37:42.458 08:50:54 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:37:42.458 08:50:54 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:42.458 08:50:54 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=36 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:37:42.458 08:50:54 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:42.459 08:50:54 chaining -- bdev/chaining.sh@39 -- # event=executed 00:37:42.459 08:50:54 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:37:42.459 08:50:54 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:37:42.459 08:50:54 chaining -- 
bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:37:42.459 08:50:54 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:37:42.459 08:50:54 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:37:42.459 08:50:54 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:37:42.459 08:50:54 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:42.459 08:50:54 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:37:42.459 08:50:54 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:37:42.459 08:50:54 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:37:42.459 08:50:54 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:42.459 08:50:54 chaining -- bdev/chaining.sh@39 -- # event=executed 00:37:42.459 08:50:54 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:37:42.459 08:50:54 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:37:42.459 08:50:54 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:37:42.459 08:50:54 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:37:42.459 08:50:54 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:37:42.459 08:50:54 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:37:42.459 08:50:54 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:42.459 08:50:54 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:37:42.718 08:50:54 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:37:42.718 08:50:54 chaining -- bdev/chaining.sh@117 -- # : 00:37:42.718 08:50:54 chaining -- bdev/chaining.sh@118 -- # spdk_dd --of /tmp/tmp.6smHlvJnqH --ib Nvme0n1 --bs 4096 --count 16 00:37:42.718 08:50:54 chaining -- bdev/chaining.sh@25 -- # local config 00:37:42.718 08:50:54 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems 
--trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:37:42.718 08:50:54 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:37:42.718 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:37:42.718 08:50:55 chaining -- bdev/chaining.sh@31 -- # config='{ 00:37:42.718 "subsystems": [ 00:37:42.718 { 00:37:42.718 "subsystem": "bdev", 00:37:42.718 "config": [ 00:37:42.718 { 00:37:42.718 "method": "bdev_nvme_attach_controller", 00:37:42.718 "params": { 00:37:42.718 "trtype": "tcp", 00:37:42.718 "adrfam": "IPv4", 00:37:42.718 "name": "Nvme0", 00:37:42.718 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:37:42.718 "traddr": "10.0.0.2", 00:37:42.718 "trsvcid": "4420" 00:37:42.718 } 00:37:42.718 }, 00:37:42.718 { 00:37:42.718 "method": "bdev_set_options", 00:37:42.718 "params": { 00:37:42.718 "bdev_auto_examine": false 00:37:42.718 } 00:37:42.718 } 00:37:42.718 ] 00:37:42.718 } 00:37:42.718 ] 00:37:42.718 }' 00:37:42.718 08:50:55 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.6smHlvJnqH --ib Nvme0n1 --bs 4096 --count 16 00:37:42.718 08:50:55 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:37:42.718 "subsystems": [ 00:37:42.718 { 00:37:42.718 "subsystem": "bdev", 00:37:42.718 "config": [ 00:37:42.718 { 00:37:42.718 "method": "bdev_nvme_attach_controller", 00:37:42.718 "params": { 00:37:42.718 "trtype": "tcp", 00:37:42.718 "adrfam": "IPv4", 00:37:42.718 "name": "Nvme0", 00:37:42.718 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:37:42.718 "traddr": "10.0.0.2", 00:37:42.718 "trsvcid": "4420" 00:37:42.718 } 00:37:42.718 }, 00:37:42.718 { 00:37:42.718 "method": "bdev_set_options", 00:37:42.718 "params": { 00:37:42.718 "bdev_auto_examine": false 00:37:42.718 } 00:37:42.718 } 00:37:42.718 ] 00:37:42.718 } 00:37:42.718 ] 00:37:42.718 }' 00:37:42.718 [2024-07-23 08:50:55.104094] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 
initialization... 00:37:42.718 [2024-07-23 08:50:55.104178] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1707617 ] 00:37:42.718 [2024-07-23 08:50:55.225130] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:37:42.977 [2024-07-23 08:50:55.443913] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:37:45.288  Copying: 64/64 [kB] (average 492 kBps) 00:37:45.288 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@119 -- # get_stat sequence_executed 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@39 -- # opcode= 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:37:45.288 08:50:57 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:37:45.288 08:50:57 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:45.288 08:50:57 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@119 -- # (( 47 == stats[sequence_executed] + 16 )) 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@120 -- # get_stat executed encrypt 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@39 -- # event=executed 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:37:45.288 08:50:57 chaining -- 
bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:37:45.288 08:50:57 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:37:45.288 08:50:57 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:45.288 08:50:57 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@120 -- # (( 36 == stats[encrypt_executed] )) 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@121 -- # get_stat executed decrypt 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@39 -- # event=executed 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:37:45.288 08:50:57 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:37:45.288 08:50:57 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:45.288 08:50:57 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@121 -- # (( 46 == stats[decrypt_executed] + 32 )) 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@122 -- # get_stat executed copy 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@39 -- # event=executed 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:37:45.288 08:50:57 
chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:37:45.288 08:50:57 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:37:45.288 08:50:57 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:45.288 08:50:57 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@122 -- # (( 4 == stats[copy_executed] )) 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@123 -- # cmp /tmp/tmp.I2bN0Giee1 /tmp/tmp.6smHlvJnqH 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@125 -- # trap - SIGINT SIGTERM EXIT 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@126 -- # tgtcleanup 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@58 -- # rm -f /tmp/tmp.I2bN0Giee1 /tmp/tmp.6smHlvJnqH 00:37:45.288 08:50:57 chaining -- bdev/chaining.sh@59 -- # nvmftestfini 00:37:45.288 08:50:57 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:37:45.288 08:50:57 chaining -- nvmf/common.sh@117 -- # sync 00:37:45.288 08:50:57 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:37:45.288 08:50:57 chaining -- nvmf/common.sh@120 -- # set +e 00:37:45.288 08:50:57 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:37:45.288 08:50:57 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:37:45.288 rmmod nvme_tcp 00:37:45.288 rmmod nvme_fabrics 00:37:45.288 rmmod nvme_keyring 00:37:45.288 08:50:57 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:37:45.288 08:50:57 chaining -- nvmf/common.sh@124 -- # set -e 00:37:45.288 08:50:57 chaining -- nvmf/common.sh@125 -- # return 0 00:37:45.288 08:50:57 chaining -- nvmf/common.sh@489 -- # '[' -n 1705352 ']' 00:37:45.288 08:50:57 chaining -- nvmf/common.sh@490 -- # killprocess 1705352 00:37:45.288 08:50:57 chaining -- common/autotest_common.sh@948 -- # '[' -z 
1705352 ']' 00:37:45.288 08:50:57 chaining -- common/autotest_common.sh@952 -- # kill -0 1705352 00:37:45.288 08:50:57 chaining -- common/autotest_common.sh@953 -- # uname 00:37:45.288 08:50:57 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:37:45.288 08:50:57 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1705352 00:37:45.288 08:50:57 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:37:45.288 08:50:57 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:37:45.288 08:50:57 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1705352' 00:37:45.288 killing process with pid 1705352 00:37:45.288 08:50:57 chaining -- common/autotest_common.sh@967 -- # kill 1705352 00:37:45.288 08:50:57 chaining -- common/autotest_common.sh@972 -- # wait 1705352 00:37:46.665 08:50:59 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:37:46.665 08:50:59 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:37:46.665 08:50:59 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:37:46.665 08:50:59 chaining -- nvmf/common.sh@274 -- # [[ cvl_0_0_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:37:46.665 08:50:59 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:37:46.665 08:50:59 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:37:46.665 08:50:59 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:37:46.665 08:50:59 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:37:49.199 08:51:01 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush cvl_0_1 00:37:49.199 08:51:01 chaining -- bdev/chaining.sh@129 -- # trap 'bperfcleanup; exit 1' SIGINT SIGTERM EXIT 00:37:49.199 08:51:01 chaining -- bdev/chaining.sh@132 -- # bperfpid=1708696 00:37:49.199 08:51:01 chaining -- bdev/chaining.sh@134 -- # waitforlisten 1708696 00:37:49.199 08:51:01 chaining -- bdev/chaining.sh@131 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:37:49.199 08:51:01 chaining -- common/autotest_common.sh@829 -- # '[' -z 1708696 ']' 00:37:49.199 08:51:01 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:37:49.199 08:51:01 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:37:49.199 08:51:01 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:37:49.199 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:37:49.199 08:51:01 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:37:49.199 08:51:01 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:49.199 [2024-07-23 08:51:01.235695] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:37:49.199 [2024-07-23 08:51:01.235797] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1708696 ] 00:37:49.199 [2024-07-23 08:51:01.360014] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:37:49.199 [2024-07-23 08:51:01.596682] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:37:49.766 08:51:01 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:37:49.766 08:51:02 chaining -- common/autotest_common.sh@862 -- # return 0 00:37:49.766 08:51:02 chaining -- bdev/chaining.sh@135 -- # rpc_cmd 00:37:49.766 08:51:02 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:37:49.766 08:51:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:50.025 malloc0 00:37:50.025 true 00:37:50.025 true 00:37:50.025 [2024-07-23 08:51:02.419936] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: 
Found key "key0" 00:37:50.025 crypto0 00:37:50.025 [2024-07-23 08:51:02.427957] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:37:50.025 crypto1 00:37:50.025 08:51:02 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:37:50.025 08:51:02 chaining -- bdev/chaining.sh@145 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:37:50.025 Running I/O for 5 seconds... 00:37:55.295 00:37:55.295 Latency(us) 00:37:55.295 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:37:55.295 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:37:55.295 Verification LBA range: start 0x0 length 0x2000 00:37:55.295 crypto1 : 5.01 15594.94 60.92 0.00 0.00 16377.71 4493.90 12795.12 00:37:55.295 =================================================================================================================== 00:37:55.295 Total : 15594.94 60.92 0.00 0.00 16377.71 4493.90 12795.12 00:37:55.295 0 00:37:55.295 08:51:07 chaining -- bdev/chaining.sh@146 -- # killprocess 1708696 00:37:55.295 08:51:07 chaining -- common/autotest_common.sh@948 -- # '[' -z 1708696 ']' 00:37:55.295 08:51:07 chaining -- common/autotest_common.sh@952 -- # kill -0 1708696 00:37:55.295 08:51:07 chaining -- common/autotest_common.sh@953 -- # uname 00:37:55.295 08:51:07 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:37:55.295 08:51:07 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1708696 00:37:55.295 08:51:07 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:37:55.295 08:51:07 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:37:55.295 08:51:07 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1708696' 00:37:55.295 killing process with pid 1708696 00:37:55.295 08:51:07 chaining -- common/autotest_common.sh@967 -- # kill 1708696 00:37:55.295 Received shutdown 
signal, test time was about 5.000000 seconds 00:37:55.295 00:37:55.295 Latency(us) 00:37:55.295 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:37:55.295 =================================================================================================================== 00:37:55.295 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:37:55.295 08:51:07 chaining -- common/autotest_common.sh@972 -- # wait 1708696 00:37:56.693 08:51:08 chaining -- bdev/chaining.sh@152 -- # bperfpid=1710190 00:37:56.693 08:51:08 chaining -- bdev/chaining.sh@154 -- # waitforlisten 1710190 00:37:56.693 08:51:08 chaining -- bdev/chaining.sh@151 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:37:56.693 08:51:08 chaining -- common/autotest_common.sh@829 -- # '[' -z 1710190 ']' 00:37:56.693 08:51:08 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:37:56.693 08:51:08 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:37:56.693 08:51:08 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:37:56.693 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:37:56.693 08:51:08 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:37:56.693 08:51:08 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:56.693 [2024-07-23 08:51:08.986789] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 
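The jq filter logged at chaining.sh@32 earlier in this run splices a `bdev_set_options` entry (disabling `bdev_auto_examine`) onto the end of the generated subsystem config before feeding it to spdk_dd. A minimal Python sketch of that same transformation, assuming the config shape shown in the log (the helper name is illustrative, not part of the SPDK test scripts):

```python
import json

def append_bdev_set_options(config_json: str) -> str:
    # Mirror the jq filter:
    #   .subsystems[0].config[.subsystems[0].config | length] |=
    #     {"method": "bdev_set_options",
    #      "params": {"bdev_auto_examine": false}}
    # i.e. append one more RPC entry to the first subsystem's config list.
    config = json.loads(config_json)
    config["subsystems"][0]["config"].append(
        {"method": "bdev_set_options",
         "params": {"bdev_auto_examine": False}}
    )
    return json.dumps(config, indent=2)

# Base config as echoed in the log (bdev_nvme_attach_controller only).
base = json.dumps({
    "subsystems": [
        {"subsystem": "bdev",
         "config": [
             {"method": "bdev_nvme_attach_controller",
              "params": {"trtype": "tcp", "adrfam": "IPv4", "name": "Nvme0",
                         "subnqn": "nqn.2016-06.io.spdk:cnode0",
                         "traddr": "10.0.0.2", "trsvcid": "4420"}}
         ]}
    ]
})
result = json.loads(append_bdev_set_options(base))
```

The test itself performs this with jq in-shell; the sketch only shows the shape of the edit the filter makes.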
00:37:56.693 [2024-07-23 08:51:08.986885] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1710190 ] 00:37:56.693 [2024-07-23 08:51:09.109578] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:37:56.950 [2024-07-23 08:51:09.322913] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:37:57.515 08:51:09 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:37:57.515 08:51:09 chaining -- common/autotest_common.sh@862 -- # return 0 00:37:57.515 08:51:09 chaining -- bdev/chaining.sh@155 -- # rpc_cmd 00:37:57.515 08:51:09 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:37:57.515 08:51:09 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:57.773 malloc0 00:37:57.773 true 00:37:57.773 true 00:37:57.773 [2024-07-23 08:51:10.224271] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:37:57.773 [2024-07-23 08:51:10.224323] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:37:57.773 [2024-07-23 08:51:10.224343] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000034880 00:37:57.773 [2024-07-23 08:51:10.224354] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:37:57.773 [2024-07-23 08:51:10.225540] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:37:57.773 [2024-07-23 08:51:10.225571] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:37:57.773 pt0 00:37:57.773 [2024-07-23 08:51:10.232303] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:37:57.773 crypto0 00:37:57.773 [2024-07-23 08:51:10.240330] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:37:57.773 crypto1 00:37:57.773 08:51:10 
chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:37:57.773 08:51:10 chaining -- bdev/chaining.sh@166 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:37:58.031 Running I/O for 5 seconds... 00:38:03.296 00:38:03.296 Latency(us) 00:38:03.296 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:38:03.296 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:38:03.296 Verification LBA range: start 0x0 length 0x2000 00:38:03.296 crypto1 : 5.02 12146.29 47.45 0.00 0.00 21028.54 4649.94 15478.98 00:38:03.296 =================================================================================================================== 00:38:03.296 Total : 12146.29 47.45 0.00 0.00 21028.54 4649.94 15478.98 00:38:03.296 0 00:38:03.296 08:51:15 chaining -- bdev/chaining.sh@167 -- # killprocess 1710190 00:38:03.296 08:51:15 chaining -- common/autotest_common.sh@948 -- # '[' -z 1710190 ']' 00:38:03.296 08:51:15 chaining -- common/autotest_common.sh@952 -- # kill -0 1710190 00:38:03.296 08:51:15 chaining -- common/autotest_common.sh@953 -- # uname 00:38:03.296 08:51:15 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:38:03.296 08:51:15 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1710190 00:38:03.296 08:51:15 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:38:03.296 08:51:15 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:38:03.296 08:51:15 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1710190' 00:38:03.296 killing process with pid 1710190 00:38:03.296 08:51:15 chaining -- common/autotest_common.sh@967 -- # kill 1710190 00:38:03.296 Received shutdown signal, test time was about 5.000000 seconds 00:38:03.296 00:38:03.296 Latency(us) 00:38:03.296 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:38:03.296 
=================================================================================================================== 00:38:03.296 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:38:03.296 08:51:15 chaining -- common/autotest_common.sh@972 -- # wait 1710190 00:38:04.230 08:51:16 chaining -- bdev/chaining.sh@169 -- # trap - SIGINT SIGTERM EXIT 00:38:04.230 08:51:16 chaining -- bdev/chaining.sh@170 -- # killprocess 1710190 00:38:04.230 08:51:16 chaining -- common/autotest_common.sh@948 -- # '[' -z 1710190 ']' 00:38:04.230 08:51:16 chaining -- common/autotest_common.sh@952 -- # kill -0 1710190 00:38:04.230 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh: line 952: kill: (1710190) - No such process 00:38:04.230 08:51:16 chaining -- common/autotest_common.sh@975 -- # echo 'Process with pid 1710190 is not found' 00:38:04.230 Process with pid 1710190 is not found 00:38:04.230 08:51:16 chaining -- bdev/chaining.sh@171 -- # wait 1710190 00:38:04.230 08:51:16 chaining -- bdev/chaining.sh@175 -- # nvmftestinit 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:38:04.231 08:51:16 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:38:04.231 08:51:16 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:38:04.231 08:51:16 chaining 
-- common/autotest_common.sh@10 -- # set +x 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@296 -- # e810=() 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@297 -- # x722=() 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@298 -- # mlx=() 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:38:04.231 08:51:16 
chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@335 -- # (( 2 == 0 )) 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:18:00.0 (0x8086 - 0x159b)' 00:38:04.231 Found 0000:18:00.0 (0x8086 - 0x159b) 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:18:00.1 (0x8086 - 0x159b)' 00:38:04.231 Found 0000:18:00.1 (0x8086 - 0x159b) 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:38:04.231 08:51:16 chaining -- 
nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@366 -- # (( 0 > 0 )) 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@372 -- # [[ '' == e810 ]] 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:18:00.0: cvl_0_0' 00:38:04.231 Found net devices under 0000:18:00.0: cvl_0_0 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:18:00.1: cvl_0_1' 00:38:04.231 Found net devices under 0000:18:00.1: cvl_0_1 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@404 -- # (( 2 == 0 )) 
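The `gather_supported_nvmf_pci_devs` walk above buckets candidate NICs into the `e810`, `x722`, and `mlx` arrays by PCI vendor/device ID before picking test interfaces. A sketch of that lookup with the ID table copied from the log (the function name is hypothetical; the real logic lives in nvmf/common.sh):

```python
# PCI IDs as populated in nvmf/common.sh@296-318 of this run.
E810 = {"0x1592", "0x159b"}                       # Intel E810 family
X722 = {"0x37d2"}                                 # Intel X722
MLX = {"0xa2dc", "0x1021", "0xa2d6", "0x101d",    # Mellanox family
       "0x1017", "0x1019", "0x1015", "0x1013"}

def classify_nic(vendor: str, device: str) -> str:
    """Bucket a PCI NIC the way the script's array setup does."""
    if vendor == "0x8086" and device in E810:
        return "e810"
    if vendor == "0x8086" and device in X722:
        return "x722"
    if vendor == "0x15b3" and device in MLX:
        return "mlx"
    return "unknown"
```

The two devices found in this run, `0000:18:00.0` and `0000:18:00.1` (`0x8086 - 0x159b`, `ice` driver), both fall in the e810 bucket.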
00:38:04.231 08:51:16 chaining -- nvmf/common.sh@414 -- # is_hw=yes 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@416 -- # [[ yes == yes ]] 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@417 -- # [[ tcp == tcp ]] 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@418 -- # nvmf_tcp_init 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@229 -- # NVMF_INITIATOR_IP=10.0.0.1 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@230 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@231 -- # TCP_INTERFACE_LIST=("${net_devs[@]}") 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@234 -- # (( 2 > 1 )) 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@236 -- # NVMF_TARGET_INTERFACE=cvl_0_0 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@237 -- # NVMF_INITIATOR_INTERFACE=cvl_0_1 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@240 -- # NVMF_SECOND_TARGET_IP= 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@242 -- # NVMF_TARGET_NAMESPACE=cvl_0_0_ns_spdk 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@243 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:38:04.231 08:51:16 chaining -- nvmf/common.sh@244 -- # ip -4 addr flush cvl_0_0 00:38:04.490 08:51:16 chaining -- nvmf/common.sh@245 -- # ip -4 addr flush cvl_0_1 00:38:04.490 08:51:16 chaining -- nvmf/common.sh@248 -- # ip netns add cvl_0_0_ns_spdk 00:38:04.490 08:51:16 chaining -- nvmf/common.sh@251 -- # ip link set cvl_0_0 netns cvl_0_0_ns_spdk 00:38:04.490 08:51:16 chaining -- nvmf/common.sh@254 -- # ip addr add 10.0.0.1/24 dev cvl_0_1 00:38:04.490 08:51:16 chaining -- nvmf/common.sh@255 -- # ip netns exec cvl_0_0_ns_spdk ip addr add 10.0.0.2/24 dev cvl_0_0 00:38:04.490 08:51:16 chaining -- nvmf/common.sh@258 -- # ip link set cvl_0_1 up 00:38:04.490 08:51:16 chaining -- nvmf/common.sh@260 -- # ip netns exec cvl_0_0_ns_spdk ip link set cvl_0_0 up 00:38:04.490 08:51:16 chaining -- nvmf/common.sh@261 -- # ip netns exec cvl_0_0_ns_spdk ip link set lo up 00:38:04.490 08:51:16 
chaining -- nvmf/common.sh@264 -- # iptables -I INPUT 1 -i cvl_0_1 -p tcp --dport 4420 -j ACCEPT 00:38:04.490 08:51:16 chaining -- nvmf/common.sh@267 -- # ping -c 1 10.0.0.2 00:38:04.490 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:38:04.490 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.153 ms 00:38:04.490 00:38:04.490 --- 10.0.0.2 ping statistics --- 00:38:04.490 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:38:04.490 rtt min/avg/max/mdev = 0.153/0.153/0.153/0.000 ms 00:38:04.490 08:51:16 chaining -- nvmf/common.sh@268 -- # ip netns exec cvl_0_0_ns_spdk ping -c 1 10.0.0.1 00:38:04.490 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:38:04.490 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.114 ms 00:38:04.490 00:38:04.490 --- 10.0.0.1 ping statistics --- 00:38:04.490 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:38:04.490 rtt min/avg/max/mdev = 0.114/0.114/0.114/0.000 ms 00:38:04.490 08:51:16 chaining -- nvmf/common.sh@270 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:38:04.490 08:51:16 chaining -- nvmf/common.sh@422 -- # return 0 00:38:04.490 08:51:16 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:38:04.490 08:51:16 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:38:04.490 08:51:16 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:38:04.490 08:51:16 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:38:04.490 08:51:16 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:38:04.490 08:51:16 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:38:04.490 08:51:16 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:38:04.749 08:51:17 chaining -- bdev/chaining.sh@176 -- # nvmfappstart -m 0x2 00:38:04.749 08:51:17 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:38:04.749 08:51:17 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:38:04.749 08:51:17 chaining -- common/autotest_common.sh@10 -- # set +x 
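The two `ping -c 1` checks above verify target/initiator reachability across the netns split and print an rtt summary line. A small parser for that summary format, useful when triaging such logs (hypothetical helper, not part of the test scripts; the sample string is taken verbatim from this run):

```python
import re

def parse_ping_rtt(summary: str):
    """Extract (min, avg, max, mdev) in ms from a ping summary line,
    e.g. 'rtt min/avg/max/mdev = 0.153/0.153/0.153/0.000 ms'."""
    m = re.search(
        r"rtt min/avg/max/mdev = ([\d.]+)/([\d.]+)/([\d.]+)/([\d.]+) ms",
        summary)
    return tuple(float(x) for x in m.groups()) if m else None

sample = "rtt min/avg/max/mdev = 0.153/0.153/0.153/0.000 ms"
rtt = parse_ping_rtt(sample)
```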
00:38:04.749 08:51:17 chaining -- nvmf/common.sh@481 -- # nvmfpid=1711576 00:38:04.749 08:51:17 chaining -- nvmf/common.sh@482 -- # waitforlisten 1711576 00:38:04.749 08:51:17 chaining -- nvmf/common.sh@480 -- # ip netns exec cvl_0_0_ns_spdk ip netns exec cvl_0_0_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:38:04.749 08:51:17 chaining -- common/autotest_common.sh@829 -- # '[' -z 1711576 ']' 00:38:04.749 08:51:17 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:38:04.749 08:51:17 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:38:04.749 08:51:17 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:38:04.749 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:38:04.749 08:51:17 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:38:04.749 08:51:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:04.749 [2024-07-23 08:51:17.112981] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:38:04.749 [2024-07-23 08:51:17.113074] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:38:04.749 [2024-07-23 08:51:17.248012] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:38:05.007 [2024-07-23 08:51:17.461516] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:38:05.007 [2024-07-23 08:51:17.461561] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 
00:38:05.007 [2024-07-23 08:51:17.461573] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:38:05.007 [2024-07-23 08:51:17.461587] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:38:05.007 [2024-07-23 08:51:17.461596] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:38:05.007 [2024-07-23 08:51:17.461626] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:38:05.574 08:51:17 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:38:05.574 08:51:17 chaining -- common/autotest_common.sh@862 -- # return 0 00:38:05.574 08:51:17 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:38:05.574 08:51:17 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:38:05.574 08:51:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:05.574 08:51:17 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:38:05.574 08:51:17 chaining -- bdev/chaining.sh@178 -- # rpc_cmd 00:38:05.574 08:51:17 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:38:05.574 08:51:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:05.574 malloc0 00:38:05.574 [2024-07-23 08:51:17.966109] tcp.c: 677:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:38:05.574 [2024-07-23 08:51:17.982286] tcp.c:1006:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:38:05.574 08:51:17 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:38:05.574 08:51:17 chaining -- bdev/chaining.sh@186 -- # trap 'bperfcleanup || :; nvmftestfini || :; exit 1' SIGINT SIGTERM EXIT 00:38:05.574 08:51:17 chaining -- bdev/chaining.sh@189 -- # bperfpid=1711786 00:38:05.574 08:51:17 chaining -- bdev/chaining.sh@191 -- # waitforlisten 1711786 /var/tmp/bperf.sock 00:38:05.574 08:51:17 chaining -- bdev/chaining.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:38:05.574 08:51:17 chaining -- common/autotest_common.sh@829 -- # '[' -z 1711786 ']' 00:38:05.574 08:51:17 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:38:05.574 08:51:17 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:38:05.574 08:51:17 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:38:05.574 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:38:05.574 08:51:17 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:38:05.574 08:51:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:05.574 [2024-07-23 08:51:18.070484] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization... 00:38:05.574 [2024-07-23 08:51:18.070578] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1711786 ] 00:38:05.832 [2024-07-23 08:51:18.192665] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:38:06.090 [2024-07-23 08:51:18.410804] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:38:06.347 08:51:18 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:38:06.347 08:51:18 chaining -- common/autotest_common.sh@862 -- # return 0 00:38:06.347 08:51:18 chaining -- bdev/chaining.sh@192 -- # rpc_bperf 00:38:06.347 08:51:18 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:38:07.280 [2024-07-23 08:51:19.486654] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:38:07.280 nvme0n1 
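`waitforlisten` above blocks until bdevperf is up and listening on `/var/tmp/bperf.sock` before any RPC is sent. A sketch of that polling pattern, under the simplifying assumption that the socket path appearing on disk is a good-enough readiness signal (the real helper in autotest_common.sh also retries the RPC itself):

```python
import os
import time

def wait_for_socket(path: str, timeout: float = 5.0,
                    interval: float = 0.1) -> bool:
    """Poll until a UNIX-domain socket path exists, or time out."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if os.path.exists(path):
            return True
        time.sleep(interval)
    return False

# os.devnull stands in for an already-listening socket path here.
found = wait_for_socket(os.devnull, timeout=0.5)
missing = wait_for_socket("/no/such/bperf.sock", timeout=0.3)
```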
00:38:07.280 true
00:38:07.280 crypto0
00:38:07.280 08:51:19 chaining -- bdev/chaining.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests
00:38:07.280 Running I/O for 5 seconds...
00:38:12.542
00:38:12.542 Latency(us)
00:38:12.542 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:38:12.542 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096)
00:38:12.542 Verification LBA range: start 0x0 length 0x2000
00:38:12.542 crypto0 : 5.02 11139.39 43.51 0.00 0.00 22915.83 3308.01 19348.72
00:38:12.543 ===================================================================================================================
00:38:12.543 Total : 11139.39 43.51 0.00 0.00 22915.83 3308.01 19348.72
00:38:12.543 0
00:38:12.543 08:51:24 chaining -- bdev/chaining.sh@205 -- # get_stat_bperf sequence_executed
00:38:12.543 08:51:24 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf
00:38:12.543 08:51:24 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:38:12.543 08:51:24 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed
00:38:12.543 08:51:24 chaining -- bdev/chaining.sh@39 -- # opcode=
00:38:12.543 08:51:24 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf
00:38:12.543 08:51:24 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]]
00:38:12.543 08:51:24 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats
00:38:12.543 08:51:24 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed
00:38:12.543 08:51:24 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats
00:38:12.543 08:51:24 chaining -- bdev/chaining.sh@205 -- # sequence=111784
00:38:12.543 08:51:24 chaining -- bdev/chaining.sh@206 -- # get_stat_bperf executed encrypt
00:38:12.543 08:51:24 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf
00:38:12.543 08:51:24 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:38:12.543 08:51:24 chaining -- bdev/chaining.sh@39 -- # event=executed
00:38:12.543 08:51:24 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt
00:38:12.543 08:51:24 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf
00:38:12.543 08:51:24 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]]
00:38:12.543 08:51:24 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats
00:38:12.543 08:51:24 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed'
00:38:12.543 08:51:24 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats
00:38:12.543 08:51:25 chaining -- bdev/chaining.sh@206 -- # encrypt=55892
00:38:12.543 08:51:25 chaining -- bdev/chaining.sh@207 -- # get_stat_bperf executed decrypt
00:38:12.543 08:51:25 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf
00:38:12.543 08:51:25 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:38:12.543 08:51:25 chaining -- bdev/chaining.sh@39 -- # event=executed
00:38:12.543 08:51:25 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt
00:38:12.543 08:51:25 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf
00:38:12.543 08:51:25 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]]
00:38:12.543 08:51:25 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats
00:38:12.543 08:51:25 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats
00:38:12.543 08:51:25 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed'
00:38:12.801 08:51:25 chaining -- bdev/chaining.sh@207 -- # decrypt=55892
00:38:12.801 08:51:25 chaining -- bdev/chaining.sh@208 -- # get_stat_bperf executed crc32c
00:38:12.801 08:51:25 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf
00:38:12.801 08:51:25 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:38:12.801 08:51:25 chaining -- bdev/chaining.sh@39 -- # event=executed
00:38:12.801 08:51:25 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c
00:38:12.801 08:51:25 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf
00:38:12.801 08:51:25 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]]
00:38:12.801 08:51:25 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats
00:38:12.801 08:51:25 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats
00:38:12.801 08:51:25 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed'
00:38:13.060 08:51:25 chaining -- bdev/chaining.sh@208 -- # crc32c=111784
00:38:13.060 08:51:25 chaining -- bdev/chaining.sh@210 -- # (( sequence > 0 ))
00:38:13.060 08:51:25 chaining -- bdev/chaining.sh@211 -- # (( encrypt + decrypt == sequence ))
00:38:13.060 08:51:25 chaining -- bdev/chaining.sh@212 -- # (( encrypt + decrypt == crc32c ))
00:38:13.060 08:51:25 chaining -- bdev/chaining.sh@214 -- # killprocess 1711786
00:38:13.060 08:51:25 chaining -- common/autotest_common.sh@948 -- # '[' -z 1711786 ']'
00:38:13.060 08:51:25 chaining -- common/autotest_common.sh@952 -- # kill -0 1711786
00:38:13.060 08:51:25 chaining -- common/autotest_common.sh@953 -- # uname
00:38:13.060 08:51:25 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:38:13.060 08:51:25 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1711786
00:38:13.060 08:51:25 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:38:13.060 08:51:25 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:38:13.060 08:51:25 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1711786'
killing process with pid 1711786
00:38:13.060 08:51:25 chaining -- common/autotest_common.sh@967 -- # kill 1711786
00:38:13.060 Received shutdown signal, test time was about 5.000000 seconds
00:38:13.060
00:38:13.060 Latency(us)
00:38:13.060 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:38:13.060 ===================================================================================================================
00:38:13.060 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00
00:38:13.060 08:51:25 chaining -- common/autotest_common.sh@972 -- # wait 1711786
00:38:14.436 08:51:26 chaining -- bdev/chaining.sh@219 -- # bperfpid=1713320
00:38:14.436 08:51:26 chaining -- bdev/chaining.sh@217 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 65536 -q 32 --wait-for-rpc -z
00:38:14.436 08:51:26 chaining -- bdev/chaining.sh@221 -- # waitforlisten 1713320 /var/tmp/bperf.sock
00:38:14.436 08:51:26 chaining -- common/autotest_common.sh@829 -- # '[' -z 1713320 ']'
00:38:14.436 08:51:26 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock
00:38:14.436 08:51:26 chaining -- common/autotest_common.sh@834 -- # local max_retries=100
00:38:14.436 08:51:26 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...
00:38:14.436 08:51:26 chaining -- common/autotest_common.sh@838 -- # xtrace_disable
00:38:14.436 08:51:26 chaining -- common/autotest_common.sh@10 -- # set +x
00:38:14.436 [2024-07-23 08:51:26.780370] Starting SPDK v24.09-pre git sha1 f7b31b2b9 / DPDK 24.03.0 initialization...
00:38:14.436 [2024-07-23 08:51:26.780483] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1713320 ]
00:38:14.436 [2024-07-23 08:51:26.904301] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:38:14.695 [2024-07-23 08:51:27.138136] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:38:15.261 08:51:27 chaining -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:38:15.261 08:51:27 chaining -- common/autotest_common.sh@862 -- # return 0
00:38:15.261 08:51:27 chaining -- bdev/chaining.sh@222 -- # rpc_bperf
00:38:15.261 08:51:27 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock
00:38:15.828 [2024-07-23 08:51:28.161280] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0"
00:38:15.828 nvme0n1
00:38:15.828 true
00:38:15.828 crypto0
00:38:15.828 08:51:28 chaining -- bdev/chaining.sh@231 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests
00:38:15.828 Running I/O for 5 seconds...
00:38:21.132
00:38:21.132 Latency(us)
00:38:21.132 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:38:21.132 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:38:21.132 Verification LBA range: start 0x0 length 0x200
00:38:21.132 crypto0 : 5.01 2217.07 138.57 0.00 0.00 14159.21 1427.75 15166.90
00:38:21.132 ===================================================================================================================
00:38:21.132 Total : 2217.07 138.57 0.00 0.00 14159.21 1427.75 15166.90
00:38:21.132 0
00:38:21.132 08:51:33 chaining -- bdev/chaining.sh@233 -- # get_stat_bperf sequence_executed
00:38:21.132 08:51:33 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf
00:38:21.132 08:51:33 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:38:21.132 08:51:33 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed
00:38:21.132 08:51:33 chaining -- bdev/chaining.sh@39 -- # opcode=
00:38:21.132 08:51:33 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf
00:38:21.132 08:51:33 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]]
00:38:21.132 08:51:33 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats
00:38:21.132 08:51:33 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed
00:38:21.132 08:51:33 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats
00:38:21.132 08:51:33 chaining -- bdev/chaining.sh@233 -- # sequence=22196
00:38:21.132 08:51:33 chaining -- bdev/chaining.sh@234 -- # get_stat_bperf executed encrypt
00:38:21.132 08:51:33 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf
00:38:21.132 08:51:33 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:38:21.132 08:51:33 chaining -- bdev/chaining.sh@39 -- # event=executed
00:38:21.132 08:51:33 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt
00:38:21.132 08:51:33 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf
00:38:21.132 08:51:33 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]]
00:38:21.132 08:51:33 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats
00:38:21.132 08:51:33 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats
00:38:21.132 08:51:33 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed'
00:38:21.391 08:51:33 chaining -- bdev/chaining.sh@234 -- # encrypt=11098
00:38:21.391 08:51:33 chaining -- bdev/chaining.sh@235 -- # get_stat_bperf executed decrypt
00:38:21.391 08:51:33 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf
00:38:21.391 08:51:33 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:38:21.391 08:51:33 chaining -- bdev/chaining.sh@39 -- # event=executed
00:38:21.391 08:51:33 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt
00:38:21.391 08:51:33 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf
00:38:21.391 08:51:33 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]]
00:38:21.391 08:51:33 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats
00:38:21.391 08:51:33 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats
00:38:21.391 08:51:33 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed'
00:38:21.391 08:51:33 chaining -- bdev/chaining.sh@235 -- # decrypt=11098
00:38:21.391 08:51:33 chaining -- bdev/chaining.sh@236 -- # get_stat_bperf executed crc32c
00:38:21.391 08:51:33 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf
00:38:21.391 08:51:33 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:38:21.391 08:51:33 chaining -- bdev/chaining.sh@39 -- # event=executed
00:38:21.391 08:51:33 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c
00:38:21.391 08:51:33 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf
00:38:21.391 08:51:33 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]]
00:38:21.391 08:51:33 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats
00:38:21.391 08:51:33 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats
00:38:21.391 08:51:33 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed'
00:38:21.649 08:51:34 chaining -- bdev/chaining.sh@236 -- # crc32c=22196
00:38:21.649 08:51:34 chaining -- bdev/chaining.sh@238 -- # (( sequence > 0 ))
00:38:21.649 08:51:34 chaining -- bdev/chaining.sh@239 -- # (( encrypt + decrypt == sequence ))
00:38:21.649 08:51:34 chaining -- bdev/chaining.sh@240 -- # (( encrypt + decrypt == crc32c ))
00:38:21.649 08:51:34 chaining -- bdev/chaining.sh@242 -- # killprocess 1713320
00:38:21.649 08:51:34 chaining -- common/autotest_common.sh@948 -- # '[' -z 1713320 ']'
00:38:21.649 08:51:34 chaining -- common/autotest_common.sh@952 -- # kill -0 1713320
00:38:21.649 08:51:34 chaining -- common/autotest_common.sh@953 -- # uname
00:38:21.649 08:51:34 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:38:21.649 08:51:34 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1713320
00:38:21.650 08:51:34 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:38:21.650 08:51:34 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:38:21.650 08:51:34 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1713320'
killing process with pid 1713320
00:38:21.650 08:51:34 chaining -- common/autotest_common.sh@967 -- # kill 1713320
00:38:21.650 Received shutdown signal, test time was about 5.000000 seconds
00:38:21.650
00:38:21.650 Latency(us)
00:38:21.650 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:38:21.650 ===================================================================================================================
00:38:21.650 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00
00:38:21.650 08:51:34 chaining -- common/autotest_common.sh@972 -- # wait 1713320
00:38:23.024 08:51:35 chaining -- bdev/chaining.sh@243 -- # nvmftestfini
00:38:23.024 08:51:35 chaining -- nvmf/common.sh@488 -- # nvmfcleanup
00:38:23.024 08:51:35 chaining -- nvmf/common.sh@117 -- # sync
00:38:23.024 08:51:35 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']'
00:38:23.024 08:51:35 chaining -- nvmf/common.sh@120 -- # set +e
00:38:23.024 08:51:35 chaining -- nvmf/common.sh@121 -- # for i in {1..20}
00:38:23.024 08:51:35 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp
rmmod nvme_tcp
rmmod nvme_fabrics
rmmod nvme_keyring
00:38:23.024 08:51:35 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics
00:38:23.024 08:51:35 chaining -- nvmf/common.sh@124 -- # set -e
00:38:23.024 08:51:35 chaining -- nvmf/common.sh@125 -- # return 0
00:38:23.024 08:51:35 chaining -- nvmf/common.sh@489 -- # '[' -n 1711576 ']'
00:38:23.024 08:51:35 chaining -- nvmf/common.sh@490 -- # killprocess 1711576
00:38:23.024 08:51:35 chaining -- common/autotest_common.sh@948 -- # '[' -z 1711576 ']'
00:38:23.024 08:51:35 chaining -- common/autotest_common.sh@952 -- # kill -0 1711576
00:38:23.024 08:51:35 chaining -- common/autotest_common.sh@953 -- # uname
00:38:23.024 08:51:35 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:38:23.024 08:51:35 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1711576
00:38:23.282 08:51:35 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1
00:38:23.283 08:51:35 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']'
00:38:23.283 08:51:35 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1711576'
killing process with pid 1711576
00:38:23.283 08:51:35 chaining -- common/autotest_common.sh@967 -- # kill 1711576
00:38:23.283 08:51:35 chaining -- common/autotest_common.sh@972 -- # wait 1711576
00:38:24.660 08:51:36 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']'
00:38:24.660 08:51:36 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]]
00:38:24.660 08:51:36 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini
00:38:24.660 08:51:36 chaining -- nvmf/common.sh@274 -- # [[ cvl_0_0_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]]
00:38:24.660 08:51:36 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns
00:38:24.660 08:51:36 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns
00:38:24.660 08:51:36 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null'
00:38:24.660 08:51:36 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns
00:38:26.563 08:51:38 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush cvl_0_1
00:38:26.563 08:51:38 chaining -- bdev/chaining.sh@245 -- # trap - SIGINT SIGTERM EXIT
00:38:26.563
00:38:26.563 real 1m1.789s
00:38:26.563 user 1m22.224s
00:38:26.563 sys 0m10.458s
00:38:26.563 08:51:38 chaining -- common/autotest_common.sh@1124 -- # xtrace_disable
00:38:26.563 08:51:38 chaining -- common/autotest_common.sh@10 -- # set +x
00:38:26.563 ************************************
00:38:26.563 END TEST chaining
00:38:26.563 ************************************
00:38:26.563 08:51:38 -- common/autotest_common.sh@1142 -- # return 0
00:38:26.563 08:51:38 -- spdk/autotest.sh@363 -- # [[ 0 -eq 1 ]]
00:38:26.563 08:51:38 -- spdk/autotest.sh@367 -- # [[ 0 -eq 1 ]]
00:38:26.563 08:51:38 -- spdk/autotest.sh@371 -- # [[ 0 -eq 1 ]]
00:38:26.563 08:51:38 -- spdk/autotest.sh@375 -- # [[ 0 -eq 1 ]]
00:38:26.563 08:51:38 -- spdk/autotest.sh@380 -- # trap - SIGINT SIGTERM EXIT
00:38:26.563 08:51:38 -- spdk/autotest.sh@382 -- # timing_enter post_cleanup
00:38:26.563 08:51:38 -- common/autotest_common.sh@722 -- # xtrace_disable
00:38:26.563 08:51:38 -- common/autotest_common.sh@10 -- # set +x
00:38:26.563 08:51:38 -- spdk/autotest.sh@383 -- # autotest_cleanup
00:38:26.563 08:51:38 -- common/autotest_common.sh@1392 -- # local autotest_es=0
00:38:26.563 08:51:38 -- common/autotest_common.sh@1393 -- # xtrace_disable
00:38:26.563 08:51:38 -- common/autotest_common.sh@10 -- # set +x
00:38:30.749 INFO: APP EXITING
00:38:30.749 INFO: killing all VMs
00:38:30.749 INFO: killing vhost app
00:38:30.749 INFO: EXIT DONE
00:38:33.280 Waiting for block devices as requested
00:38:33.280 0000:60:00.0 (8086 0a54): vfio-pci -> nvme
00:38:33.280 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma
00:38:33.280 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma
00:38:33.280 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma
00:38:33.280 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma
00:38:33.280 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma
00:38:33.280 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma
00:38:33.538 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma
00:38:33.538 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma
00:38:33.538 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma
00:38:33.538 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma
00:38:33.797 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma
00:38:33.797 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma
00:38:33.797 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma
00:38:34.055 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma
00:38:34.055 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma
00:38:34.055 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma
00:38:37.340 Cleaning
00:38:37.340 Removing: /var/run/dpdk/spdk0/config
00:38:37.340 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-0
00:38:37.340 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-1
00:38:37.340 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-2
00:38:37.340 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-3
00:38:37.340 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-0
00:38:37.340 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-1
00:38:37.340 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-2
00:38:37.340 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-3
00:38:37.340 Removing: /var/run/dpdk/spdk0/fbarray_memzone
00:38:37.340 Removing: /var/run/dpdk/spdk0/hugepage_info
00:38:37.341 Removing: /dev/shm/nvmf_trace.0
00:38:37.341 Removing: /dev/shm/spdk_tgt_trace.pid1324369
00:38:37.341 Removing: /var/run/dpdk/spdk0
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1317518
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1321063
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1324369
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1325399
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1326901
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1327668
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1329216
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1329467
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1330302
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1335271
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1338373
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1338926
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1339735
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1340567
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1341379
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1341865
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1342172
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1342489
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1343735
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1347344
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1347782
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1348118
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1348942
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1349397
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1349729
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1350250
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1350772
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1351294
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1351820
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1352343
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1352867
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1353389
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1353919
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1354437
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1354960
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1355482
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1356004
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1356535
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1357055
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1357596
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1358157
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1358720
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1359306
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1359844
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1360358
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1361180
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1361779
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1362444
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1362979
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1363518
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1364222
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1364797
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1365338
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1365882
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1366724
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1367683
00:38:37.341 Removing: /var/run/dpdk/spdk_pid1368879
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1369648
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1374817
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1377694
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1380557
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1382325
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1384347
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1385350
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1385402
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1385663
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1391266
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1392064
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1393813
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1394359
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1401661
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1403641
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1404969
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1410201
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1412205
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1413397
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1418709
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1421533
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1422672
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1433787
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1436317
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1437693
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1448838
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1451398
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1452763
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1463880
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1467766
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1469180
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1481726
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1484584
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1485977
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1498500
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1501486
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1502930
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1515457
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1519736
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1521322
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1522702
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1526451
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1532444
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1535690
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1541387
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1545644
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1551959
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1555715
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1563601
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1566325
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1573817
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1576572
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1583899
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1586707
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1591731
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1592499
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1593470
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1594250
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1595174
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1596346
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1597518
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1598161
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1600966
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1603981
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1606791
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1609309
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1618958
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1625304
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1628178
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1631072
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1633902
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1636193
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1645624
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1652183
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1653706
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1654740
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1658491
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1661607
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1664731
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1666766
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1668805
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1670089
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1670338
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1670639
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1671422
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1671924
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1673629
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1676014
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1678400
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1679812
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1681179
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1681769
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1681980
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1682228
00:38:37.600 Removing: /var/run/dpdk/spdk_pid1683569
00:38:37.859 Removing: /var/run/dpdk/spdk_pid1685092
00:38:37.859 Removing: /var/run/dpdk/spdk_pid1686131
00:38:37.859 Removing: /var/run/dpdk/spdk_pid1689854
00:38:37.859 Removing: /var/run/dpdk/spdk_pid1692980
00:38:37.859 Removing: /var/run/dpdk/spdk_pid1696069
00:38:37.859 Removing: /var/run/dpdk/spdk_pid1698044
00:38:37.859 Removing: /var/run/dpdk/spdk_pid1700136
00:38:37.859 Removing: /var/run/dpdk/spdk_pid1701409
00:38:37.859 Removing: /var/run/dpdk/spdk_pid1701579
00:38:37.859 Removing: /var/run/dpdk/spdk_pid1705576
00:38:37.859 Removing: /var/run/dpdk/spdk_pid1706040
00:38:37.859 Removing: /var/run/dpdk/spdk_pid1706565
00:38:37.859 Removing: /var/run/dpdk/spdk_pid1707082
00:38:37.859 Removing: /var/run/dpdk/spdk_pid1707617
00:38:37.859 Removing: /var/run/dpdk/spdk_pid1708696
00:38:37.859 Removing: /var/run/dpdk/spdk_pid1710190
00:38:37.859 Removing: /var/run/dpdk/spdk_pid1711786
00:38:37.859 Removing: /var/run/dpdk/spdk_pid1713320
00:38:37.859 Clean
00:38:37.859 08:51:50 -- common/autotest_common.sh@1451 -- # return 0
00:38:37.859 08:51:50 -- spdk/autotest.sh@384 -- # timing_exit post_cleanup
00:38:37.859 08:51:50 -- common/autotest_common.sh@728 -- # xtrace_disable
00:38:37.859 08:51:50 -- common/autotest_common.sh@10 -- # set +x
00:38:37.859 08:51:50 -- spdk/autotest.sh@386 -- # timing_exit autotest
00:38:37.859 08:51:50 -- common/autotest_common.sh@728 -- # xtrace_disable
00:38:37.859 08:51:50 -- common/autotest_common.sh@10 -- # set +x
00:38:37.859 08:51:50 -- spdk/autotest.sh@387 -- # chmod a+r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt
00:38:37.859 08:51:50 -- spdk/autotest.sh@389 -- # [[ -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log ]]
00:38:37.859 08:51:50 -- spdk/autotest.sh@389 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log
00:38:37.859 08:51:50 -- spdk/autotest.sh@391 -- # hash lcov
00:38:37.859 08:51:50 -- spdk/autotest.sh@391 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]]
00:38:37.859 08:51:50 -- spdk/autotest.sh@393 -- # hostname
00:38:37.859 08:51:50 -- spdk/autotest.sh@393 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -d /var/jenkins/workspace/crypto-phy-autotest/spdk -t spdk-wfp-23 -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info
00:38:38.147 geninfo: WARNING: invalid characters removed from testname!
00:39:00.082 08:52:09 -- spdk/autotest.sh@394 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:39:00.082 08:52:12 -- spdk/autotest.sh@395 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:39:01.462 08:52:13 -- spdk/autotest.sh@396 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '/usr/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:39:03.367 08:52:15 -- spdk/autotest.sh@397 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:39:04.744 08:52:17 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:39:06.648 08:52:18 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:39:08.025 08:52:20 -- spdk/autotest.sh@400 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR
00:39:08.284 08:52:20 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh
00:39:08.284 08:52:20 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]]
00:39:08.284 08:52:20 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:39:08.284 08:52:20 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:39:08.284 08:52:20 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:39:08.284 08:52:20 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:39:08.284 08:52:20 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:39:08.284 08:52:20 -- paths/export.sh@5 -- $ export PATH
00:39:08.284 08:52:20 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:39:08.284 08:52:20 -- common/autobuild_common.sh@446 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:39:08.284 08:52:20 -- common/autobuild_common.sh@447 -- $ date +%s
00:39:08.284 08:52:20 -- common/autobuild_common.sh@447 -- $ mktemp -dt spdk_1721717540.XXXXXX
00:39:08.284 08:52:20 -- common/autobuild_common.sh@447 -- $ SPDK_WORKSPACE=/tmp/spdk_1721717540.cUXBL6
00:39:08.284 08:52:20 -- common/autobuild_common.sh@449 -- $ [[ -n '' ]]
00:39:08.284 08:52:20 -- common/autobuild_common.sh@453 -- $ '[' -n '' ']'
00:39:08.284 08:52:20 -- common/autobuild_common.sh@456 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/'
00:39:08.284 08:52:20 -- common/autobuild_common.sh@460 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp'
00:39:08.284 08:52:20 -- common/autobuild_common.sh@462 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:39:08.284 08:52:20 -- common/autobuild_common.sh@463 -- $ get_config_params
00:39:08.284 08:52:20 -- common/autotest_common.sh@396 -- $ xtrace_disable
00:39:08.284 08:52:20 -- common/autotest_common.sh@10 -- $ set +x
00:39:08.284 08:52:20 -- common/autobuild_common.sh@463 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-asan --enable-coverage --with-ublk'
00:39:08.284 08:52:20 -- common/autobuild_common.sh@465 -- $ start_monitor_resources
00:39:08.284 08:52:20 -- pm/common@17 -- $ local monitor
00:39:08.284 08:52:20 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:39:08.284 08:52:20 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:39:08.284 08:52:20 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:39:08.284 08:52:20 -- pm/common@21 -- $ date +%s
00:39:08.284 08:52:20 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:39:08.284 08:52:20 -- pm/common@21 -- $ date +%s
00:39:08.284 08:52:20 -- pm/common@25 -- $ sleep 1
00:39:08.284 08:52:20 -- pm/common@21 -- $ date +%s
00:39:08.284 08:52:20 -- pm/common@21 -- $ date +%s
00:39:08.284 08:52:20 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721717540
00:39:08.284 08:52:20 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721717540
00:39:08.284 08:52:20 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721717540
00:39:08.284 08:52:20 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721717540
00:39:08.284 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721717540_collect-vmstat.pm.log
00:39:08.284 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721717540_collect-cpu-load.pm.log
00:39:08.284 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721717540_collect-cpu-temp.pm.log
00:39:08.284 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721717540_collect-bmc-pm.bmc.pm.log
00:39:09.222 08:52:21 -- common/autobuild_common.sh@466 -- $ trap stop_monitor_resources EXIT
00:39:09.222 08:52:21 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j96
00:39:09.222 08:52:21 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk
00:39:09.223 08:52:21 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]]
00:39:09.223 08:52:21 -- spdk/autopackage.sh@18 -- $ [[ 1 -eq 0 ]]
00:39:09.223 08:52:21 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]]
00:39:09.223 08:52:21 -- spdk/autopackage.sh@19 -- $ timing_finish
00:39:09.223 08:52:21 -- common/autotest_common.sh@734 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl
00:39:09.223 08:52:21 -- common/autotest_common.sh@735 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']'
00:39:09.223 08:52:21 -- common/autotest_common.sh@737 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt
00:39:09.223 08:52:21 -- spdk/autopackage.sh@20 -- $ exit 0
00:39:09.223 08:52:21 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources
00:39:09.223 08:52:21 -- pm/common@29 -- $ signal_monitor_resources TERM
00:39:09.223 08:52:21 -- pm/common@40 -- $ local monitor pid pids signal=TERM
00:39:09.223 08:52:21 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:39:09.223 08:52:21 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]]
00:39:09.223 08:52:21 -- pm/common@44 -- $ pid=1726885
00:39:09.223 08:52:21 -- pm/common@50 -- $ kill -TERM 1726885
00:39:09.223 08:52:21 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:39:09.223 08:52:21 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]]
00:39:09.223 08:52:21 -- pm/common@44 -- $ pid=1726886
00:39:09.223 08:52:21 -- pm/common@50 -- $ kill -TERM 1726886
00:39:09.223 08:52:21 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:39:09.223 08:52:21 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]]
00:39:09.223 08:52:21 -- pm/common@44 -- $ pid=1726889
00:39:09.223 08:52:21 -- pm/common@50 -- $ kill -TERM 1726889
00:39:09.223 08:52:21 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:39:09.223 08:52:21 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]]
00:39:09.223 08:52:21 -- pm/common@44 -- $ pid=1726909
00:39:09.223 08:52:21 -- pm/common@50 -- $ sudo -E kill -TERM 1726909
00:39:09.223 + [[ -n 1197315 ]]
00:39:09.223 + sudo kill 1197315
00:39:09.233 [Pipeline] }
00:39:09.254 [Pipeline] // stage
00:39:09.261 [Pipeline] }
00:39:09.281 [Pipeline] // timeout
00:39:09.288 [Pipeline] }
00:39:09.307 [Pipeline] // catchError
00:39:09.314 [Pipeline] }
00:39:09.334 [Pipeline] // wrap
00:39:09.342 [Pipeline] }
00:39:09.359 [Pipeline] // catchError
00:39:09.371 [Pipeline] stage
00:39:09.373 [Pipeline] { (Epilogue)
00:39:09.391 [Pipeline] catchError
00:39:09.393 [Pipeline] {
00:39:09.409 [Pipeline] echo
00:39:09.411 Cleanup processes
00:39:09.419 [Pipeline] sh
00:39:09.734 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:39:09.734 1727031 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/sdr.cache
00:39:09.734 1727308 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:39:09.747 [Pipeline] sh
00:39:10.030 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:39:10.030 ++ grep -v 'sudo pgrep'
00:39:10.030 ++ awk '{print $1}'
00:39:10.030 + sudo kill -9 1727031
00:39:10.042 [Pipeline] sh
00:39:10.327 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh
00:39:10.327 xz: Reduced the number of threads from 96 to 87 to not exceed the memory usage limit of 14,468 MiB
00:39:15.620 xz: Reduced the number of threads from 96 to 87 to not exceed the memory usage limit of 14,468 MiB
00:39:19.823 [Pipeline] sh
00:39:20.106 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh
00:39:20.106 Artifacts sizes are good
00:39:20.121 [Pipeline] archiveArtifacts
00:39:20.129 Archiving artifacts
00:39:20.283 [Pipeline] sh
00:39:20.568 + sudo chown -R sys_sgci /var/jenkins/workspace/crypto-phy-autotest
00:39:20.583 [Pipeline] cleanWs
00:39:20.593 [WS-CLEANUP] Deleting project workspace...
00:39:20.593 [WS-CLEANUP] Deferred wipeout is used...
00:39:20.599 [WS-CLEANUP] done
00:39:20.601 [Pipeline] }
00:39:20.622 [Pipeline] // catchError
00:39:20.636 [Pipeline] sh
00:39:20.918 + logger -p user.info -t JENKINS-CI
00:39:20.927 [Pipeline] }
00:39:20.944 [Pipeline] // stage
00:39:20.951 [Pipeline] }
00:39:20.970 [Pipeline] // node
00:39:20.976 [Pipeline] End of Pipeline
00:39:21.015 Finished: SUCCESS